Content Moderation
The practice of content moderation grew out of early internet problems with harmful posts on forums, chatrooms, and other platforms. The debate over censorship versus moderation has raged for decades, and it is also an important consideration in the audience onboarding process. While Brendan Bradley has facilitated community-building audience experiences that bring out the good in his viewers, not everyone can expect to be so fortunate. Nathan Leigh told me, while designing the interaction for POV: you are an a.i. achieving consciousness:
I realized that, if the way that we interact is that I can see what you're drawing on the screen, and that's transmitted to me in real time, then somebody could draw a swastika, and everybody could see it. And while it is not likely that that would happen in our self-selecting group of artistic weirdos who are interested in seeing experimental digital theatre, it really takes that happening once to ruin it for everybody. So that also becomes part of the guardrails, and that is the thing that I think in-person theatre largely solved by instituting ‘sit down, shut up and be polite’. And I'm like, ‘Okay, how much agency can I give you so that you can ruin your own experience if you want to, but it's hard?’ You have to be really, really invested in having a bad time.
Leigh further told me that he tested his approach to content moderation with his teenage nephew, asking the teen to draw obscene images as fast as possible while Leigh deleted them as fast as possible. The test satisfied Leigh that he could moderate the production’s content if necessary. Other tools can also be implemented: Twitch, for example, offers a feature called ‘AutoMod’ that holds flagged comments for moderator approval before they enter the chat, and a digital theatre’s stage manager or house manager can use it for televisual shows.
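Leigh’s manual deletion test and Twitch’s AutoMod both reduce to the same underlying pattern: hold each incoming contribution until a human moderator approves it. A minimal sketch of that approval-queue pattern follows (the class and method names are illustrative only, not Twitch’s actual API):

```python
from collections import deque

class ApprovalQueue:
    """Hold audience messages until a moderator approves them.

    Illustrative sketch of an AutoMod-style queue: submissions wait
    in `pending`; a moderator (e.g. a stage manager) reviews each one
    before it reaches the audience-visible `chat`.
    """

    def __init__(self):
        self.pending = deque()   # messages awaiting review
        self.chat = []           # messages visible to the audience

    def submit(self, user, text):
        """An audience member sends a message; it is held, not shown."""
        self.pending.append((user, text))

    def review(self, approve):
        """Apply a moderator decision to the oldest held message."""
        user, text = self.pending.popleft()
        if approve(user, text):
            self.chat.append((user, text))

q = ApprovalQueue()
q.submit("viewer1", "great show!")
q.submit("viewer2", "something obscene")
is_clean = lambda user, text: "obscene" not in text
q.review(is_clean)   # first message passes review
q.review(is_clean)   # second message is held back
```

After both reviews, only the first message has reached the visible chat; the decision function here is a stand-in for whatever human or automated judgement a production uses.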