Building a moderation movement

Sarah T. Roberts, moderation advocate and expert on digital labour, social media policy, and internet cultures.

Marginalised and hidden away, moderators are constrained by extractive practices and corporate profit motives in their workplaces. So how do we go about raising the status of this work in the commercial and public spheres?

Sarah T. Roberts is an associate professor at UCLA, specialising in internet and social media policy, infrastructure, politics, and culture. She argued for a visible moderation movement at the inaugural All Things in Moderation conference.

Why is moderation marginalised?

  • Structural issues: Large-scale commercial content moderation was implemented on most major platforms reactively, not proactively. Budgets and long-term strategies were therefore not created with moderation in mind. As a result, platforms have been reluctant to dedicate time and money to moderators, and moderation remains chronically under-resourced.

  • Mystification and dehumanisation: Moderation processes have been obfuscated on many major platforms, leading to the misconception among the general public that moderation is carried out by computers. Mystifying moderation enabled platforms to make decisions that were business-focused rather than community-focused, protecting their profit motives. It has also led to the people and practices behind moderation remaining largely invisible.

Sarah discusses the current state of community platforms.

What should moderation look like?

Early online communities were founded on a culture of participatory governance. Moderators were visible leaders in these communities: they held status markers that denoted their admin responsibilities, as well as technical affordances to support their role. Above all, they participated as members of these communities, respected as subject matter experts or as users with social capital.

While recent platforms like Discord and Reddit echo these early communities, this slow social media model is not necessarily scalable. Scaling down, however, is not a bad thing: taking time to curate communities can drive social impact, and it deserves serious consideration as we weigh alternative models for online community platforms.

Sarah emphasises the importance of slow, localized platforms in driving social impact.

How can we build a more visible moderation movement?

  • Take a clear ethical stance: If you don't take a stand on what constitutes acceptable and ethical behaviour in your community, you risk courting extremists. We need to have these conversations about ethics loudly, making it a priority to include more diverse voices.

  • Start talking about the state of moderation: For too long, conversations about moderation have happened within the constraints of computational logics and profit motives that are antithetical to the principles of community. This model severely constrains moderators who are trying to practise human skills within their communities and to talk openly about moving away from extractive work practices.

  • Create clear organising principles for moderators: Large platforms often fail to articulate their guiding values. In the absence of such values, moderators have no principles to fall back on, making it difficult for them to create cultural and social impact in their communities.

  • Think critically about terms like "neutrality" and "objectivity": Political neutrality is often lauded as a goal of large-scale moderation, but it may not be feasible. We must ask: what does neutrality mean? Do our definitions of neutrality acknowledge power imbalances, and how does this stance affect marginalised communities, whose existence is already politicised?

  • Look at alternatives to current platforms: Newer entrants like Mastodon are a good start, but we need a plethora of alternatives to platforms built on corporate profit motives. Communities that decouple from corporate control and return to careful curation, clear principles, and sound ethics are sorely needed.



All Things in Moderation ran online 11-12 May 2023 and featured over 25 expert contributors from around the world and across practitioner, academic and policy disciplines.


