Navigating Regulatory Literacy for Community Managers
At the May 2024 All Things in Moderation conference, researcher Dr. Fiona Martin and Modulate CEO Mike Pappas discussed the importance of regulatory literacy for frontline online community and moderation practitioners. Their fireside chat explored how managing online communities is changing in a landscape increasingly shaped by regulatory requirements.
Understanding Regulatory Literacy
Moderators are micro-regulators, with growing responsibilities to understand and enforce legislation and regulations around hate speech, misinformation, defamation and more. In countries like Australia, moderators are expected to have a high degree of regulatory literacy and, in many cases, act as the resident 'expert' on digital regulatory issues in the social spaces they manage.
Regulatory literacy is the awareness and understanding of the laws and regulations that govern online platforms and communities. It goes beyond knowing the written laws; it includes understanding enforcement practices, case studies, and best practices. For community managers and moderators, regulatory literacy is critical. It enables them to navigate the complexities of moderating content, managing user behavior, and ensuring compliance with safety and privacy regulations so that their communities can be sustained and thrive.
The Unique Challenges of Moderating Gaming Communities
Gaming communities present distinct challenges for moderators, particularly because of their highly interactive and emotionally charged nature. Gaming is not just about competition; it’s a social space where deeper relationships are formed and complex interactions play out. While this can be beneficial, it also opens the door to risks and harms, such as grooming or radicalisation. Tools like Modulate’s ToxMod, which uses voice analysis to detect patterns of harm, are becoming increasingly important in identifying and mitigating these risks, supported by community experts who can offer critical context.
The Role of AI in Moderation
One of the standout topics in this fireside chat was the role of AI in moderating online communities. Modulate’s ToxMod, for example, doesn’t just flag explicit violations like hate speech but also analyses subtler cues in voice interactions that might indicate more sinister activities. This advanced level of moderation is crucial, especially in gaming, where users, including minors, interact in complex social environments.
The Impact of Regulation on Community Management
As regulations like the EU’s Digital Services Act (DSA) and Australia’s Online Safety Act (OSA) come into play, platforms are increasingly required to proactively manage harmful content. While safety regulations are evolving, they are still relatively new compared to privacy laws, which have been the primary focus for many platforms and digital social spaces. The conversation also touched on the challenges of applying regulations designed for social media to community platforms, which operate under different dynamics.
The Human Element in Moderation
Despite the advancements in AI, human moderators remain essential. They bring a level of empathy and understanding that AI cannot replicate. Balancing AI tools with human oversight and contextual counsel ensures that moderation practices are fair, effective, and adaptable to the diverse behaviors of online users.
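To make that balance concrete, here is a minimal, hypothetical sketch of a human-in-the-loop triage step: only very high-confidence, explicit violations are routed to automated action, while ambiguous or subtler flags are escalated to a human moderator queue. It is not Modulate’s ToxMod API or any platform’s actual workflow; the class names, categories and thresholds are illustrative assumptions only.

```python
# Hypothetical human-in-the-loop triage sketch (not a real moderation API).
from dataclasses import dataclass, field
from typing import List


@dataclass
class Flag:
    """A single AI-generated moderation flag with a model confidence score."""
    user_id: str
    category: str       # e.g. "hate_speech", "grooming_risk" (illustrative labels)
    confidence: float   # 0.0-1.0, produced by the AI tool


@dataclass
class ModerationQueues:
    """Separates clear-cut automated actions from cases needing human judgment."""
    auto_action: List[Flag] = field(default_factory=list)
    human_review: List[Flag] = field(default_factory=list)


def triage(flags: List[Flag], auto_threshold: float = 0.95) -> ModerationQueues:
    """Route only very high-confidence explicit violations to automated action;
    everything ambiguous goes to a human moderator for contextual judgment."""
    queues = ModerationQueues()
    for flag in flags:
        if flag.category == "hate_speech" and flag.confidence >= auto_threshold:
            queues.auto_action.append(flag)
        else:
            queues.human_review.append(flag)
    return queues


if __name__ == "__main__":
    sample = [
        Flag("user_1", "hate_speech", 0.98),    # explicit, high confidence -> automated action
        Flag("user_2", "grooming_risk", 0.70),  # subtle pattern -> human review with context
    ]
    result = triage(sample)
    print(len(result.auto_action), "auto-actioned,", len(result.human_review), "sent to humans")
```

The design choice the sketch tries to capture is that automation handles the unambiguous, high-volume cases, while anything requiring empathy or context stays with people.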
Key Takeaways for Community Managers:
Stay Informed: Regularly update your understanding of regulation and legislation in your region, and how they impact your communities, platforms and industry sectors. This is crucial for ensuring compliance and protecting your community. Work with industry bodies (like ACM) to help you get on top of your responsibilities - it’s risky to set and forget, as this space is evolving rapidly on both a local and global scale.
Leverage AI, But Don’t Rely Solely on It: AI tools like ToxMod are powerful, but they should complement, never replace, human moderators who can provide nuanced judgment. Removing humans is a recipe for risk and crisis. Ensure that your tools can be configured for your social, cultural and regulatory needs.
Focus on User Experience: Regulatory literacy isn’t just about compliance; it’s about creating a safer and more positive user experience. Understanding the needs and behaviors of your community is essential in implementing effective moderation strategies and cultures to support them.
Advocate for Clear Communication: Work towards establishing a shared vocabulary and consistent moderation practices within your platform and across the industry. While there will always be some differences, even disagreements, a shared playbook will be essential as the regulatory space grows more interconnected across the world.
Engage with Regulators: Community managers and moderators need to play a more active role in influencing regulatory discussions. Frontline experience is invaluable in shaping realistic and effective policies that serve communities.
Keep up to date with the annual All Things in Moderation conference