Moderating Australia's Indigenous Voice Referendum

At the 2024 All Things in Moderation conference we explored the complex challenges of moderating Australia's Indigenous Voice Referendum.

On 14 October 2023, Australians voted in a national referendum on whether to change the Constitution to recognise the First Peoples of Australia by establishing a body called the Aboriginal and Torres Strait Islander Voice. It was Australia's first referendum of the 21st century - and it was ultimately defeated, with a majority voting No.

The process generated a significant rise in racism, hate speech and other online harms, which moderators and community managers were then forced to negotiate on behalf of their users, organisations and themselves.

Dr. Jennifer Beckett led the conversation about the unique challenges of moderating The Voice debate.


We gathered key professionals who oversaw those front lines to explore what lessons we can learn from the referendum and ensuing discourse.

First Nations moderators and community managers at Guardian Australia and the Australian Broadcasting Corporation were actively consulted in preparing for this discussion, and were invited to participate in or co-lead the conversation. They wanted the conversation to take place, but chose not to join the panel and risk re-traumatisation, a decision we fully respect.

Moderator: ACM member Dr. Jennifer Beckett, a social media researcher from the University of Melbourne.

Kathryn Perrott, Social Media Specialist at the Australian Broadcasting Corporation (ABC), was at the forefront of moderating discussions on The Voice and similarly sensitive topics across the ABC’s social media channels.

Dave Earley is the Audience Editor at Guardian Australia, and oversaw comment moderation and fostered meaningful engagement around The Voice on the publisher’s owned platform.

Dan de Sousa is a Senior Community Manager at social media agency Quiip, and led moderation for the Special Broadcasting Service (SBS) and National Indigenous Television (NITV) across multiple social media platforms during this period.


Primary Challenges:

  • Elevated levels of racism and hate speech across diverse online groups and communities

  • Proliferation of misinformation and disinformation, including organised propaganda and bad actor networks

  • Increased emotional and psychological strain on moderators, particularly First Nations moderators and those in structurally marginalised situations.


Guardian Australia Audience Editor Dave Earley: Years of moderation groundwork gave the publication a head start for The Voice.

Moderator Well-being and Preparedness

The panelists all stressed the importance of centring the wellbeing of digital frontline workers during the referendum period - in both formal and informal ways. They underscored that without thoughtful pre-work, no amount of reactive effort could have adequately supported moderators. This means having hard discussions ahead of time, and honestly auditing your systems - technical and social - for gaps and issues.

Dave Earley (Guardian Australia) explained that years of robust work around moderation protocols and processes left The Guardian Australia in a particularly strong position to face The Voice headwinds.

Kathryn Perrott (ABC): "Our approach prioritised the well-being of moderators through comprehensive pre-event sessions and continuous support during the referendum. We implemented dedicated chat channels, enabling moderators to stay connected and share their experiences, thereby fostering a supportive environment."

Dan de Sousa (Quiip): "Ensuring access to counselling and allowing moderators to take breaks as needed were critical elements of our strategy. By setting up specific queues for high-risk and racist content, we managed the workload more effectively."

Key Strategies:

  • Comprehensive pre-event planning and support

  • Continuous communication channels for real-time support

  • Access to mental health resources and flexible break policies.

Structured and Proactive Moderation

Dave Earley (Guardian Australia): "Our strategy included pre-moderation on all Voice-related articles, guided by our Indigenous Affairs editor. This proactive approach allowed us to filter out harmful comments before they reached the public, maintaining a healthier discourse in line with our community norms and standards."

Kathryn Perrott (ABC): "We created specific queues for racist content and equipped moderators with pre-prepared responses to counter common misinformation threads. This enabled us to address issues swiftly and efficiently."

Dan de Sousa (Quiip): "Transitioning to Sprout Social was a game-changer, as it provided enhanced control over moderation through keyword filtering and specific inboxes for different types of content."
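To make the queue idea concrete, here is a minimal sketch of keyword-based routing along the lines the panelists describe. The keyword lists, queue names and matching logic are illustrative assumptions only - platforms like Sprout Social configure equivalent rules through their own interfaces, not in code:

```python
# Minimal sketch of keyword-based comment routing into moderation queues.
# All terms and queue names below are hypothetical placeholders.

HIGH_RISK_TERMS = {"example slur", "example threat"}    # placeholder terms
MISINFO_TERMS = {"example false claim", "rigged vote"}  # placeholder terms

def route_comment(text: str) -> str:
    """Return the moderation queue a comment should land in."""
    lowered = text.lower()
    if any(term in lowered for term in HIGH_RISK_TERMS):
        return "high_risk"       # triaged first, by senior moderators
    if any(term in lowered for term in MISINFO_TERMS):
        return "misinformation"  # paired with prepared responses
    return "pre_moderation"      # everything else still held for review

print(route_comment("They say it was a rigged vote"))  # -> misinformation
```

The design point is simply that sensitive material is separated before a human sees it, so workload and exposure can be managed deliberately rather than in arrival order.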

Key Strategies:

  • Implementation of pre-moderation with expert guidance

  • Creation of targeted queues for sensitive content

  • Adoption of advanced moderation tools for enhanced control.

Handling Misinformation and Disinformation

All panelists spoke to the surge in mis- and disinformation, and the need to bolster strategies to combat it through the heat of the referendum debate. This includes staying alert to the strategies and techniques bad actors may employ to subvert existing buffers against harmful behaviour, and to any unusual patterns in user activity.

For example, each saw a trend of copy-paste arguments and attacks from external voices (such as US right-wing groups), and concerted efforts to introduce misinformation by stealth.
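One way to surface that copy-paste pattern is to fingerprint normalised comment text and flag texts that recur across many accounts. This is a hedged sketch, not any panelist's actual tooling; the normalisation and threshold are assumptions:

```python
# Sketch: detect near-identical comments repeated across a community.
# Normalisation rules and the threshold of 5 are illustrative choices.

import hashlib
import re
from collections import Counter

def fingerprint(text: str) -> str:
    """Lower-case and collapse whitespace so trivial edits still match."""
    normalised = re.sub(r"\s+", " ", text.strip().lower())
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

def flag_copy_paste(comments: list[str], threshold: int = 5) -> set[str]:
    """Return fingerprints of comment texts seen at least `threshold` times."""
    counts = Counter(fingerprint(c) for c in comments)
    return {fp for fp, n in counts.items() if n >= threshold}

flood = ["Copy-pasted talking point!!"] * 6 + ["a genuine question"]
print(len(flag_copy_paste(flood)))  # -> 1 repeated text flagged
```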

Kathryn Perrott, Social Media Specialist at the Australian Broadcasting Corporation, discussing an increase in US right-wing posts in Australian online groups about The Voice referendum.

Kathryn Perrott (ABC): "We developed prepared responses and referral plans to address common misinformation. By feeding back audience questions and trending topics to our editorial teams, we ensured accurate and timely content was published."

Dave Earley (Guardian Australia): "Our use of fact-checking articles and pinned comments was able to counter a fair bit of misinformation. Pre-moderation allowed us to prevent the spread of false information from the outset."

Dan de Sousa (Quiip): "While managing the high volume of comments was challenging, feeding misinformation back to editorial teams enabled us to publish accurate information promptly. Engaging directly with misinformation in comments can sometimes turn the narrative positively."

Key Strategies:

  • Development of prepared responses and editorial feedback loops

  • Utilisation of fact-checking articles and pinned comments

  • Direct engagement with misinformation when feasible.

Take the on-demand Australian Community Managers Moderating Misinformation short course

Empowering Moderators

Panelists spoke about the need to remove excessive layers of permission and control during a crisis or heated period. If moderators or community managers must move through too many approval steps while a serious issue is unfolding, exposure to harm increases for both them and other community members. It can also contribute to a feeling of disempowerment that undermines the resilience they need to navigate the issue.

Kathryn Perrott (ABC): "Empowering moderators to make decisions about comment sections and providing them with the agency to manage their work was crucial in maintaining a safe environment. This approach fostered a sense of control and responsibility among the moderators."

Key Takeaway:

  • Grant moderators the authority to make real-time decisions and adapt responsibly in the moment - enhancing their sense of agency and control.

Dan de Sousa (Quiip): “Consistent community management is one of the best mitigators for online harms. You can’t introduce it after the fact.”

Advanced Training and Community Engagement

Panelists each spoke about the importance of training for moderators of all kinds (including volunteers who may be taking on moderation tasks) when heading into an area of engagement known to be sensitive or contentious.

Specifically, they recommended upskilling in moderating misinformation (which can agitate or undermine healthy governance efforts), and the nuances of hate speech (including the micro-aggressions that slowly build permission for its expression).

They also pointed to the effects of stable community management as a mitigant - if you have built a culture of constructive, pro-social norms, where users are willing and able to support governance, that culture inoculates against anti-social behaviour, and the community will ultimately be more resilient if issues arise.

Dan de Sousa (Quiip): "Training moderators to handle misinformation and fostering a culture where community members feel empowered to self-moderate are essential. Engaging with the community and setting clear expectations can significantly mitigate the spread of false narratives."

Key Takeaways:

  • Invest in specialised issue training for moderators

  • Ensure consistent pro-social community management (before you need it)

  • Empower community members to engage in self-moderation.

Global Playbooks and Cross-Regional Collaboration

Dave Earley (Guardian Australia): "Developing global playbooks for moderating sensitive content and sharing these across regions can provide a consistent and informed approach. This ensures moderators are well-prepared for local sensitivities and cultural context."

Key Takeaway:

  • Create and disseminate global playbooks to handle region-specific issues effectively, leveraging local expertise.

Addressing the Ongoing Effects of Hate Speech

The panel also highlighted the lasting impact of hate speech and online harms on First Nations community members. For many in Australia, the debate around The Voice has passed. But for plenty of others (especially First Nations community members or staff), trauma can extend well beyond the initial incident.

Australian community managers and moderators should look to maintain ongoing support systems and ensure that anyone looking after a digital social space can recognise and address flow-on harms appropriately.

Key Takeaway:

  • Continuously support First Nations communities and ensure moderators are trained to handle the ongoing effects of hate speech and online harm.

More:

All Things in Moderation ran online 16-17 May 2024 and featured over 25 expert contributors from around the world and across practitioner, academic and policy disciplines.

The conference will be back in 2025 - stay tuned for early bird ticket updates!

Venessa Paech

Venessa is Director and Co-Founder of ACM.

http://www.venessapaech.com