Navigating the Youth Social Media Ban

In November 2024, the Australian Parliament passed the Online Safety Amendment (Social Media Minimum Age) Bill 2024, which prohibits people under 16 from holding accounts on major social media platforms such as TikTok, Facebook, Snapchat, Reddit, X (formerly Twitter), and Instagram. Social media companies have one year to implement effective age verification measures, with potential fines of up to 50 million Australian dollars (approximately 33 million USD) for non-compliance.

ACM Director Venessa Paech hosted a Q & A with Dr. Tama Leaver, Professor of Internet Studies at Curtin University in Perth, WA, and Chief Investigator in the ARC Centre of Excellence for the Digital Child, to explore the realities of the Bill and some potential impacts for online community managers and moderators.

Understanding the Ban

Dr. Leaver summarised the intent and mechanics of the legislation as it stands:

  • Platforms must verify users' ages to ensure they are at least 16 years old.

  • Government ID cannot be the sole verification method; alternatives (e.g., biometrics) must also be offered.

  • The legislation is vague about which platforms are included, leaving final determinations to the Communications Minister of the day.

Critically, the law, which amends Australia's Online Safety Act 2021, does not block access to content; it restricts account-based use of the platforms.

Dr. Leaver highlighted the key concerns raised during the Bill's unusually speedy creation and passage:

  1. Privacy and Data Security: Implementing robust age verification will likely necessitate collecting sensitive personal information, including biometric data or government-issued IDs, raising concerns about data privacy and potential misuse (especially for vulnerable groups).

  2. Accessibility of Support Resources: Critics argue that the ban could restrict access to online communities and support networks, particularly for marginalised youth, such as LGBTQ+ individuals, who rely on these platforms for connection and assistance.

  3. Enforcement Challenges: Ensuring compliance poses significant challenges, as tech-savvy young people may find ways to circumvent restrictions.

  4. False Sense of Security: Leaver warned that this regulation might lead parents to wrongly assume young users are safe online. This misplaced trust could inadvertently drive youth to riskier online behaviours, with less parental oversight or guiding conversation (if parents think it’s ‘taken care of already’).

  5. Exposure to Darker Elements: Shut out of mainstream social media spaces, young people may be drawn into less regulated, less moderated and more harmful communities and corners of the internet.

  6. Politicisation: As specific platforms subject to the ban are at the discretion of each government, the ban is at risk of politicisation. A minister might add online community spaces or platforms they object to on ideological grounds. This is a real risk given the global political landscape.

Implications for Online Communities

Impact on Specific Platforms and Users

  • Larger platforms such as Meta's Facebook and Instagram, TikTok, Reddit and Snapchat are likely covered. Smaller platforms and forums remain in a grey area, though they may fall under the scope of the ban.

  • Exemptions may be granted to spaces with demonstrated robust moderation, such as mental health communities. Articulating your governance policies and showcasing responsible moderation can help communities make a case for exemption.

  • We can expect shifting digital demographics in Australia, and may see a splintering or further fragmentation of young user groups (e.g. those who have the skills to circumvent the ban, and those who don’t) and their experience of the internet.

Potential Benefits

  • A shift to smaller, more intimate and more robustly community-managed platforms could foster healthier online environments.

  • The ban might accelerate digital literacy efforts, preparing young users for safer social media engagement at 16.

  • Community managers could play a pivotal role as "digital citizenry stewards," ensuring education and best practices within online spaces.

Opportunities for Advocacy and Preparedness

Digital Literacy

Dr. Leaver emphasised the need for robust digital literacy programs to equip young users. Australia has historically struggled to implement effective and holistic digital literacy programs, and they will be needed more than ever in the wake of the under-16 ban.

"The ban creates an opportunity to push for better national curriculum standards for media and digital literacy," Leaver argued.

Community managers, with their experience navigating risk and governance, are uniquely positioned to support these efforts. By collaborating with schools, government programs, and other educational initiatives, community and moderation professionals can provide practical, real-world insights to help young users become responsible digital citizens, and advise platforms on improving their governance efforts.

"Community managers are already on the digital frontline, and their expertise in fostering safe and informed online participation is invaluable," Paech added.

Diversifying Community Ecosystems

Paech encouraged managers to reduce reliance on major platforms, citing past disruptions such as Facebook's localised news ban in Australia, deployed as a ‘negotiating’ tactic with the federal government.

"Build ecosystems with spaces you control to mitigate exposure to regulatory volatility," she advised.

Articulating Value

Both Leaver and Paech stressed the importance of showcasing the value of managed communities, especially in light of an increasingly volatile global regulatory landscape.

"Managed communities already align with what the government wants: safe, moderated spaces. Articulating this can protect against overreach," Leaver stated.

What next?

All community managers in Australia, regardless of the communities or spaces they are building or overseeing, can start:

  • Thinking about and preparing the case for your community to earn an exemption and remain accessible to young people (demonstrating need, moderation robustness, etc.);

  • Assessing onboarding processes for potential risks around younger users;

  • Engaging with policymakers to advocate for clear guidelines and the inclusion of community management perspectives in online safety conversations and regulation;

  • Preparing for shifts in user demographics and behaviours, and adapting your strategies to maintain safe, inclusive spaces.

As Paech concluded: "Our work is more important than ever. Stay engaged, stay informed, and continue creating meaningful online spaces."

For ongoing updates and support, follow Australian Community Managers (ACM). For assistance with your governance and moderation planning, drop us a line.
