Social Media Code Adopted

Australia’s eSafety Commissioner has registered the new Social Media Services Code under the Online Safety Act 2021.

After several years of development and consultation, the Social Media Services Code, along with Codes for Internet Carriage Services, App Distribution Services, Hosting Services, and Equipment, has been approved and will come into effect six months from the date of registration (late 2023).

Download the approved Code as a PDF or Word document

The Codes operate under Australia’s Online Safety Act 2021 (OSA) and require industry actors to take adequate steps to reduce the availability of seriously harmful online content, such as child sexual abuse material (CSAM) and pro-terror material.


These steps include:

  • ensuring user reporting mechanisms are in place

  • responding to reported issues or concerns in a timely manner

  • deploying appropriate detection and filtering tools to moderate high-risk material such as CSAM (a rough sketch of a reporting and triage flow follows this list).
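To make these obligations concrete for engineering teams, here is a minimal sketch in Python of what a report intake and triage flow might look like. Everything in it, the class names, the severity tiers, and the response-time windows, is a hypothetical illustration for discussion; the Code does not prescribe any particular implementation, so consult the Code itself for actual obligations.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from enum import Enum


class Severity(Enum):
    """Hypothetical triage tiers; the Code does not prescribe these."""
    ROUTINE = "routine"
    HIGH_RISK = "high_risk"  # e.g. suspected CSAM or pro-terror material


@dataclass
class UserReport:
    """A single end-user report captured by a platform's reporting mechanism."""
    report_id: str
    content_url: str
    reason: str
    severity: Severity
    received_at: datetime = field(default_factory=datetime.utcnow)

    def response_deadline(self) -> datetime:
        """Illustrative response-time targets only; real targets should come
        from your own trust and safety policy, not this sketch."""
        window = timedelta(hours=24) if self.severity is Severity.HIGH_RISK else timedelta(days=5)
        return self.received_at + window


def triage(report: UserReport) -> str:
    """Route a report: suspected high-risk material goes to a priority queue
    for urgent human review; everything else joins the standard queue."""
    if report.severity is Severity.HIGH_RISK:
        return "priority-queue"
    return "standard-queue"


# Example: a report of suspected high-risk material is escalated immediately.
report = UserReport("r-001", "https://example.com/post/123", "suspected CSAM", Severity.HIGH_RISK)
print(triage(report), report.response_deadline())
```

The point of the sketch is the shape of the flow, a durable record of each report plus an explicit escalation path for the most serious material, rather than any specific queueing technology.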

eSafety will be able to receive complaints and investigate potential breaches. An industry code or standard will be backed by powers to ensure compliance, including injunctions, enforceable undertakings, and maximum financial penalties of nearly $700,000 per day for continuing breaches.

The Codes and industry standards apply equally to Australian and overseas providers where their services are provided to users in Australia.

Impacts for online community managers and moderators

The Social Media Services Code is primarily directed at operators of social media platforms. The Code defines a Social Media Service as a service where:

“the sole or primary purpose of the service is to enable online social interaction between 2 or more end‑users; the service allows end‑users to link to, or interact with, some or all other end‑users; the service allows end‑users to post material on the service.”

This includes major social networks, public media sharing networks, discussion forums, consumer review networks, and internal social networks.

If you are a community, social media or moderation professional, check that your platform providers are complying with the Code and the OSA. This is especially important if you are using a newer platform whose operators may be unaware of Australia’s regulatory requirements. If you manage or moderate within a non-compliant platform, you are risking harm to the health of your users and community, as well as downtime (or even closure) if the platform is found in breach of the Code.

If you are a platform vendor supporting this space, it’s essential to stay across the Code and to design for safety as a first priority, not an afterthought, providing robust moderation and governance features for the moderators and community managers using your system.

The Code affirms the importance of digital governance and the role of moderation checks and balances as vital to the health of online social spaces.

Watch the ACM webinar on the Online Safety Act and its implications for digital frontline workers

Apps, storage and search rejected for now

The eSafety Commissioner chose not to register the Designated Internet Services (DIS) code, which covers apps, websites, and file and photo storage services such as Apple iCloud and Microsoft OneDrive, or the Relevant Electronic Services (RES) code, which covers dating sites, online games and instant messaging. She cited the codes’ failure to provide appropriate community safeguards and a lack of commitment to Safety by Design protocols and processes.

eSafety will now develop mandatory, enforceable industry standards for these two areas.

The eSafety Commissioner reserved her decision on the draft Search Engines code, to allow time for it to be updated to address the emerging risks of generative AI.

The draft industry codes submitted to eSafety on 31 March can be found at onlinesafety.org.au/codes

Venessa Paech

Venessa is Director and Co-Founder of ACM.

http://www.venessapaech.com