Online Safety Act: what it means for community managers

Julie Inman Grant, Australia's eSafety Commissioner

In January 2022 a new Online Safety Act comes into effect for Australian users of the internet, and for any businesses or providers that serve those users.

It brings new responsibilities for community professionals, and new requirements for anyone building social platforms or technologies in the region.

The Act, which repeals the Enhancing Online Safety Act 2015 and replaces the existing cyberbullying scheme targeted at children, aims to promote and improve online safety for Australians. It includes world-first fines for adults who engage in abusive trolling behaviour, and gives the eSafety Commissioner increased powers to enforce safety standards and to pursue action against those who commit breaches, such as bullying, harassment, or the sharing of non-consensual intimate images.

View the recording of our Briefing on the Act (free for ACM Pro Members)

Basic Online Safety Expectations

The legislation introduces new Basic Online Safety Expectations (BOSE), which become a baseline requirement for digital service providers (such as social media platforms, search engines or ISPs).

The Act requires that those providers must:

  • Take all reasonable steps to ensure people can use services safely

  • Minimise the extent to which harmful content is accessible via the service

  • Take all reasonable steps to prevent children accessing Class 2 material (‘X18+’, ‘R18+’)

  • Ensure clear, readily identifiable mechanisms for end-users to report and make complaints about harmful material (see the sketch after this list)

  • Respond to notices of investigation or takedown from the eSafety Commissioner (usually within 24 hours)

  • Respond to requests from the Commissioner for additional details about complaints filed, the timeframes taken to comply, and any corrective measures applied
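
For teams building or operating their own platforms, the reporting expectation is the most concrete of these requirements. Below is a minimal sketch of what a complaint intake might look like; the Complaint record, category names and in-memory store are illustrative assumptions, not anything prescribed by the Act.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from uuid import uuid4

# Hypothetical complaint categories; the Act does not prescribe a taxonomy.
CATEGORIES = {"cyberbullying", "adult_cyber_abuse", "image_based_abuse", "illegal_content"}

@dataclass
class Complaint:
    """One end-user report of harmful material (illustrative schema)."""
    reporter_id: str
    content_url: str
    category: str
    detail: str
    reference: str = field(default_factory=lambda: uuid4().hex[:8])
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def file_complaint(store, reporter_id, content_url, category, detail) -> Complaint:
    """Validate, record and acknowledge a complaint with a reference number."""
    if category not in CATEGORIES:
        raise ValueError(f"unknown category: {category}")
    complaint = Complaint(reporter_id, content_url, category, detail)
    store.append(complaint)
    return complaint

# Usage: the reporter keeps the reference in case they later escalate
# their complaint to the eSafety Commissioner.
store = []
c = file_complaint(store, "user-42", "https://example.com/post/123",
                   "adult_cyber_abuse", "Targeted harassment in comments")
print(f"Complaint {c.reference} received at {c.received_at.isoformat()}")
```

Capturing a timestamp and returning a reference number matters here because the expectations also cover reporting on timeframes and corrective measures, and members may need their own records if they later escalate a complaint.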


New child protections

The consolidated protections for young people:

  • Cover material posted by anyone, targeting a child

  • Apply to material that an ordinary, reasonable person would conclude is intended to have the effect of:

    • seriously threatening

    • seriously intimidating

    • seriously harassing; and/or

    • seriously humiliating the subject

  • Allow the eSafety Commissioner to order both platforms and end-users to take down such content within 48 hours

  • Can also impose specific conditions on that takedown process, including compelling an apology.


New adult protections

The Act introduces a world-first scheme to protect adult Australians from online harms such as bullying and harassment.

  • It covers material posted by an adult, targeting an adult

  • The material should be regarded by an ordinary, reasonable person as menacing, harassing or offensive by general community standards

  • An ordinary, reasonable person should conclude that the material was intended to cause 'serious harm' to the target. 'Serious harm' is defined as serious physical harm or serious harm to a person’s mental health, whether temporary or permanent. Serious harm to a person’s mental health includes:

    (a) serious psychological harm; and

    (b) serious distress;

    but does not include ordinary emotional reactions such as distress, grief, fear or anger.

  • The harm can be wrought directly (material posted directly to that person) or indirectly (material posted to others that reaches that person through other means)

The eSafety Commissioner can issue removal notices to both services and end-users to take down such material within 24 hours.

Non-compliance can result in civil penalties up to $111,000 for individuals and $555,000 for companies.

For a notice to be issued:

  • it must satisfy the conditions of abuse/harm

  • it must be accessible to Australian end-users

  • it must have been the subject of a complaint to the relevant online service provider

  • the online service provider must then have failed to remove it; and

  • it must subsequently be the subject of a complaint to the eSafety Commissioner


Sharing non-consensual intimate images

The Act has strengthened legislation around the sharing of non-consensual intimate images (sometimes dubbed ‘revenge porn’), in conjunction with state-based criminal laws addressing image-based abuse.

Now, platforms, hosts and end-users may each be issued take-down notices by the eSafety Commissioner, with a 24-hour requirement for removal.

There are civil penalties for end-users who post these images (up to $55,000 per incident).

Importantly, the Act closes a loophole in previous governance: it applies to both original and manipulated images featuring the non-consenting subject.


Blocking content

The Act contains new powers for the eSafety Commissioner to prevent rapid amplification of abhorrent violent material, such as the 2019 Christchurch terrorist attack video.

  • The powers allow the Commissioner to require ISPs or search engines to block access to material that depicts, incites or instructs in abhorrent violent conduct

  • The material must be likely to cause significant harm to the Australian community

  • Failure to remove the material quickly can lead to large fines and criminal prosecution


What the Act means for community professionals

If you work on social media platforms:

  • The new Act will hopefully improve platform compliance in addressing serial trolls and abusers, improving the overall social health of your digital spaces

  • Your adult users - including staff that work on the community - can now report abuse to the eSafety Commissioner, as well as the platform itself

  • Your users (including staff) may be reported to the platforms, and then to the Commissioner, if they engage in behaviour prohibited by the Act (bullying and harassment of adults or children; sharing non-consensual intimate imagery; sharing graphic material classified X or RC).

If you work on owned communities (inc. apps):

  • You/your organisation will need to comply with the BOSE, including providing reporting mechanisms, processes to respond to takedown requests and complaints, and responses to requests for information about compliance (i.e. transparency reporting)

  • Your users can report abuse to the eSafety Commissioner as well as you directly

How to comply with the new Online Safety Act

  • Ensure you are performing consistent, consequential moderation

  • Create or maintain governance protections such as guidelines, terms of use, risk triage and escalation, response guides, etc.

  • Create internal policies and processes on how to respond to notices and requests from the eSafety Commissioner (see the sketch after this list)

  • Train staff on those policies and processes

  • Add resources into response guides for members who may have issues with bullying, harassment, trolling or other harms (e.g. link them to the reporting forms/mechanisms, advise them to keep records, etc.)

  • If building social products, ensure they are BOSE compliant
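
Because the removal windows differ by scheme (48 hours under the child cyberbullying protections; 24 hours for adult cyber-abuse and image-based abuse notices), it helps to make notice deadlines explicit in your internal processes. The sketch below is illustrative only, with hypothetical names; check the window values and your obligations against the Act itself and your own legal advice.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

# Removal windows (hours) for Commissioner notices, per the schemes above.
REMOVAL_WINDOWS = {
    "child_cyberbullying": 48,
    "adult_cyber_abuse": 24,
    "image_based_abuse": 24,
}

@dataclass
class TakedownNotice:
    """A removal notice from the eSafety Commissioner (illustrative schema)."""
    notice_id: str
    scheme: str
    received_at: datetime
    removed_at: Optional[datetime] = None

    @property
    def due_by(self) -> datetime:
        """Deadline implied by the scheme's removal window."""
        return self.received_at + timedelta(hours=REMOVAL_WINDOWS[self.scheme])

    def is_overdue(self, now: Optional[datetime] = None) -> bool:
        """True if the material is still up past its deadline."""
        now = now or datetime.now(timezone.utc)
        return self.removed_at is None and now > self.due_by

# Usage: flag anything past (or approaching) its deadline for escalation.
notice = TakedownNotice("N-001", "adult_cyber_abuse",
                        received_at=datetime(2022, 2, 1, 9, 0, tzinfo=timezone.utc))
print("Due by:", notice.due_by.isoformat())
print("Overdue now?", notice.is_overdue())
```

A scheduled check that surfaces anything approaching or past its due time gives your escalation process a concrete trigger, rather than relying on inbox discipline alone.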


Venessa Paech

Venessa is Director and Co-Founder of ACM.

http://www.venessapaech.com