The Online Safety Act 2021 is new legislation that expands and strengthens Australia’s existing online safety laws.
Our laws need to keep pace with advances in technology and with the threats we face online from harmful behaviour and toxic content. Times of rapid change and social upheaval call for robust new laws.
The new powers in the Online Safety Act cement eSafety’s role as a world leader in online safety, placing Australia at the international forefront of the fight against online harm.
What does the Online Safety Act 2021 mean for Australians?
The Act has significant implications for online service providers because it makes them more accountable for the online safety of the people who use their service.
The Act gives eSafety substantial new powers to protect all Australians – adults now as well as children – across most online platforms and forums where people can experience harm.
There is, for the first time, a clear set of expectations for online service providers that makes them accountable for the safety of people who use their services.
The Act also requires industry to develop new codes to regulate illegal and restricted content. This covers the most seriously harmful material, such as videos showing sexual abuse of children or acts of terrorism, through to content that is inappropriate for children, such as high-impact violence and nudity.
What are the main changes in the new Act?
The Online Safety Act:
- creates a world-first Adult Cyber Abuse Scheme for Australians 18 years and older
- broadens the Cyberbullying Scheme for children to capture harms that occur on services other than social media
- updates the Image-Based Abuse Scheme, which allows eSafety to seek the removal of intimate images or videos shared online without the consent of the person shown
- gives eSafety new powers to require internet service providers to block access to material showing abhorrent violent conduct such as terrorist acts
- gives the existing Online Content Scheme new powers to regulate illegal and restricted content no matter where it’s hosted
- brings app distribution services and search engines into the remit of the new Online Content Scheme
- introduces Basic Online Safety Expectations for online service providers
- halves the time that online service providers have to respond to an eSafety removal notice, though eSafety can extend the new 24-hour period.
What are the expectations of industry?
Basic Online Safety Expectations
The Act sets out what the Australian Government now expects from online service providers. It has raised the bar by establishing a wide-ranging set of Basic Online Safety Expectations.
These expectations are designed to help make sure online services are safer for all Australians to use. They also encourage the tech industry to be more transparent about its safety features, policies and practices.
The Basic Online Safety Expectations are a broad set of requirements that apply to an array of services and all online safety issues. They establish a new benchmark for online service providers to be proactive in how they protect people from abusive conduct and harmful content online.
eSafety now expects online service providers to take reasonable steps to keep their services safe for users. We expect them to minimise bullying, abuse and other harmful activity and content. We expect them to have clear and easy-to-follow ways for people to lodge complaints about unacceptable use.
The Minister for Communications, Urban Infrastructure, Cities and the Arts can determine the expectations for certain online services. eSafety then has the power to require online service providers to report on how they are meeting any or all of the Basic Online Safety Expectations.
The Basic Online Safety Expectations are backed by new civil penalties for online service providers that do not meet their reporting obligations.
eSafety will also have the ability to name online service providers that do not meet the Basic Online Safety Expectations, as well as publish statements of compliance for those that meet or exceed expectations.
New industry codes for illegal and restricted content
The Act will also require industry to develop new codes. The codes will be mandatory and will apply to various sections of the online industry:
- social media platforms
- electronic messaging services
- search engines
- app distribution services
- internet service providers
- hosting service providers
- manufacturers and suppliers of equipment used to access online services
- people who install and maintain equipment.
The codes will require online service providers and platforms to detect and remove illegal content, such as material showing child sexual abuse or acts of terrorism. They will also put greater onus on industry to shield children from age-inappropriate content such as pornography.
The Act allows eSafety to impose industry-wide standards if online service providers cannot reach agreement on the codes, or if they develop codes that do not contain appropriate safeguards.
The Act provides a list of matters the industry codes may deal with. These include making sure that:
- all segments of the industry promote awareness of safety issues and the procedures for dealing with harmful online content on their services
- online service providers tell parents and adults who are responsible for children how to supervise and control children’s access to material they provide on the internet
- online service providers tell users about their rights to make complaints
- online service providers follow procedures for dealing with complaints in line with their company policies.
The codes will be enforceable by civil penalties and injunctions to make sure online service providers comply.
We want to create a modern and fit-for-purpose online safety ecosystem where eSafety and online service providers play a co-regulatory role to protect Australians from illegal and restricted content.
Find out more at https://www.esafety.gov.au/whats-on/online-safety-act