What Australia’s new online safety laws mean for you

The Online Safety Act 2021 (Cth) (Act) comes into effect on 23 January 2022. The purpose of the Act is to build on the existing online regulatory framework established by the Enhancing Online Safety Act 2015 (Cth) and to give the eSafety Commissioner greater investigative and enforcement powers.

The Act establishes a cyber abuse scheme with separate provisions for adults and for children. It gives the Commissioner the power to require the removal of online material that threatens, intimidates, harasses, menaces, offends or humiliates, or that is likely to cause serious harm. It halves the timeframe for online service providers to respond to a take-down notice, from 48 hours to 24 hours. If the material is not removed, civil penalties can be imposed, including fines of up to $111,000 for individuals and $555,000 for companies, on those who posted it and on the provider of the service where it appears.

There is no doubting the Act’s significance in protecting all Australians. However, the vulnerability of children to negative online experiences was a specific focus of the reforms. In 2018, the Commissioner found that 81% of children were online by the age of 5, and 99% of parents with children aged 2 to 17 reported having an internet connection in the home.1 The Commissioner also reported that 6% of pre-schoolers (aged 2 to 5) access social media, while 20% access multiplayer online games.2

One way in which this legislation addresses children’s interactions with social media services is through the implementation of a set of basic online safety expectations (BOSE), which set out community-led expectations and best practice for social media services to prevent online harms.3 The Commissioner has the capacity to require social media services to report (both publicly and to the regulator) on their actions to comply with the BOSE, and to impose financial penalties for failing to meet these reporting obligations. The estimated cost to businesses of uplifting online safety practices and producing transparency reports is $178,000 per annum.4

In November 2021, Facebook had 17 million active Australian users, WhatsApp recorded 12 million, Instagram 10 million, and Snapchat 6.4 million.5 To date, legislative measures to address the online safety risks involved with the use of these apps have not kept pace with the rapid rate of technological change and the emergence of new platforms and services. The Act aims to be device and platform neutral, and to be flexible enough to respond to future changes in technology, industry practices and user habits.

Who will the Act impact?

The Act will not only impact the way in which Australian end-users interact with online platforms, but also how online service providers deliver services on those platforms. Specifically, the Act will impact providers of:

  1. social media services, including social networks, media sharing networks, discussion forums and consumer review networks;
  2. relevant electronic services, including email services, instant messaging services, SMS and MMS services, chat services, online games where end-users can play against each other and online dating services;
  3. designated Internet services, such as websites (unless a service is otherwise considered a social media service or a relevant electronic service);
  4. search engine services, including software-based services designed to collect and rank information on the world wide web in response to user queries (excluding search functionality within platforms where content or information can only be surfaced from that which has been generated/uploaded/created within the platform itself);
  5. app distribution services, meaning all providers of app distribution services, but excluding links to an app distribution service and downloads of apps from third-party websites;
  6. hosting services, all hosting services providers which host stored material in Australia (for example, where a service has data centres located in Australia);
  7. internet carriage services, meaning all internet service providers who provide internet access to customers in Australia, including those that host stored material in Australia; and
  8. manufacturing, supplying, maintaining or installing equipment, including mobile phones, laptops, tablets, internet-enabled devices (such as smart TVs and gaming consoles), immersive technologies (such as virtual reality headsets) and equipment used to connect to the internet, such as wi-fi routers; this section of the online industry includes manufacturers of these devices, as well as businesses and retail outlets that install, sell and/or repair and maintain such devices.

What are the Commissioner’s powers?

The Act grants the Commissioner a range of enforcement powers, including civil penalties, formal warnings, infringement notices, enforceable undertakings and injunctions. The Commissioner also has powers to:

  1. obtain information from a social media service, relevant electronic service or designated internet service about the identity or contact details of an end-user using an anonymous account, if the Commissioner has reasonable grounds to believe that the information or contact details are relevant to the operation of the Act;6
  2. conduct an investigation and summon a person to attend before the Commissioner to produce documents or to answer questions under oath or affirmation;7 and
  3. do all things the Commissioner considers necessary or convenient to be done for or in connection with the performance of its functions.8


This article contains only a snapshot of Australia’s online safety reforms that come into effect in 2022. The Act and subsequent industry codes are far-reaching; it is important for online service businesses to familiarise themselves with their obligations under the Act, to ensure their practices are up to date, and to be positioned to respond appropriately to any urgent take-down notice.

For more information please refer to the eSafety Commissioner website here and if you would like legal advice on how the Act might impact you or your business, please contact Iain Freeman.

Disclaimer – the information contained in this publication does not constitute legal advice and should not be relied upon as such. You should seek legal advice in relation to any particular matter you may have before relying or acting on this information. The Lavan team are here to assist.
Iain Freeman
James Barrett
Senior Associate
Cyber & Data Protection


[1] eSafety Commissioner, State of Play – Youth, Kids and Digital Dangers, (Sydney: eSafety Commissioner), p8, available at: https://www.esafety.gov.au/sites/default/files/2019-10/State%20of%20Play%20-%20Youth%20kids%20and%20digital%20dangers.pdf.

[2] eSafety Commissioner, Digital Parenting – Supervising pre-schoolers online, (Sydney: eSafety Commissioner), available at: https://www.esafety.gov.au/about-us/research/digital-parenting/supervising-preschoolers-online.

[3] Online Safety Act 2021 (Cth), Division 2.

[4] Estimate provided in the Explanatory Memorandum, Online Safety Bill 2021 (Cth). This assumes a projected 6 large businesses, 60 medium businesses and 50 small businesses accessible in Australia at the conclusion of a 10-year period, each of which would need to produce on average one transparency report per year and undertake actions to uplift their practices.

[5] “Social Media Statistics Australia – November 2021”, SocialMediaNews.com.au, available at: https://www.socialmedianews.com.au/social-media-statistics-australia-november-2021/.

[6] Online Safety Act 2021 (Cth), Part 13.

[7] Online Safety Act 2021 (Cth), Part 14.

[8] Online Safety Act 2021 (Cth), section 28.