On 30 August, the eSafety Commissioner, Julie Inman Grant, issued legal notices to tech giants Google, Apple, Meta, Microsoft, Snap and Omegle asking what they are doing to protect their users, specifically with regard to child exploitation material.

The call to action was prompted by the expectations set out in the Online Safety Act 2021, under which the government outlines the Basic Online Safety Expectations (BOSE) these online organisations must meet to operate in Australia. The Act demands transparency and accountability. Each company has been tasked with reporting on, and creating codes of conduct for, content including child exploitation material, pornography, violence, illegal substances, and acts of crime or terrorism. The Online Safety Act 2021 was designed to protect Australians, particularly children, online; should the companies fail to respond within 28 days, they could face fines of up to $5 million per day.

Most have declared that they have protections and policies in place, yet child exploitation, cyberbullying and other serious crimes are still being carried out on these platforms. The Commissioner believes that platforms offering live streaming, anonymity and encryption provide an ideal environment for predators and criminals, and has been clear that the industry as a whole needs to do better.

Six industry associations representing companies including Google, Apple, Meta and Microsoft released their draft codes of conduct soon after the legal notices were issued. These associations have been working on the codes since the Online Safety Act was passed on 23 June 2021. The draft codes are stage 1 of the process, addressing the most harmful content (child exploitation and terrorism); once they have been put through the public consultation process (www.onlinesafety.org.au), which ends on October 2, they will be submitted for registration with the eSafety Commissioner. Stage 2 will then address all other illegal and harmful content. Under the codes, companies will be required to report harmful content to the eSafety Commissioner and relevant law enforcement authorities within 24 hours.

Another area of focus for the Online Safety Act is the rise of the Metaverse and the ability of banks to create new forms of trade and commerce within Metaverse platforms. Some of the most recognised Metaverse platforms are Fortnite, Minecraft, Roblox and Decentraland.

While these platforms are marketed to children and young adults, commercialising the Metaverse could create more opportunities for adults and children to intermingle, which is a risk.

On September 13, Julie Inman Grant spoke at the Trans-Tasman Business Circle webinar led by ANZ Banking Group's Neil Dobson. Dobson outlined the possibilities for customers to do business in the Metaverse but acknowledged, "All the real-world things people expect of governments, regulators and convention need to exist in the metaverse as well."

Inman Grant agreed that this emerging industry needs a robust, proactive and accountable approach to user safety, especially when it comes to the safety of children, stating, "the burden of safety should never fall upon the user."

There is still a way to go with the execution of the Online Safety Act 2021, and regulating online content to make it safe for all users will be an ongoing and ever-evolving task. However, with eSafety Commissioner Julie Inman Grant at the helm, you can be sure she will be fighting for it every step of the way.