Protecting Aussie kids online: eSafety Commissioner questions tech giants on age assurance

Australia’s eSafety Commissioner has requested that leading social media and messaging platforms reveal how many Australian children are using their services and detail the age verification measures they have implemented to enforce age restrictions.

eSafety sent questions to Google’s YouTube, Meta’s Facebook and Instagram, TikTok, Snap, Reddit, Discord, and Twitch using new transparency powers under the updated Basic Online Safety Expectations Determination.

The amended Determination expands eSafety’s authority to require industry reporting and sets expectations for companies to take steps to keep users safe, including being transparent with the regulator.

The information requests coincide with National Child Protection Week and eSafety’s launch of new online safety resources to protect children and help parents and carers address online risks.

eSafety Commissioner Julie Inman Grant emphasised the need for a multi-pronged approach to online safety. She noted that while imposing age limits is an option, better information is necessary to understand what will be effective and to support children in strengthening digital resilience and critical reasoning skills.

Inman Grant also pointed to the wider national debate over social media's impact on children and teenagers.

Beyond gathering data on how many children are using these platforms and their ages, the information requests seek to assess each platform's age assurance readiness and the effectiveness of its existing age enforcement measures.


“We are having a really important conversation in this country right now about the potential damaging effects social media might be having on our children, and our research shows that almost two-thirds of 14-17 year-olds have viewed potentially harmful content in the past year, including drug use, self-harm and violent images. But we also know that teens get many benefits from social media,” Inman Grant said.

“A key aspect of this conversation is having some solid data on just how many kids are on these platforms today and the range of their ages, which is a key focus of these requests.”

“These resources empower parents and educators to support children when things go wrong online. Open conversations about technology use and encouraging help-seeking behaviour are crucial,” she said.

The eight platforms have 30 days to respond to the eSafety Commissioner, and eSafety will summarise the findings in a public report.

Adrianne Saplagio is a Content Producer at Comms Room, where she combines her passion for storytelling with her expertise in multimedia content creation. With a keen eye for detail and a knack for engaging audiences, Adrianne has been instrumental in crafting compelling narratives that resonate across various digital platforms.
