The report calls for increased accountability from social media companies through a digital duty of care and the integration of safety-by-design principles into all platform technologies.
The report release coincided with International Justice Mission’s (IJM) Australian Parliament House event where Australian and Filipino leaders with lived experience of online child sexual abuse and exploitation spoke directly to parliamentarians and government agencies on the need for a stronger Online Safety Act.
Both the Minister for Communications, the Hon Michelle Rowland MP, and the Shadow Minister for Communications, the Hon David Coleman MP, attended the event and expressed support for the establishment of a digital duty of care.
IJM Australia CEO David Braga said, “Child sexual abuse material is produced and distributed through everyday technology. It is available to be live streamed on common social media platforms, stored on cloud storage services, and created on camera-enabled devices. This needs to stop.”
The Committee heard evidence from the Australian Federal Police (AFP) that the growth of the internet and social media platforms has led to the proliferation of child sexual abuse material.
Reports of online child exploitation to the AFP nearly tripled between 2018 and 2023, from 14,285 to 40,232, mirroring growth in reports to the National Center for Missing and Exploited Children, which recorded an 87% increase from 2019 to more than 36 million reports in 2023.
International Justice Mission’s 2023 report, Scale of Harm, found that a child growing up in the Philippines has a 1 in 100 chance of being trafficked to produce new or live-streamed child sexual abuse material at the direction of a Western offender.
“The Joint Select Committee’s recommendations send a strong signal to social media companies that they must ensure their product designs and business models do not facilitate harm to children, including online sexual abuse or exploitation,” Braga said.
“By emphasising safety by design, the onus on technology companies switches from being reactive to proactive. Social media companies operating in Australia should be asking themselves what they need to do to prevent existing, new and live streamed child sexual abuse material from being created, stored or distributed on their products and services. And then do it.
“A comprehensive digital duty of care must include obligations for tech companies to undertake risk assessments and mitigation measures, and to continually monitor the effectiveness of those measures.
“We look forward to working closely with the Australian Government as they draft the parameters for this important digital duty of care.”
In a recent article, John Tanagho, Executive Director of IJM’s Centre to End OSEC, highlighted ways companies could fulfil a legal duty of care using existing tools that can prevent child sexual abuse material from being created in the first place.
The Committee also acknowledged the role social media plays in facilitating scams that victimise people in Australia, particularly older Australians and those from culturally and linguistically diverse (CALD) backgrounds, and in fuelling the trafficking and mistreatment of people fraudulently recruited into the scamming industry.
IJM Australia CEO David Braga was quoted in the Committee’s report as saying, “Social media also plays a role in facilitating the scamming industry run by organised crime and fuelled by a workforce who is often deceptively recruited by ads on social media. Individuals are trafficked across country borders and confined inside gated scam compounds. They are then forced and coerced with threats and actual violence to scam Australians…”
IJM has helped local authorities remove, care for and identify more than 400 individuals it determined to be victims of forced scamming in Southeast Asia’s scam compounds. Across APAC, IJM proactively coordinates with government agencies and diplomatic missions to help facilitate rescues and repatriation of victims, and connects with government and non-government partners to provide legal and psychosocial support.
Adrianne Saplagio is a Content Producer at Comms Room, where she combines her passion for storytelling with her expertise in multimedia content creation. With a keen eye for detail and a knack for engaging audiences, Adrianne has been instrumental in crafting compelling narratives that resonate across various digital platforms.