New law targets AI tools used to create child abuse materials

Posted on 28 Jul 2025

By Nick Place, journalist, Institute of Community Directors Australia

Urgent legislation is required to plug holes relating to AI tools harming children, the bill's proponents say.

A proposed law would create a new criminal offence targeting the use of emerging technologies to produce, train or facilitate the generation of child sexual abuse material.

Independent MP Kate Chaney introduced the private member’s bill on Monday morning in federal Parliament, arguing for an urgent legislative response to what she described as the “alarming growth” in AI-generated abuse content, including deepfakes and child-like AI personas.

Speaking to the House of Representatives, Chaney said the government needed to pick up speed on dealing with this issue. “We need an urgent response to this from the government,” she said. “While a holistic view is important, we need to plug the holes in the current legislation to deal with these emerging harms.

“The government has not yet responded to last year’s statutory review of the Online Safety Act and the government acknowledged in 2023 that existing laws likely do not adequately prevent AI-facilitated harms before they occur.”

She added that the bill was an uncomplicated adjustment to existing laws.

“We need to do everything we can to stop the alarming growth in child sexual abuse material,” Chaney said.

“Deepfakes, AI-generated child sexual abuse material and child-like AI personas can be created by these sophisticated tools and are inundating law enforcement with more material.

“Right now, predators can access and download these sickening tools and train them to generate child sexual abuse materials and then delete them before detection, to evade existing possession laws.”

Speaking in the House, Chaney said the pace of technological advancement was outstripping current regulatory frameworks, creating a dangerous gap in Australia’s criminal law.

“A single image is already illegal, but the capacity to infinitely produce, delete and reproduce abusive images through AI tools represents a new and urgent threat,” she said. “This is by no means the only legislation that needs to be passed on this topic, but this would plug an immediate hole in the criminal code.”


The legislation follows a national roundtable on child safety and artificial intelligence held in Canberra last week and convened by the International Centre for Missing and Exploited Children (ICMEC) Australia.

The roundtable brought together government agencies, law enforcement, technology experts and child protection advocates to address the growth of AI-enabled child sexual abuse, including synthetic material, deepfakes, automated grooming, and the creation of child-like AI personas.

“AI is being weaponised to harm children, and Australia must act,” ICMEC Australia chief executive Colm Gannon said. “This roundtable represented a pivotal moment for child protection. We need urgent action to ensure these technologies do not outpace our systems of prevention and justice.”

While the roundtable primarily aimed to raise awareness of the issue and its impact on children and families, it also prompted concrete legislative action through Chaney’s bill.

Developed in collaboration with ICMEC Australia, the bill seeks to amend the Criminal Code Act 1995 to create offences related to the misuse of AI and data for child abuse purposes.

The proposed law would make it a criminal offence to download or possess tools intended to generate child sexual abuse material. It would also criminalise the collection, scraping or distribution of data with the intention of training such tools. A public benefit defence would be available for law enforcement and intelligence personnel. Penalties could be up to 15 years' imprisonment.

Chaney said she had met with the Attorney-General's office, which agreed the gap needed to be addressed.

“This bill will limit the ability to generate on-demand, unlimited child sexual abuse material at scale, often tailored to specific preferences, including the use of real children’s images and details,” Chaney said.

“The generation of this material impacts law enforcement’s ability to investigate offences due to the difficulty in distinguishing synthetic material from real material, diverting precious resources from investigating the sexual abuse and exploitation of real children.

“This bill represents a proactive and targeted legislative response, and we urge the government to act.”

The roundtable followed the release of a study in November 2024 by ICMEC Australia and the University of Queensland, which examined the economic cost of online child sexual exploitation and abuse in Australia.

Authored by Jonah R. Rimer and Ethan Callaway, the study identified a major gap in available data on the financial and societal impacts of various forms of online child abuse, including livestreaming, grooming, sextortion and child sexual abuse tourism.

The report urged governments and industry to break down silos, share data and improve knowledge in order to address the growing threat.
