The bill inserts a new Subdivision DA into the Criminal Code that makes it a serious offence to download, access, supply, enable access to, or offer technology whose sole or dominant purpose is to create child abuse material — and to collect, scrape or distribute data with the intention of training or creating such technology. Both offences require that the conduct occurs using a carriage service and attract a maximum penalty of 15 years' imprisonment; the carriage‑service element is subject to absolute liability.
The measure also sets out limited defences: conduct of "public benefit" (enforcement, monitoring, administration of justice, or scientific/medical/educational research approved in writing by the AFP Minister) and reasonable conduct by law‑enforcement, intelligence or security officers acting in the course of duty. The bill tightens criminal exposure for AI developers, data aggregators and online intermediaries while carving a narrow, administratively‑gated research exception—creating compliance, evidentiary and operational questions for industry and investigators alike.
At a Glance
What It Does
Creates two new offences: (1) accessing, supplying or offering technology whose sole or dominant purpose is to create child abuse material, and (2) collecting, scraping or distributing data with the intention of training or creating such technology. Both offences require use of a "carriage service" and carry a maximum penalty of 15 years; the carriage‑service element is treated as absolute liability.
Who It Affects
AI developers, model distributors, data aggregators and web‑scraping services, online marketplaces and hosting platforms, and carriage‑service providers (internet/telecom intermediaries). It also directly involves law‑enforcement and the AFP Minister through a ministerial approval route for research exceptions.
Why It Matters
The bill targets technologies and training data as criminal vectors for child sexual abuse material, not just final images or videos—expanding criminal law into development and distribution activities. The absolute‑liability carriage requirement and narrow, minister‑approved research defence raise compliance and jurisdictional questions for entities operating across borders.
What This Bill Actually Does
The bill adds a focused criminal scheme aimed at technologies that are designed to produce child sexual abuse material. One offence covers acts performed in relation to the technology itself—downloading, accessing, supplying, enabling access, or offering to supply—so long as the technology’s sole or dominant purpose is to create child abuse material and the act occurs using a carriage service.
A companion offence targets the data side: collecting, scraping or distributing data via a carriage service when the person intends that data be used to train or create such a technology. Both offences carry a maximum sentence of 15 years.
A technical but important drafting choice is that the carriage‑service element attracts absolute liability. Practically, that means prosecutors need not prove any fault element (such as knowledge or recklessness) as to whether the accused used a carriage service; the other mental elements, such as intention or the purpose of the technology, remain relevant where stated.
The bill also introduces a statutory defence framework: conduct that is "of public benefit" (limited to law enforcement, monitoring compliance, administration of justice, or scientific/medical/educational research approved in writing by the AFP Minister) is excused to the extent it does not exceed what is necessary, and law‑enforcement/intelligence/security officers acting reasonably in their duties are likewise protected. The statute places an evidential burden on defendants to raise these defences.

Operationally, the wording captures a wide range of artifacts: it reaches software, models, hosting access, offers on marketplaces, and data flows used to train models.
The "sole or dominant purpose" threshold targets tools primarily designed to generate abusive material but leaves open edge cases for multipurpose systems. The research carve‑out requires written AFP Minister approval for academic or medical projects, which creates an administrative gate for legitimate researchers and a discretionary control point for the Commonwealth.

As drafted, the provision interfaces with existing CSAM offences by shifting focus upstream to enablers: those who build, distribute or prepare data for generative models.
That upstream focus forces affected organisations to reassess due diligence, content‑moderation, contractual terms with suppliers, and cross‑border data practices, because carriage‑service use and offers to supply can occur across complex supply chains.
The Five Things You Need to Know
1. The bill creates two distinct offences (technology conduct and data conduct), each carrying a maximum penalty of 15 years' imprisonment.
2. Both offences require that the conduct occur using a "carriage service," and the statute makes the carriage‑service element a matter of absolute liability.
3. The criminal prohibition applies where the technology's "sole or dominant purpose" is to create child abuse material, drawing a purpose‑based line rather than outlawing all generative tools.
4. Collecting, scraping or distributing data becomes an offence only when done with the intention of training or creating technology whose sole or dominant purpose is to produce child abuse material.
5. Defences are narrow: public‑benefit activities are permitted only for enforcement, monitoring, justice administration, or AFP Minister‑approved scientific/medical/educational research, and defendants carry an evidential burden to raise them.
Section-by-Section Breakdown
New offence cluster targeting technologies and data used to create child abuse material
The bill introduces a discrete Subdivision DA that groups together three items: an offence directed at handling technology for creating child abuse material, an offence directed at data collection and distribution for training such technology, and a tailored defence regime. Grouping these provisions signals Parliament’s intent to treat enablers—both code and datasets—as a distinct category of criminal conduct, not merely extensions of image‑level CSAM offences.
Offence for downloading, supplying or enabling access to offending technology
This section criminalises a range of interactions with the technology itself—download, access, supply, enabling access and offering to supply—so long as the act occurs using a carriage service and the technology’s sole or dominant purpose is to generate child abuse material. Including offers and enabling activity targets marketplaces and intermediaries; the practical effect is that firms that list, broker or provide access to such models can fall within the scope even if they never host the final images.
Offence for collecting, scraping or distributing data with intent to train such technology
This provision focuses on the dataset side: anyone who collects, scrapes or distributes data via a carriage service with the intention that it be used to train or create technology whose sole or dominant purpose is to produce child abuse material commits an offence. The mens rea requirement centres on intention to train or create; proving that intention—especially where datasets are multi‑purpose—will be a key evidentiary challenge for prosecutions and a compliance headache for data services.
Limited defences and who bears the burden to raise them
The defence framework permits conduct that is "of public benefit" (strictly defined as enforcement, monitoring, justice administration, and research approved in writing by the AFP Minister) and protects law‑enforcement, intelligence and security officers acting reasonably in their duties. The bill makes the question whether conduct is of public benefit one of fact, and places an evidential burden on defendants to adduce evidence that they fall within the carve‑outs, rather than reversing the full legal burden.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Children and victim‑protection organisations — by expanding criminal liability upstream, the bill targets creators and enablers of abusive material rather than only final distributors, potentially reducing the supply of new exploitative content.
- Law‑enforcement and intelligence agencies — the statute explicitly protects officers acting in the course of duties and provides a ministerially‑approved research route, giving agencies formal legal clarity for investigative techniques and controlled testing.
- Platforms and content‑safety teams seeking clearer prohibitions — the law defines a specific unlawful category (technology whose sole or dominant purpose is to create child abuse material), which can help platforms distinguish clearly illicit tools from legitimate services when drafting acceptable‑use policies.
Who Bears the Cost
- AI developers, model hosts and small research labs — the intent and purpose tests, plus the absolute‑liability carriage element, expose creators and distributors to severe criminal penalties and may force costly compliance, code audits, and legal reviews.
- Data aggregators and web‑scraping services — collecting or passing datasets could attract criminal liability if intent to train disallowed models can be alleged, increasing due‑diligence costs and contractual controls on buyers and downstream users.
- Online intermediaries and marketplaces — offering access or enabling others to use models may be captured even when core hosting happens elsewhere, creating risk for platforms and hosting providers and potentially prompting pre‑emptive blocking or deplatforming.
- Academic and medical researchers — the requirement for AFP Minister written approval for research to qualify as "public benefit" adds an administrative barrier and legal uncertainty that could deter legitimate projects or delay critical studies.
Key Issues
The Core Tension
The central dilemma is balancing the public interest in shutting down technologies and datasets that enable child sexual abuse material against the need to preserve legitimate research, law‑enforcement testing and multipurpose technological development. The bill addresses the first clearly, but it does so by imposing a narrow, administratively gated path for the latter, creating a real risk of over‑criminalisation or of chilling beneficial activity.
The bill solves a clear policy gap by moving upstream to tools and training data, but it leaves several implementation questions unresolved. The "sole or dominant purpose" standard will be litigated: multipurpose models that can generate both lawful and unlawful outputs are common, and prosecutors must show purpose without necessarily being able to prove the inner intent of developers.
Similarly, proving the subjective "intention" to train a model using particular datasets will often depend on circumstantial evidence—contracts, communications and access logs—creating high evidentiary burdens and potential for contested prosecutions.
The absolute liability for the carriage‑service element reduces a defendant’s ability to argue lack of fault about how content traversed networks, but it also broadens exposure for entities that unknowingly route or store material via a carriage service. The research carve‑out is narrow and requires written AFP Minister approval for scientific, medical or educational projects, creating a discretionary gatekeeper and potential chilling effect on benign research.
Finally, practical questions remain about how the offence interacts with extraterritorial operations, how encryption or offline development is treated, and whether existing civil‑regulatory remedies (takedown, industry codes) will be sufficient or duplicated by criminal enforcement.