Online Safety Amendment (Fix Our Feeds) Act 2026 — platform opt‑out and digital duty of care

Requires platforms to let Australian users opt out of recommender-driven content and imposes reporting, risk‑assessment, transparency and personal duties on large providers.

The Brief

This bill amends the Online Safety Act 2021 in two linked ways. First, it requires providers of social media services to let Australian end-users opt out of receiving recommended (algorithmically prioritised or personalised) content at any time, and it creates a complaints and civil‑penalty regime to enforce that requirement.

Second, it creates a new digital duty of care for regulated online services, with heightened obligations for “large providers,” including reporting on Australian user numbers, mandatory risk assessments and mitigation plans, annual transparency reports, and personal duties on key personnel.

The package shifts substantial operational and compliance obligations onto large platforms: measurable user‑count thresholds trigger enhanced duties, failure to comply attracts severe civil penalties, and the Commissioner gets stronger information‑gathering and audit powers. For compliance officers, product leads and legal teams at global platforms, the bill requires near‑term changes to data collection, recommender design, reporting pipelines and executive governance.

At a Glance

What It Does

The bill forces social media services to implement an opt‑out from recommended content and defines recommended content as material prioritised or personally suggested by a system. Separately, it establishes baseline reporting for all regulated online services and layered obligations for large providers: a duty of care, risk assessment and mitigation, transparency reporting, end-user privacy defaults and targeted‑ad opt‑outs.

Who It Affects

Large international and domestic platforms that meet user thresholds (10% of Australia’s population, or absolute counts of 2.6 million users or 630,000 child users) are the primary targets; all regulated service providers must publish user counts. Key personnel of large providers face notification and personal conduct requirements. Researchers, regulators and civil society groups will gain access to new data and public registers.

Why It Matters

This creates statutory obligations that directly target recommender systems and executive accountability—areas previously governed largely by industry practice. The act embeds transparency and auditability into platform oversight, increases the legal and reputational stakes for senior executives, and gives regulators tools to compel data and run compliance audits.

What This Bill Actually Does

Schedule 1 amends the Online Safety Act to require social media providers to give every Australian end-user the ability to opt out of “recommended content” — a defined category that covers both algorithmic prioritisation and personalised suggestions. The schedule builds a complaints pathway so end-users can complain to the Commissioner if they cannot exercise that opt‑out.

Noncompliance with the opt‑out requirement is subject to a civil penalty calculated as the greater of 100,000 penalty units or 10% of annual turnover.
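
For a sense of scale, here is a minimal sketch of that formula in Python. The penalty-unit dollar value is an assumption for illustration only (the Commonwealth penalty unit is set elsewhere and indexed over time), and the turnover figure is a placeholder.

    # Illustrative only: the bill's penalty is the greater of 100,000 penalty
    # units or 10% of annual turnover. PENALTY_UNIT_AUD is an assumed value,
    # not taken from the bill.
    PENALTY_UNIT_AUD = 330

    def max_civil_penalty(annual_turnover_aud: float) -> float:
        fixed_component = 100_000 * PENALTY_UNIT_AUD   # 33,000,000 at the assumed rate
        turnover_component = 0.10 * annual_turnover_aud
        return max(fixed_component, turnover_component)

    # A provider with AUD 2 billion turnover: max(33m, 200m) = AUD 200 million.
    print(max_civil_penalty(2_000_000_000))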

Schedule 2 creates a new Part 2A that divides obligations between all providers of regulated online services and large providers. All providers must publish specific user metrics (monthly averages over a six‑month reporting period), starting with a reporting milestone tied to 1 July 2026.
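
As a rough sketch of that reporting cycle, assuming a simple mean over the six monthly counts and a 30-day publication window (the exact averaging method is left to the Act and the rules made under it):

    from datetime import date, timedelta

    # Rough sketch of the six-monthly reporting arithmetic; the averaging
    # method and deadline handling are illustrative assumptions.
    def average_monthly_users(monthly_counts: list[int]) -> float:
        """Mean monthly Australian end-user count over one six-month period."""
        assert len(monthly_counts) == 6
        return sum(monthly_counts) / len(monthly_counts)

    def report_due_date(period_end: date) -> date:
        """Reports fall due within 30 days of the reporting period ending."""
        return period_end + timedelta(days=30)

    print(average_monthly_users([2_100_000, 2_150_000, 2_200_000,
                                 2_250_000, 2_300_000, 2_350_000]))
    # Example: a period ending 31 December 2026 is reportable by 30 January 2027.
    print(report_due_date(date(2026, 12, 31)))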

A provider becomes a “large provider” if it reaches either a relative threshold (10% of Australia’s population) or specified absolute user counts, and the Minister publishes a formal designation.

Large providers must perform regular risk assessments that look explicitly at recommender systems, moderation, advertising practices and harms including scams, child safety and effects on democratic processes. They must publish those assessments and a mitigation plan each year and implement reasonable steps to carry out the plan.
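
A minimal sketch of the candidacy test, using the thresholds listed later in this article; meeting a threshold only makes a provider a candidate, since designation still requires the Minister's public notice. The population figure is an assumption for illustration.

    # Illustrative threshold check only; "large provider" status follows
    # Ministerial designation, not this calculation.
    AUSTRALIAN_POPULATION = 26_000_000  # assumed figure for illustration

    def meets_large_provider_threshold(monthly_users: int,
                                       monthly_child_users: int) -> bool:
        relative = monthly_users >= 0.10 * AUSTRALIAN_POPULATION
        absolute_users = monthly_users >= 2_600_000
        absolute_child_users = monthly_child_users >= 630_000
        return relative or absolute_users or absolute_child_users

    print(meets_large_provider_threshold(2_700_000, 400_000))   # True
    print(meets_large_provider_threshold(1_000_000, 700_000))   # True (child count)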

The Commissioner may issue standards to shape the required content and form of risk assessments and mitigation measures.

The bill also requires large providers to produce an annual transparency report with specific metrics (design, child access, scams, content moderation, advertising, monthly user counts), maintain a Public Information Register with those materials, and make research datasets available to qualifying non‑commercial Australian researchers where doing so does not expose protected or personal information or create security risks. Compliance audits are authorised, and the Commissioner is granted an express power to compel documents and interviews with a minimum 14‑day compliance window.

Finally, the bill introduces end-user privacy protections for large providers: privacy settings must default to maximum privacy and users must be able to opt out of targeted advertising.

Key personnel notifications and signed fit‑and‑proper declarations (including Australian residency requirements) are required within tight timelines, and an individual executive who fails their duties faces a criminal‑style penalty unit sanction separate from corporate civil penalties.

The Five Things You Need to Know

1

The bill compels social media services to implement an opt‑out for recommended content and makes failure to comply punishable by the greater of 100,000 penalty units or 10% of the provider’s annual turnover.

2

A provider is potentially a “large provider” if it serves at least 10% of Australia’s population, or 2.6 million+ Australian users, or 630,000+ child users; the Minister must formally designate providers as large providers by public notice.

3

All regulated providers must publish, on a public website, the average monthly number of Australian end-users and the average monthly number of Australian child end-users over the last six months, with the first statutory reporting milestone tied to 1 July 2026 and subsequent reports due within 30 days of each six‑month period ending.

4

Large providers must run risk assessments (at least every three years and after certain triggers), produce a risk mitigation plan, publish the assessment and plan annually, and implement reasonable steps to carry out mitigation measures including changes to recommender systems, moderation, advertising and interface design.

5

The Commissioner gains strong investigatory tools: the power to compel documents and interviews (with a minimum 14‑day compliance window), to require compliance audits by independent auditors, and to demand publication of risk assessments and transparency reports; key personnel who fail their duties face a penalty of 500 penalty units.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Schedule 1, sections 1–2

Define recommended content and require opt‑out capability

The bill inserts a functional definition of recommended content into the Act to capture both prioritisation and personalised suggestion systems. Practically, that means recommender algorithms, personalised feeds and similar curation layers fall squarely under the new rule. Platforms must implement an opt‑out mechanism for all Australian end-users — the provision does not prescribe UI specifics, but it sets the legal obligation the interface must deliver.
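
The provision is technology-neutral, but one plausible implementation pattern is to gate the recommender behind a per-user preference and fall back to a non-personalised ordering, for example reverse-chronological posts from followed accounts. The sketch below is purely illustrative; none of the names come from the bill.

    # Hypothetical sketch of serving a feed that honours a recommender opt-out.
    # The bill does not prescribe this design; objects and names are illustrative.
    def build_feed(user, candidate_posts, recommender):
        if user.opted_out_of_recommended_content:
            # Non-personalised fallback: followed accounts only, newest first.
            followed = [p for p in candidate_posts if p.author_id in user.following]
            return sorted(followed, key=lambda p: p.created_at, reverse=True)
        # Otherwise the recommender may prioritise and personalise as usual.
        return recommender.rank(user, candidate_posts)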

Schedule 1, sections 3–4 (Division 4A)

Complaints pathway and enforcement for opt‑out failures

If an Australian end-user believes they cannot exercise the opt‑out, they can lodge a complaint with the Commissioner. The Commissioner has discretion to investigate and to terminate investigations under existing investigative powers. This creates an administrable enforcement flow: individual complaints can trigger targeted inquiries or larger investigations into platform compliance with opt‑out obligations.

Schedule 1, section 5 (Part 8A — 104B)

Civil penalties for not enabling opt‑out

Section 104B places a clear, high‑stakes financial deterrent on providers that fail to offer opt‑out: a civil penalty set as either a fixed high number of penalty units or a percentage of turnover, whichever is larger. That formula aligns financial exposure with firm scale, increasing the incentive for global platforms to comply and for counsel to treat the obligation as a major legal risk to manage.

Schedule 2, sections 5A–5C and 28B–28C

Definitions, designation and dual layers of obligation

The bill expands definitions so the Act covers designated internet services, electronic services and social media, then creates a two‑tier compliance model. All providers must meet reporting duties; only those that meet the numeric thresholds and receive Ministerial designation are subject to the full suite of duty‑of‑care obligations. The designation step matters operationally because it triggers additional governance, disclosure and personal liability requirements.

Schedule 2, Divisions 6–7 (28G–28J)

Risk assessments and mitigation plans that target algorithmic systems

Large providers must conduct documented risk assessments that explicitly consider recommender systems, moderation, advertising selection and data practices, and must publish those assessments and a risk mitigation plan annually. The bill authorises standards to specify the assessment format, and lists mitigation options—from interface changes to algorithm testing and targeted child‑safety measures—so compliance teams will need cross‑functional processes between product, safety, legal and engineering to satisfy both the substance and the publication requirements.

Schedule 2, Divisions 8–9 (28L, 28N, 28P, 28Q)

Transparency reporting, research access and end-user privacy controls

Large providers must publish an annual transparency report with specified metrics (design and functioning, child access, scams, moderation, advertising, monthly user numbers) on a defined timetable, and maintain a searchable Public Information Register that aggregates published materials. Researchers who meet narrow non‑commercial, Australia‑based criteria can request access to established research datasets, and providers must make them accessible unless privacy or security concerns apply. Separately, privacy settings must default to maximum and users must be able to opt out of targeted advertising — changes that will directly affect adtech operations and revenue models.
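
Read in engineering terms, the privacy-defaults obligation means a new Australian account should start from the most protective settings, with a working opt-out switch for targeted advertising. The field names below are hypothetical, not drawn from the bill.

    from dataclasses import dataclass

    # Hypothetical defaults object: privacy fields default to the most
    # protective value, and targeted advertising carries a user opt-out.
    @dataclass
    class AustralianUserSettings:
        profile_visibility: str = "private"
        searchable_by_contact_details: bool = False
        location_sharing: bool = False
        targeted_ads_opt_out: bool = False  # user can switch this on at any time

        def can_serve_targeted_ads(self) -> bool:
            return not self.targeted_ads_opt_out

    settings = AustralianUserSettings(targeted_ads_opt_out=True)
    print(settings.can_serve_targeted_ads())  # False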

Schedule 2, Division 5 and enforcement (28F, 28R, 194A–194B)

Key personnel duties, auditing and regulator information powers

Large providers must notify the Commissioner when key personnel change and supply signed declarations (fit‑and‑proper, residency, legislative‑rule requirements) within 14 days. The Commissioner can require documents, compel interviews and retain copies, and appoint independent auditors to perform compliance audits. These provisions make governance processes and executive due diligence focal points of regulatory compliance, since personal penalties for executives and mandated audits raise both legal and board‑level stakes.

Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Australian end-users who want control over algorithmic feeds — the statutory opt‑out and default maximum privacy settings give users a concrete technical lever to reduce algorithmic curation and targeted advertising.
  • Children and child‑safety advocates — the law forces platforms to count child use, assess child‑specific risks, include child‑safety measures in mitigation plans and report child‑related metrics annually, increasing visibility into harms and protective measures.
  • Academic and non‑profit researchers in Australia — qualified researchers can access established research datasets and, where technically feasible, receive data in real time for non‑commercial studies, improving independent study of platform effects.
  • Regulators, consumer groups and journalists — a Public Information Register, mandated transparency metrics and compulsory risk assessments create new, searchable sources of information for oversight, research and public accountability.
  • Public health and democracy advocates — required attention to electoral impacts, scams and gender‑based violence in risk assessments raises the profile of these systemic harms and channels provider resources toward mitigation.

Who Bears the Cost

  • Large platform providers — they face system redesigns (opt‑out and privacy defaults), expanded data collection and publication pipelines, costs for risk assessments, mitigation measures, and potential high civil penalties linked to turnover.
  • Key personnel and senior executives of designated large providers — they must provide fit‑and‑proper declarations, be resident in Australia, and face a 500‑penalty‑unit sanction for failing to take reasonable steps to prevent material contraventions.
  • Adtech and third‑party advertisers — the opt‑out of targeted advertising and stricter advertising‑system reporting may reduce targeting precision and increase compliance costs for ad networks and agencies.
  • Mid‑sized providers near thresholds — these firms will need to resource additional reporting and governance work to avoid sudden designation or to comply once designated, creating a compliance cliff for services growing fast in Australia.
  • The Commissioner and ACMA — enforcement requires staff, technical capability and procedures to review large volumes of published data, run audits and handle compelled disclosures; implementation will need budget and capability increases.

Key Issues

The Core Tension

The central dilemma is balancing stronger individual control, transparency and protection from platform harms against the operational, legal and innovation costs of forcing platforms to re‑engineer recommender systems and governance. The bill seeks to shift the risk calculus toward enduser safety and oversight, but that same shift raises questions about measurement, platform functionality, executive accountability across borders and the regulator’s capacity to turn obligations into meaningful, proportionate outcomes.

The bill leaves critical implementation detail to legislative rules and Commissioner standards. That delegation helps tailor technical requirements to fast‑moving systems but also creates near‑term uncertainty about what risk assessments and transparency reports must contain.

Platform teams will need to monitor both the Act and the subsequent rules closely and prepare flexible reporting and documentation systems.

Several practical frictions arise. First, user counting and the threshold formula invite gaming and measurement disputes (definition of an “Australian enduser,” session‑based counts vs. unique accounts, bots and shared devices).

Second, mandating opt‑outs for recommender systems risks reducing platforms’ ability to surface safety interventions that rely on algorithmic signals; firms may respond by narrowing recommendations rather than redesigning models, producing unforeseen changes in content discovery. Third, the residency and signed declaration requirement for key personnel raises international hiring and governance questions for global firms and may conflict with broader corporate governance or immigration realities.

Finally, the enforcement architecture assumes the Commissioner and ACMA can operationalise new audit and disclosure regimes at scale. The regime’s deterrent effect depends on credible auditing and timely, well‑targeted enforcement; without resourcing and clear standards, the rules risk producing box‑ticking compliance or litigation over delegated details rather than substantive harm reduction.
