The Kids Online Safety Act creates a framework to protect minors on internet platforms by defining what counts as a covered platform and the harms to address. It requires platforms to implement policies and safeguards against threats of physical harm, sexual exploitation, and financial or deceptive harms targeting minors, while considering platform size and feasibility.
The bill also sets up disclosure requirements, a robust audit regime, enforcement mechanisms, and a dedicated council to advise on best practices and future actions.
In short, the Act aims to standardize protections for minors, empower parents with controls, and establish ongoing oversight to push platforms toward safer designs and accountable governance. It does not, however, impose blanket censorship or restrict legitimate expression; instead, it creates measurable safeguards, reporting, and accountability structures intended to reduce online risks for young users.
At a Glance
What It Does
Defines a covered platform and design features, and requires platforms to address harms to minors through policies, safeguards, and reporting mechanisms. Establishes an audit regime and enforcement framework.
Who It Affects
Covers internet platforms (websites, software, apps, services) used by minors under 17, their parents, and regulators (FTC, states).
Why It Matters
Creates uniform, enforceable protections for minors online, promotes accountability through audits, and guides industry practices via a dedicated council.
What This Bill Actually Does
The Act starts by defining who must comply: any platform connected to the internet that can be used by minors and that collects personal information, with specific definitions for key terms like minor, design feature, and covered platform. It then sets out the harms that platforms must work to prevent, including threats of physical violence, sexual exploitation, and the distribution or use of illegal or harmful products, as well as deceptive financial practices affecting minors.
Platforms must tailor their policies to their size and technical feasibility.
On safeguards, the bill requires platforms to provide easy-to-use protections for known minors, including limits on other users’ communications, protections against compulsive usage, and strong default privacy and safety settings. Parents get tools to manage privacy, account settings, and purchases, plus visibility into how time is spent on the platform.
If a child is involved, platforms must notify the parent and obtain verifiable parental consent where applicable, with the option to consolidate notices with COPPA compliance. For accountability, the Act imposes a mandatory annual audit by an independent third party, reporting back to the FTC. Audits cover access by minors, time spent, existing safeguards, parental tool usage, and how the platform handles harm reports.
The bill also creates a Kids Online Safety Council under the Secretary of Commerce to issue reports and recommendations and to promote best practices. Enforcement runs through the FTC and state attorneys general, with cooperation and limited state action where appropriate. Finally, the Act clarifies relationships with existing laws, preserves free expression, and sets an 18-month effective date, allowing platforms time to implement the required safeguards, parental tools, and audit capabilities.
The Five Things You Need to Know
The bill creates a defined set of ‘covered platforms’ and ‘design features’ that influence how minors’ data and attention are managed.
Platforms must address harms to minors (violence, exploitation, narcotics/tobacco/alcohol risks, deceptive financial practices) with policies tailored to size and feasibility.
Safeguards for minors include default protective settings, user- and parent-facing controls, and mandatory parental tools and notices.
An independent third-party audit is required annually, with a detailed report to the FTC and a framework for measuring effectiveness.
A Kids Online Safety Council will advise Congress on risks, best practices, and standards for audits and child safety in online environments.
Section-by-Section Breakdown
Every bill we cover gets an analysis of its key sections.
Definitions
The section defines key terms, including child (under 13), minor (under 17), Commission (FTC), covered platform (a publicly accessible platform with user accounts and features designed to encourage engagement), design feature (mechanisms that increase usage), and other terms like verifiable parental consent and personal information. This creates the scope for obligations and sets boundaries for compliance and enforcement.
Preventing Harm to Minors
Platforms must establish policies addressing threats of physical violence, sexual exploitation and abuse, and the distribution or use of narcotics, tobacco, cannabis, gambling, or alcohol. The policies must be tailored to the platform’s size and feasibility, and the section preserves freedom of expression by excluding requirements that would chill lawful speech.
Safeguards for Minors
This section requires safeguards for known minors, including limiting inter-user communications, reducing compulsive usage through default design features, and providing readily accessible parental tools for privacy, account controls, purchases, and time management. Defaults should be the most protective settings available.
Disclosure
Before registration or purchase by a minor, platforms must disclose policies for safeguards and provide information about accessing parental tools. It also requires parental notice and acceptance, with a COPPA-aligned framework for compliance and a consolidated notice process where feasible.
Audit; Report
Platforms must hire independent third-party auditors to assess how minors access the platform, time spent, safeguards, parental tools, and reporting processes. Audits include descriptions of available tools, how reports are handled, and how personal information of minors is collected or processed. Platforms must submit audit results to the FTC within a defined timeline.
Enforcement
Enforcement mirrors FTC authority for unfair or deceptive acts or practices, with both federal and state options for action. State attorneys general can sue to enjoin violations, enforce compliance, and seek damages, subject to coordination with federal actions.
Kids Online Safety Council
The Secretary of Commerce establishes a council to identify risks, benefits, best practices, and research directions. It must include academics, researchers, parents, educators, platform representatives, and civil liberties experts, and deliver a final report with recommendations within three years.
Rules of Construction
This section preserves existing privacy laws (like COPPA), clarifies it does not expand Section 230, and confirms that compliance tools can be implemented without limiting other lawful activities. It also clarifies that nothing here should prevent platform cooperation with law enforcement or data-security measures.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Minors under 17 gain safer online experiences through default protections and easier access to parental controls.
- Parents and guardians gain visibility and control over privacy settings, purchases, and time spent by their children.
- Academic researchers and public health organizations benefit from council input and standardized best practices to study and mitigate online harms.
- Covered platforms gain a clearer, uniform regulatory framework that can guide product design and compliance strategies.
- State attorneys general and the Federal Trade Commission receive a structured enforcement regime and cooperative mechanisms to protect residents.
Who Bears the Cost
- Platforms must invest in safeguards, parental tools, and annual audits, with costs scaling to platform size and complexity.
- Smaller platforms may face proportionally heavier compliance costs relative to their size.
- Advertisers and partners may face constraints on targeting or content that reaches minors, potentially affecting revenue streams.
- State and federal agencies incur administrative costs to implement, monitor, and enforce the new framework.
- Potential legal exposure and civil penalties for noncompliance create ongoing compliance risks for platform operators.
Key Issues
The Core Tension
The central tension is between the need for strong, standardized protections for minors and the practical costs and potential constraints on legitimate speech and innovation. The bill seeks to impose audit-driven accountability and design-based safeguards, but platforms must implement these tools without stifling user autonomy, privacy, or free expression.
The bill creates a comprehensive safety framework, but it also introduces implementation challenges. Platforms must balance robust safeguards with usability and accessibility, avoiding overbroad restrictions that could chill legitimate content or innovation.
Audits require access to a platform’s data and systems, raising concerns about operational overhead, data governance, and vendor management. The interaction with state laws and existing privacy regimes must be navigated to prevent duplicative or conflicting requirements.
Finally, the Council’s recommendations will take years to translate into firm standards, and Congress may wish to revisit funding and scope as technology and user behavior evolve.