
Kids Online Safety Act (S. 1748) — platform duties, audits, and algorithm transparency

Creates a duty of care for platforms toward minors, mandates parental safeguards and annual third‑party audits for large social platforms, and forces algorithm transparency and opt-out options.

The Brief

The Kids Online Safety Act imposes broad safety and transparency obligations on online platforms, online video games, messaging apps, and streaming services that are used or reasonably likely to be used by minors. The bill creates a statutory duty of care requiring platforms to anticipate specific harms to minors (including eating disorders, compulsive use, self‑harm, sexual exploitation, substance use, and financial harms) and to design and implement safeguards and parental tools to prevent those harms.

The bill also forces disclosure and accountability: platforms that predominantly host user‑generated content and average over 10 million monthly U.S. users must commission independent, third‑party audits and publish annual transparency reports; all covered platforms must provide clear notices about personalized recommendation systems, opt‑outs for minors, and easy switches to less personalized algorithmic views. Enforcement is through the Federal Trade Commission (against unfair or deceptive practices) with concurrent State attorney‑general authority; the Act also requires FTC guidance, an interagency age‑verification study, and establishes a Kids Online Safety Council to advise Congress.

At a Glance

What It Does

The bill creates a statutory duty of care for covered platforms to prevent enumerated harms to minors and requires readily accessible safeguards and parental tools, with default protective settings for users known to be children. It requires disclosure about recommendation systems and, for large social platforms, annual third‑party audits and public reports based on those audits.

Who It Affects

Covered platforms include social media, online video games, messaging apps, and video streaming services that are used or reasonably likely to be used by minors; certain narrow services (email, enterprise B2B tools, government .gov sites, schools, and some conferencing services) are excluded. Platforms averaging over 10 million monthly U.S. active users and that host user‑generated content face the most extensive audit and reporting duties.

Why It Matters

This bill changes the compliance landscape for digital platforms: it shifts responsibility from voluntary safety practices to statutory obligations, ties transparency to algorithmic ranking systems, creates specific default settings for minors, and brings FTC enforcement power and state enforcement mechanisms to bear — all of which raise operational, privacy, and business model implications for platforms and their vendors.


What This Bill Actually Does

The bill defines a covered platform broadly to capture social networks, community forums, online games with user content, messaging apps tied to platforms, and user‑facing video streaming services. It sets two age thresholds: a "child" (under 13) and a "minor" (under 17), triggering different duties.

The core legal duty requires platforms to exercise "reasonable care" in creating or implementing any design feature that a reasonable person would conclude foreseeably contributes to specific harms to minors (e.g., eating disorders, clinically diagnosable depression tied to compulsive use, self‑harm, sexual exploitation, substance use, severe harassment, and financial harms). That duty is framed around foreseeability and an objective knowledge standard.

To operationalize the duty, the bill mandates concrete safeguards: platforms must give users they know are minors easy controls to limit who can contact them, restrict public access to their personal data, turn off or limit design features that encourage prolonged engagement (infinite scroll, autoplay, rewards), control personalized recommendation systems (prominent opt‑outs and category limits), and restrict geolocation sharing. For children (under 13), parental tools must be provided and enabled by default, including the ability to view and control privacy settings, restrict purchases, and monitor or limit total time spent.
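
To make the defaults concrete, here is a minimal sketch of how a platform might key account safeguards to the bill's two age categories; the types, field names, and function below are hypothetical illustrations, not anything the bill itself prescribes.

```typescript
// Hypothetical sketch: selecting default account safeguards by age category.
// Names and fields are illustrative only; the bill does not specify a schema.

type AgeCategory = "child" | "minor" | "adult"; // child: under 13, minor: under 17

interface AccountSafeguards {
  whoCanContact: "accepted_connections" | "everyone";
  profilePublic: boolean;
  autoplay: boolean;
  infiniteScroll: boolean;
  personalizedRecommendations: boolean;
  geolocationSharing: boolean;
  parentalToolsEnabled: boolean;
}

// Most protective settings by default for users known to be children; minors get the
// same protective defaults but can adjust them through readily accessible controls.
function defaultSafeguardsFor(age: AgeCategory): AccountSafeguards {
  const protective: AccountSafeguards = {
    whoCanContact: "accepted_connections",
    profilePublic: false,
    autoplay: false,
    infiniteScroll: false,
    personalizedRecommendations: false,
    geolocationSharing: false,
    parentalToolsEnabled: age === "child", // enabled by default only for known children
  };
  if (age === "adult") {
    return {
      ...protective,
      whoCanContact: "everyone",
      profilePublic: true,
      autoplay: true,
      infiniteScroll: true,
      personalizedRecommendations: true,
    };
  }
  return protective;
}
```

The design point the bill turns on is that the protective branch is the starting state for users known to be children, with parental tools (rather than the child's own settings) controlling how far it can be relaxed.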

The bill explicitly bans dark‑pattern interfaces that subvert those safeguards.

Large platforms (those averaging over 10 million monthly U.S. active users and providing user‑generated community forums) must commission an independent, third‑party audit at least annually and publish a public transparency report based on that audit. Reports must include counts and engagement measures for users known to be minors, descriptions of commercial interests directed at minors, evaluations of safeguards and parental tools (including non‑English language performance), and plans to mitigate identified risks.

The bill also prohibits product and market research on children and permits such research on minors (ages 13–16) only with verifiable parental consent. Finally, Title II requires platforms that use "opaque" algorithms to provide clear notice and let users switch to an "input‑transparent" algorithmic view (one that does not use inferred or historic device or user data), with a prohibition on differential pricing for users who choose the transparent option.

Enforcement is through the FTC (violations are treated as unfair or deceptive acts) and parallel State attorney‑general suits; the Act also requires FTC guidance, directs an age‑verification study, and establishes a Kids Online Safety Council to advise policymakers.

The Five Things You Need to Know

1

The bill establishes a statutory duty of care: platforms must prevent a listed set of harms to minors (eating disorders, substance use, clinically verifiable depression/anxiety tied to compulsive use, compulsive‑use patterns, severe harassment, sexual exploitation, and certain financial harms).

2

For users known to be children (under 13), platforms must enable parental tools by default that let parents view and control privacy settings, restrict purchases, and limit total time spent; verifiable parental consent is required in connection with parental tool activation notices and for certain market research on minors.

3

Platforms that predominantly host user‑generated content and averaged over 10 million monthly U.S. active users in the most recent calendar year must conduct independent third‑party audits at least annually and publish public transparency reports based on those audits (de‑identified and aggregated).

4

The bill bans covered platforms from conducting market or product‑focused research on children; research on minors (under 17) is permitted only with verifiable parental consent, and advertising for narcotics, alcohol, tobacco, cannabis, or gambling may not be targeted to users known to be minors.

5

Title II (Filter Bubble Transparency) makes it unlawful to operate an online platform using an opaque algorithm unless the platform (a) gives clear notice that user‑specific data drives recommendations, (b) discloses salient inputs and optimization goals in terms and conditions, and (c) allows users to switch easily to an input‑transparent algorithmic view; differential pricing for choosing the transparent view is prohibited.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 102

Duty of care for foreseeable harms to minors

This section creates the statutory duty: a covered platform "shall exercise reasonable care" when a design feature foreseeably contributes to enumerated harms to minors. The list is specific — mental‑health conditions tied to compulsive use, sexual exploitation, substance or gambling promotion, severe harassment, and fraud‑type financial harms — which frames compliance around risk assessment of product features and their foreseeable effects on young users. The statutory knowledge standard is objective: "actual knowledge or knowledge fairly implied on the basis of objective circumstances."

Section 103

Safeguards and parental tools; defaults and dark‑pattern ban

Section 103 lays out the operational controls platforms must offer: user‑facing safeguards for minors (contact limits, data‑visibility restrictions, limits or defaults on engagement‑driving features, opt‑outs and category limits for recommendation systems, geolocation restrictions), and parental tools that let parents view and control settings and impose time and purchase limits. It requires the most protective default for known children and forbids "dark patterns" designed to subvert safety controls. Practical implications include changes to onboarding flows, account defaults, UI/UX design, and purchase flows in games and apps.

Section 104

Disclosure and consent requirements

This section requires clear notices before registration or purchase for users known to be minors, consolidated compliance with COPPA where applicable, and verifiable parental consent for children. Platforms must disclose how personalized recommendation systems operate in terms and conditions, and label advertising and endorsements clearly — guidance from the FTC is authorized to define acceptable formats. These are the information‑law levers the bill uses to make platform practices auditable and understandable to parents and users.

Section 105

Transparency reports and third‑party audits for large platforms

Large platforms averaging over 10 million monthly U.S. active users must commission independent audits and issue annual public reports with a "reasonable level of assurance." Required content includes counts of users known to be minors, mean/median time metrics, language breakdowns, reports received via the platform's reporting channel, assessments of safeguards, and mitigation plans. Auditors must consult youth experts, use aggregate data, consider recommendation systems, and assess non‑English language efficacy — increasing the compliance footprint for global platforms with U.S. audiences.
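
As a rough illustration of the aggregated, de‑identified data such a report might compile, consider the following hypothetical shape; the interface and field names are illustrative, not drawn from the bill text.

```typescript
// Hypothetical sketch of the aggregated, de-identified metrics a Section 105-style
// transparency report might collect. All names here are illustrative only.

interface MinorUsageMetrics {
  knownMinorAccounts: number;   // count of users known to be minors
  meanDailyMinutes: number;     // mean time spent per known-minor account
  medianDailyMinutes: number;   // median time spent per known-minor account
}

interface TransparencyReport {
  reportingYear: number;
  metricsByLanguage: Record<string, MinorUsageMetrics>; // e.g. { en: ..., es: ... }
  userReportsReceived: number;  // reports received through the platform's reporting channel
  safeguardAssessment: string;  // auditor's evaluation of safeguards and parental tools
  mitigationPlan: string;       // plan to address risks identified by the audit
}
```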

Sections 106–109

Research restrictions, studies, guidance, and enforcement

Section 106 bans market/product research on children and requires parental consent for research on minors; Section 107 directs a Commerce/FCC/FTC study on device‑level age verification feasibility and privacy tradeoffs. Section 108 instructs the FTC to publish guidance (including on the knowledge standard) within 18 months, and Section 109 makes violations enforceable as unfair or deceptive practices by the FTC while preserving FTC authority and giving State attorneys general concurrent civil actions to enjoin violations or seek relief. Remedies align with FTC statute-based powers and include civil enforcement by States (with notice and intervention rules).

Title II (Section 202)

Filter‑bubble transparency and user control over opaque algorithms

Title II defines "opaque" versus "input‑transparent" algorithmic ranking systems, limits input‑transparent modes to data the user expressly provided (excluding inferred or historic user and device data), and requires platforms that use opaque algorithms to give one‑time clear notice, disclose salient inputs and optimization objectives in terms and conditions, and provide an easy toggle allowing users to switch to an input‑transparent algorithm. The provision also bars charging users different prices for choosing transparency and preserves platforms' user‑directed blocking features.
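
The distinction Title II draws can be pictured as two ranking paths: one allowed to lean on inferred user history, and one limited to expressly provided inputs plus neutral signals such as recency. The sketch below is a hypothetical illustration; the function and field names are not taken from the bill.

```typescript
// Hypothetical contrast between an "opaque" ranking (may use inferred user data)
// and an "input-transparent" ranking (only expressly provided inputs and recency).

interface Item {
  id: string;
  topics: string[];
  postedAt: number;            // epoch milliseconds
  fromFollowedAccount: boolean;
}

interface ExpressInputs { followedOnly: boolean; }          // a choice the user made explicitly
interface InferredProfile { predictedInterests: string[]; } // derived from historical behavior

// Opaque mode: scoring leans on interests inferred from the user's past activity.
function rankOpaque(items: Item[], inferred: InferredProfile): Item[] {
  const score = (i: Item) =>
    i.topics.filter(t => inferred.predictedInterests.includes(t)).length;
  return [...items].sort((a, b) => score(b) - score(a));
}

// Input-transparent mode: no inferred or historic data, just the user's express
// choice (followed accounts only?) and reverse-chronological order.
function rankInputTransparent(items: Item[], express: ExpressInputs): Item[] {
  return items
    .filter(i => !express.followedOnly || i.fromFollowedAccount)
    .sort((a, b) => b.postedAt - a.postedAt);
}
```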


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Children and adolescents: the bill forces default protective settings for users known to be children and creates mechanisms (time limits, visibility restrictions, opt‑outs for recommendations) intended to reduce compulsive usage and exposure to harmful content.
  • Parents and guardians: explicit, enabled‑by‑default parental tools, verifiable consent requirements for children, and clear notices about recommendation systems give parents more direct control and visibility into minors’ platform use.
  • Youth advocates, public‑health and research organizations: mandatory audits, public reports disaggregated by language, and a Kids Online Safety Council produce structured data and an institutional forum for translating findings into policy and practice.

Who Bears the Cost

  • Large social platforms and game publishers: compliance obligations (UI changes, default settings, parental tool development), commissioning annual independent audits, publishing transparency reports, and potential litigation exposure under FTC and State enforcement. The >10M‑user threshold concentrates reporting costs on big platforms.
  • Smaller platforms and app developers: while many narrow exceptions exist, platforms "reasonably likely to be used by minors" must adopt safeguards; compliance engineering, legal review, and potential integration with third‑party parental tool vendors will impose costs and product changes.
  • Advertising and market‑research vendors: the prohibition on market/product research on children and the requirement for verifiable parental consent for minors tighten permissible targeting and raise compliance obligations for adtech firms and data processors.

Key Issues

The Core Tension

The central dilemma is protecting minors from foreseeable online harms while avoiding new privacy invasions, excessive compliance burdens, and collateral restrictions on speech and product innovation. Strengthening defaults and limits can reduce exposure to harm, but doing so often requires collecting or inferring more user data, altering engagement models that finance free services, or imposing costly audits — each of which can undermine privacy, competition, or access in different ways.

The bill ties platform obligations to what a "reasonable and prudent person" would have foreseen and to a knowledge standard that includes "knowledge fairly implied on the basis of objective circumstances" — language that will force platforms and regulators to litigate or define operational markers for when an account should be treated as a minor's. That ambiguity creates both enforcement leverage and compliance uncertainty: platforms can be penalized for failing to act on strong inferences of age, but collecting additional age signals to reduce uncertainty raises privacy and security risks.

Age verification and audits create privacy trade‑offs. The Act explicitly orders a study into device‑level age verification and directs auditors to access platform data, yet also requires de‑identified, aggregated reporting.

Implementing robust age checks or audits risks concentrating sensitive identity data with new intermediaries (age‑verification providers, auditors), increasing attack surfaces and the possibility that privacy harms could follow from privacy‑mitigation efforts. Finally, the bill’s requirement to publish aggregate metrics and auditor assessments raises questions about how to disclose useful transparency without revealing trade secrets or enabling gaming of platform defenses.
