SB 836 amends the Children’s Online Privacy Protection Act to widen the statute’s scope and tighten operational rules for online services that interact with minors. It explicitly defines “teen” (ages 13–16), expands the set of data considered personal information, bans individual‑specific advertising to children and teens (with narrow exceptions), requires verifiable consent from teens as well as parents in many contexts, and creates deletion, notice, and cross‑border transfer limits.
The bill matters because it reshapes how operators, ad tech firms, mobile app stores, and ed‑tech vendors can collect, use, and retain minor‑linked data. Compliance will require product, legal, and ad‑serving changes (and likely new contracts with schools), while the FTC and GAO receive new reporting and study obligations intended to surface enforcement gaps and fintech risks for teens.
At a Glance
What It Does
The bill revises COPPA definitions, adds a ‘teen’ category (13–16), expands the reach of covered ‘operators’ to online and mobile applications and connected devices, and designates a broad set of identifiers and biometric signals as personal information. It prohibits collecting, using, disclosing, or retaining children’s and teens’ personal data for individual‑specific advertising and requires verifiable consent for material departures from disclosed uses.
Who It Affects
Consumer‑facing app developers, mobile platforms, ad networks, measurement vendors, ed‑tech providers that contract with schools, and connected‑device manufacturers. State attorneys general and the FTC will oversee enforcement and must produce new annual and multiyear reports and guidance.
Why It Matters
The measure closes technical and age‑range gaps in current law, limits a principal revenue model for many free apps (individualized ads to minors), and creates affirmative rights for teens to review and delete data. It also creates regulatory and contractual pathways for schools to authorize certain data flows—shifting compliance burdens onto vendors and platform intermediaries.
What This Bill Actually Does
SB 836 rewrites key COPPA building blocks. It replaces and expands several statutory definitions so that 'operator' covers commercial websites, online services, online applications, mobile applications, and services delivered via connected devices.
It enlarges the statute’s concept of 'personal information' to include persistent identifiers (IP, device IDs), geolocation at the street/city level, biometrics, and media files containing a specific child or teen, while carving narrow operational exclusions for ephemeral audio files used only to fulfill a verbal request.
On substance, the bill makes it unlawful for an operator directed to children—or one that reasonably should know a user is a child or teen—to collect or use personal information in ways that violate newly prescribed regulations. The most consequential operational prohibition is a ban on 'individual‑specific advertising to children or teens,' defined to cover profiling and targeting based on identifiers or grouped characteristics, with limited carveouts for contextual ads, certain measurement activities, and age‑appropriate advertising that relies only on whether a user is under 17.

SB 836 also creates new consent and control mechanics.
It updates 'verifiable consent' standards, extends rights to teens (13–16) to get disclosures, delete data, and challenge inaccuracies, and permits schools to enter written agreements with operators so that verifiable consent need not be obtained where the contract limits use to educational purposes and includes review/deletion mechanisms. The FTC must study whether a common verifiable consent mechanism is feasible and report its findings to Congress; if feasible, the agency will issue implementing regulations.

Finally, the bill adds practical limits: operators must notify parents or teens before storing or transferring minor data outside the U.S., they cannot retain minor data longer than reasonably necessary to fulfill the transaction (absent legal exceptions), and the FTC must publish periodic enforcement and oversight reports.
The statute preserves state authority to adopt stronger protections, requires the FTC to issue guidance about when an operator 'fairly should know' a user is a minor, and demands analysis of regulatory impacts on small entities.
The Five Things You Need to Know
The bill defines 'teen' as anyone who has attained age 13 and is under age 17 (i.e., ages 13–16) and extends many COPPA rights to that group.
It prohibits operators from collecting, using, disclosing, or maintaining personal information of children or teens for 'individual‑specific advertising'—targeting that relies on identifiers, profiling, or cohort attributes—while preserving contextual ad and certain measurement exceptions.
Operators must provide parents (for children) or teens (for teens) with clear notice, the method by which data was collected, deletion rights, a mechanism to challenge inaccuracies, and a path to refuse further collection or maintenance of personal data.
Educational agencies may authorize operators via written agreements that limit data use to educational purposes and give schools the ability to review, prevent further collection, and delete student data, allowing narrow exemptions from verifiable consent requirements.
The FTC must assess a 'common verifiable consent' system and issue guidance on when an operator 'fairly should know' a user is a minor; the GAO must study fintech products’ privacy and teen mental‑health risks and report to Congress.
Section-by-Section Breakdown
Expands covered operators, personal data, and adds 'teen'
This section broadens who counts as an 'operator' to include commercial websites, online services, online applications, mobile applications, and services on connected devices, and it excludes only nonprofit organizations that fall under FTC Act exemptions. It dramatically widens 'personal information' to cover persistent identifiers (cookies, IP addresses, device IDs), geolocation at street/city level, biometric identifiers, and media files with a child's or teen's image/voice. Critically, it introduces 'teen' (ages 13–16) as a statutory class, and it defines 'individual‑specific advertising to children or teens' with express exclusions for contextual ads and certain measurement activity.
Bans individualized targeting, mandates notice, consent, deletion rights
This core amendment makes it unlawful for operators directed to minors—or that have actual knowledge or knowledge fairly implied from objective facts—to collect or use minors’ data in ways that violate regulations. The bill forbids individual‑specific advertising to minors (with limited carveouts), requires verifiable consent before materially changing uses, and gives teens direct rights to disclosures, deletion, correction, and obtaining copies of data. It limits retention to the period reasonably necessary to fulfill a transaction or service and requires pre‑transfer notice when data is stored or moved outside the U.S.
Permits school agreements as narrow alternative to parental consent
The bill authorizes the FTC to allow operators acting under written agreements with educational agencies or institutions to proceed without verifiable consent if the contract restricts data collection/use to educational purposes, gives the school review/deletion tools, and identifies an authorized school representative. Mechanically, this creates a contract‑based compliance route for ed‑tech vendors but ties legal safe harbor to stringent contractual limits and operational transparency to parents and students.
Safe harbors unchanged in scope, but FTC-required submissions may be published
Safe harbor programs remain part of the regime, with the text clarifying that safe harbor submissions and related documentation the FTC requires may be published online—subject to existing FTC disclosure limits. Practically, reviewers in approved programs will need to consider the expanded definitions and teen protections when certifying compliance, and operators should expect more public visibility into safe‑harbor documentation.
Guidance and analytic obligations for enforcement and small business impacts
The bill instructs the FTC to issue guidance within 180 days explaining how to determine when an operator has knowledge 'fairly implied on the basis of objective circumstances' that a user is a child or teen, while explicitly saying agencies cannot force age‑gating. It also requires regulatory flexibility analysis describing impacts on small entities and clarifies which federal banking and other agencies exercise oversight for specific institutions, allocating administrative responsibilities for enforcement.
FTC oversight reports and a GAO study on teen fintech privacy and mental health
The FTC must provide an oversight report within three years evaluating platform processes for ensuring apps directed to children comply with COPPA and relevant unfair‑practices rules, and must deliver annual enforcement reports detailing investigations, actions, and complaints. The GAO (Comptroller General) must study teens’ use of fintech products—expanded to include mental‑health risks—and recommend whether current law adequately protects teens, with a one‑year reporting deadline.
Who Benefits and Who Bears the Cost
Who Benefits
- Children and teens (ages 13–16): gain expanded statutory protections—explicit inclusion of teens, stronger notice, deletion and correction rights, and protection from individualized ad targeting.
- Parents and guardians: receive clearer, standardized disclosures and new mechanisms to require deletion or block further collection for children, increasing oversight of operators and ed‑tech vendors.
- Educational agencies and schools: obtain a contract path to authorize educational data processing under strict limits, giving schools an affirmative compliance tool to control vendors used in classrooms.
- Privacy‑focused developers and vendors: those that already avoid profiling and keep short retention windows will face fewer business model disruptions and may gain competitive advantage as market preferences shift toward privacy.
Who Bears the Cost
- Ad tech firms and programmatic advertisers: must revisit targeting stacks for minors, remove or segregate identifier‑based targeting for under‑17 users, and adapt measurement techniques to fit allowed exceptions—potentially reducing ad revenue tied to minor audiences.
- App developers and platforms (including small developers): face new compliance obligations (expanded notices, deletion interfaces, cross‑border notices) and uncertainty about when an operator 'fairly should know' a user is a minor, increasing legal and engineering costs.
- Ed‑tech vendors: must accept stricter contractual requirements when operating under school agreements, implement review and deletion tooling, and potentially lose access to advertising revenue unless they limit use strictly to educational purposes.
- FTC and state attorneys general: shoulder expanded reporting, guidance, and enforcement responsibilities; the bill also requires regulatory flexibility analysis, adding administrative workload and litigation risk tied to novel definitions.
Key Issues
The Core Tension
The central dilemma is protecting minors’ privacy and agency without imposing unworkable verification requirements or destroying legitimate educational and content‑service models. The bill seeks to eliminate profiling and targeted ads for kids while preserving contextual advertising and school‑authorized educational data flows, forcing a choice between robust privacy protections that disrupt ad‑funded services and looser rules that leave profiling and data retention largely intact.
SB 836 tightens protections but leaves several practical questions unresolved. The statute introduces the legal standard 'knowledge fairly implied on the basis of objective circumstances' and directs the FTC to issue guidance, but it explicitly prohibits mandating age‑gating or active age collection.
That creates an implementation tension: how should operators balance avoiding intrusive verification with the need to demonstrate they did not 'fairly' know a user was a minor? The guidance will inform enforcement risk, but until it exists operators will face uncertainty about acceptable practices for UIs, app‑store metadata, or features that signal intended audience.
The ban on 'individual‑specific advertising' is broad but contains exceptions for contextual ads, measurement, and household devices when ads target an adult profile. Translating that definition into compliant ad stacks is technically complex: ad exchanges and measurement vendors must segregate signals, redesign fingerprinting practices, and ensure cookies/device IDs tied to minors are not re‑used for profiling.
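To make the gating logic concrete, here is a minimal Python sketch of how an ad server might filter requests for under‑17 users. It is purely illustrative: the field names, targeting categories, and the assumption that an operator already has a reliable minor‑status signal are all hypothetical, not terms or mechanisms prescribed by the bill.

```python
from dataclasses import dataclass

@dataclass
class AdRequest:
    # Hypothetical fields; real signals and category names would differ.
    user_is_under_17: bool   # operator's best-available audience signal
    targeting_type: str      # "individual", "contextual", or "measurement"

def is_request_allowed(req: AdRequest) -> bool:
    """Illustrative gate: block identifier-based ("individual-specific")
    targeting for minors, while letting contextual ads and permitted
    measurement activity pass through."""
    if not req.user_is_under_17:
        # Adults are unaffected by this particular prohibition.
        return True
    # For under-17 users, only the bill's carveouts pass.
    return req.targeting_type in {"contextual", "measurement"}
```

The hard engineering problem the text describes is upstream of a check like this: ensuring that the `user_is_under_17` signal exists and that minor‑linked identifiers never leak into profiling pipelines in the first place.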
Similarly, the bill requires deletion rights and caps retention to what is 'reasonably necessary'—a fact‑intensive standard that will force operators to map data flows, draft retention justification, and implement deletion tooling, while preserving narrow exceptions for law enforcement, fraud prevention, or legal retention obligations.
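The retention logic above can be sketched in a few lines of Python. This is an assumption‑laden illustration, not a compliance implementation: what counts as "reasonably necessary" is a fact‑intensive legal judgment, and the fixed `retention_window` parameter here simply stands in for whatever period an operator has documented and justified.

```python
from datetime import datetime, timedelta

def must_delete(collected_at: datetime,
                now: datetime,
                retention_window: timedelta,
                legal_hold: bool = False) -> bool:
    """Illustrative retention check: flag minor-linked records that have
    outlived the documented retention period, unless a legal exception
    (e.g. a law-enforcement or fraud-prevention hold) applies."""
    if legal_hold:
        # Bill-preserved exceptions override the default deletion rule.
        return False
    return now - collected_at > retention_window
```

In practice this check would run against a data map built per the bill's requirements, with the retention justification for each data category recorded alongside the window itself.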
Finally, the educational‑agreement carveout gives schools an operational way to authorize data collection for pedagogical uses, but it shifts responsibility to contract terms and school representatives. That dynamic may benefit larger districts with procurement capacity while imposing a compliance burden on smaller districts and vendors.
Requiring the FTC to analyze a common consent mechanism and small‑entity impacts helps, but does not eliminate short‑term ambiguity or the potential for uneven enforcement across states that choose to adopt stronger rules.