The bill creates three interlocking controls on social media: it forbids platforms from permitting accounts for anyone under 13 and requires deletion (with a 90‑day portability window) of data from terminated child accounts; it prohibits use of personalized recommendation systems for users under 17 except when only minimal, non‑behavioral metadata are used; and it amends the Children’s Internet Protection Act to require schools that receive E‑rate discounts to certify technology protection measures that block student access to social media on supported networks.
Those changes would force product, data‑governance, and procurement shifts across ad‑funded platforms, adtech ecosystems, and school IT operations. Compliance implicates algorithm design, age‑detection practices, FTC enforcement as an unfair or deceptive practice, and new certification and enforcement duties for the FCC around E‑rate funding — all within a one‑year implementation window after enactment for platform obligations and staged deadlines for schools seeking discounted broadband support.
At a Glance
What It Does
The bill bars social media platforms from allowing accounts for individuals under 13 and requires immediate deletion of a terminated child’s personal data (with a 90‑day requestable copy). It also forbids platforms from using personalized recommendation systems based on personal data for users under 17, allowing only a short list of non‑behavioral metadata. Separately, it amends the Children’s Internet Protection Act to condition E‑rate discounts on school certification that technology protection measures block student access to social media on supported networks.
Who It Affects
Consumer‑facing, ad‑supported platforms that collect personal data and host user‑generated content; adtech and personalization vendors that feed recommendation systems; elementary and secondary school districts that apply for E‑rate discounts; and regulators — primarily the Federal Trade Commission and the FCC — plus state attorneys general with parens patriae authority.
Why It Matters
This is a structural intervention into the algorithmic economics that power ad‑supported social platforms: it curtails behavioral personalization for minors and forces immediate data deletion for under‑13s. It also ties school broadband subsidies to active blocking and monitoring policies, creating a procurement and compliance burden for districts and a new enforcement nexus between the FTC and FCC.
What This Bill Actually Does
The bill defines a narrow set of platforms it intends to cover: public‑facing sites, apps, or services that collect personal data, rely primarily on advertising or data sales for revenue, and whose main purpose is to host user‑generated content for viewing or resharing. The definition includes multiple explicit exclusions — enterprise tools, pure cloud storage, email, videogames, and teleconferencing that uses unique links — so many non‑social services fall outside the rule.
Under the core consumer protections, platforms may not permit accounts for anyone the operator knows is under 13 and must terminate any existing accounts the platform knows belong to children. After termination, platforms must immediately delete personal data collected from the child, but they must also provide, if requested within 90 days, a readable and machine‑portable copy of that data.
Platforms may retain only the minimal record necessary to show they terminated the account.

For older minors, the bill removes a central algorithmic lever: platforms cannot use personal data in personalized recommendation systems to display content to users the platform knows are under 17. The statute narrowly allows personalization that relies solely on non‑behavioral fields — device type, language, city/town, the fact the user is a child or teen, or the user's age — but bars behavioral signals and inferred interests.
The bill also clarifies that explicit searches and chronological displays are not treated as banned personalization.

On implementation mechanics, the bill sets a knowledge standard under which liability reaches operators with actual knowledge or knowledge fairly implied by objective circumstances; the FTC or state attorneys general must rely on competent, reliable evidence and consider the totality of circumstances. Platforms are not required to build age‑verification systems or to collect age data beyond what they already collect in the normal course of business; however, if they voluntarily collect age data to comply, they must limit use and retention of that data to the compliance purpose.

Enforcement is dual: the Federal Trade Commission will treat violations as unfair or deceptive acts or practices and enforce the statute under its existing authorities, while state attorneys general may bring parens patriae actions but must notify the FTC and allow intervention.
Separately, the bill updates the Children’s Internet Protection Act so that schools seeking E‑rate discounts must certify they operate technology protection measures that block student access to social media on supported networks, submit copies of internet safety policies to the FCC, and meet short deadlines or risk losing discounts — though limited waivers and staged compliance windows are available for procurement constraints.
The Five Things You Need to Know
Platforms must terminate accounts they know belong to children under 13 and immediately delete the child's personal data — while allowing that child, within 90 days, to request a human‑readable and machine‑portable copy of their data.
The ban on personalized recommendation systems covers individuals under 17; the only allowed inputs for any permitted recommendations are device type, language, city/town, the fact the user is a child or teen, or the user's age.
The bill defines 'knows' as actual knowledge or knowledge fairly implied from objective circumstances and requires regulators to rely on competent, reliable evidence when making that determination.
The Federal Trade Commission enforces the statute by treating violations as unfair or deceptive acts or practices under the FTC Act, while state attorneys general may sue parens patriae after notifying the FTC and providing the complaint.
Schools seeking E‑rate (section 254(h)) discounts must certify — on a short initial certification deadline, with a two‑year build‑out window and limited waivers — that they operate and monitor technology protection measures blocking student access to social media on supported devices and networks.
Section-by-Section Breakdown
Definitions: who and what the law covers
This section sets the operational scope by defining 'personalized recommendation system', 'child' (<13), 'teen' (13‑16), 'user', and — critically — 'social media platform.' The platform definition is four‑pronged (public‑facing, collects personal data, primarily ad/data revenue, and primarily a user‑generated content forum) and includes a long list of exclusions (email, ISP service, teleconferencing via unique links, cloud storage, videogames, certain educational non‑commercial services, messaging not integrated into a platform). Practically, that means many ad‑supported social products fall squarely inside the law, while enterprise tools and purely educational systems can be carved out, but platforms will need to map their features to each prong to determine coverage.
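To make that mapping concrete, here is a minimal sketch of how a coverage review might encode the four prongs and the exclusions. The field names and category labels are assumptions for illustration, not statutory terms.

```python
from dataclasses import dataclass

# Exclusions named in the bill's definition (paraphrased labels; assumptions).
EXCLUDED = {"enterprise_tool", "cloud_storage", "email", "videogame",
            "teleconferencing_unique_links", "educational_noncommercial",
            "standalone_messaging", "isp_service"}

@dataclass
class ServiceProfile:
    """Illustrative attributes a coverage review might collect per product."""
    is_public_facing: bool
    collects_personal_data: bool
    primary_revenue_is_ads_or_data: bool
    primary_purpose_is_ugc_forum: bool
    exclusion_categories: set[str]  # e.g., {"email", "cloud_storage"}

def is_covered_platform(svc: ServiceProfile) -> bool:
    """All four prongs must be met and no statutory exclusion may apply."""
    if svc.exclusion_categories & EXCLUDED:
        return False
    return (svc.is_public_facing
            and svc.collects_personal_data
            and svc.primary_revenue_is_ads_or_data
            and svc.primary_purpose_is_ugc_forum)
```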
No accounts for children under 13; deletion and portability
Platforms may not permit account creation or maintenance for anyone the operator knows is under 13 and must terminate accounts they know belong to children. Upon termination, platforms must delete all personal data collected from the user immediately, but must also provide, if requested within 90 days, a copy of that data in both human‑readable and portable machine‑readable formats. The provision permits platforms to keep only a minimal record of termination to demonstrate compliance. For operators, the clause creates a trade‑off between conservative age‑blocking and avoiding wrongful account terminations; it also raises practical questions about identity verification, false positives, and how operator logs satisfy the 'minimal record' carve‑out.
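The bill leaves the sequencing open, but one plausible termination flow is sketched below. It assumes an export snapshot is taken at termination and held solely for the 90‑day portability window; the data structures are illustrative, not a prescribed implementation.

```python
import json
from datetime import datetime, timedelta, timezone

PORTABILITY_WINDOW = timedelta(days=90)

def terminate_child_account(account_id: str, datastore: dict) -> dict:
    """Terminate an account known to belong to a child and delete its data.

    Sketch only: `datastore` stands in for the real persistence layer, and
    snapshotting at termination (so a 90-day copy request can still be
    honored) is an implementation assumption."""
    personal_data = datastore.pop(account_id, {})  # delete from live systems
    snapshot = {
        "human_readable": json.dumps(personal_data, indent=2),
        "machine_portable": json.dumps(personal_data),
        "expires_at": datetime.now(timezone.utc) + PORTABILITY_WINDOW,
    }
    # Minimal record retained solely to demonstrate the termination occurred.
    tombstone = {"account_id": account_id,
                 "terminated_at": datetime.now(timezone.utc).isoformat()}
    return {"snapshot": snapshot, "tombstone": tombstone}

def fulfill_copy_request(snapshot: dict) -> tuple[str, str] | None:
    """Return the readable and portable copies if the request is in-window."""
    if datetime.now(timezone.utc) <= snapshot["expires_at"]:
        return snapshot["human_readable"], snapshot["machine_portable"]
    return None  # window lapsed; the snapshot should itself be purged
```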
Ban on behavioral personalization for under‑17s, limited metadata exception
This section bars platforms from using personal data in personalized recommendation systems to display content to users the operator knows are under 17, while permitting systems that rely only on five narrow data elements (device type, language, city/town, whether the user is a child or teen, and the user's age). The bill explicitly preserves the ability to return search results a minor requests and to show chronologically ordered content a minor has subscribed to. For algorithm teams, this will likely require plumbing changes to ensure personalized ranking models exclude behavioral signals for accounts flagged as minors; adtech firms that feed behavioral features will face gating or suppression logic.
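One plausible shape for that gating logic, assuming a feature dictionary feeds the ranker (the key names here are hypothetical, not terms from the bill):

```python
# The five inputs the bill permits for minors' recommendations.
ALLOWED_MINOR_FEATURES = {"device_type", "language", "city",
                          "is_child_or_teen", "age"}

def gate_features(features: dict, user_is_known_minor: bool) -> dict:
    """Drop behavioral and inferred signals for accounts flagged as minors."""
    if not user_is_known_minor:
        return features
    return {k: v for k, v in features.items() if k in ALLOWED_MINOR_FEATURES}

# Example: watch history and inferred interests are stripped for a minor.
raw = {"device_type": "mobile", "language": "en", "city": "Austin",
       "age": 15, "watch_history_embedding": [0.2, 0.9],
       "inferred_interests": ["gaming"]}
assert set(gate_features(raw, user_is_known_minor=True)) == {
    "device_type", "language", "city", "age"}
```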
Knowledge standard, privacy limits on voluntary age collection
Regulators are instructed to determine 'knowledge' based on actual knowledge or knowledge fairly implied from objective circumstances, using competent and reliable evidence and the totality of the circumstances. Importantly, the statute prohibits compelling platforms to implement age‑verification or to collect age data beyond normal business needs. If a platform voluntarily collects age data to comply, it must restrict that data’s use exclusively to compliance and retain it only as long as minimally necessary. That combination is aimed at limiting privacy harms from broad age‑verification systems while still allowing enforcement where the operator could reasonably be expected to know the user’s age.
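A minimal sketch of how voluntarily collected age data might be tagged with its sole permitted purpose and checked against a retention deadline follows; the schema and any concrete retention period are implementation assumptions, not statutory requirements.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class AgeRecord:
    """Age data collected voluntarily for compliance only."""
    user_id: str
    age: int
    purpose: str = "statutory_compliance"  # sole permitted use
    collected_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

    def readable_for(self, requested_purpose: str) -> bool:
        # Any read outside the compliance purpose is refused.
        return requested_purpose == self.purpose

    def expired(self, max_retention: timedelta) -> bool:
        # Retained only as long as minimally necessary; the concrete
        # retention period is a policy choice, not set by the bill.
        return datetime.now(timezone.utc) - self.collected_at > max_retention
```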
Enforcement by FTC and State AGs; limited preemption
Violations are treated as unfair or deceptive acts or practices under the FTC Act, giving the FTC its full enforcement toolkit. State attorneys general may file parens patriae suits on behalf of residents but must provide the FTC notice and a copy of the complaint; the FTC may intervene. The bill preserves other FTC authorities and preempts conflicting state law only where conflict exists, while permitting states to adopt stronger protections. For compliance teams, expect coordination demands if both the FTC and state AGs bring actions and an emphasis on recordkeeping and demonstrable evidence about what the operator 'knew.'
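One way such recordkeeping might be structured is an append‑only, hash‑chained log of age‑related signals. The schema below is hypothetical; chaining each entry's hash to the previous one is simply one technique for making the record tamper‑evident if regulators ask what the operator 'knew' and when.

```python
import hashlib
import json
from datetime import datetime, timezone

def log_age_signal(log: list, user_id: str, signal: str, detail: str) -> None:
    """Append a record of an objective circumstance bearing on a user's age."""
    prev = log[-1]["entry_hash"] if log else ""
    entry = {"user_id": user_id, "signal": signal, "detail": detail,
             "at": datetime.now(timezone.utc).isoformat(), "prev_hash": prev}
    # Hash over the entry's contents plus the previous hash: altering any
    # earlier entry breaks the chain.
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)

audit_log: list = []
log_age_signal(audit_log, "u123", "self_declared_birthdate", "2011-04-02")
log_age_signal(audit_log, "u123", "parental_report", "support ticket")
```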
CIPA update and E‑rate conditioning: schools must block social media
Title II amends the Children’s Internet Protection Act to explicitly treat social media platforms in the CIPA framework and ties E‑rate discounts to school certifications that they enforce technology protection measures blocking student access to social media on supported services, devices, and networks. Schools must submit certifications and copies of internet safety policies to the FCC, which must open rulemaking within 120 days. The statute provides a staged timeline: initial certification deadlines in the first applicable E‑rate funding year (with 120 days to certify), a two‑year build‑out window for schools without measures in place, and a limited waiver process for procurement barriers—failure to certify or comply can lead to loss or repayment of discounted funds.
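At the network layer, a technology protection measure could be as simple as a resolver‑level blocklist; the sketch below uses placeholder domains and is illustrative only, since districts typically deploy commercial filtering appliances or resolver policies rather than custom code.

```python
# District-maintained blocklist of social media domains (placeholders).
BLOCKED_SOCIAL_DOMAINS = {"example-social.com", "example-video.net"}

def resolve_allowed(hostname: str) -> bool:
    """Return False for any hostname on, or under, a blocked domain."""
    parts = hostname.lower().split(".")
    # Check the hostname and every parent domain against the blocklist.
    candidates = {".".join(parts[i:]) for i in range(len(parts))}
    return not (candidates & BLOCKED_SOCIAL_DOMAINS)

assert resolve_allowed("district-lms.example.org")
assert not resolve_allowed("cdn.example-social.com")  # subdomain also blocked
```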
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Children under 13 — the bill removes platform accounts for this group and requires deletion of their personal data, limiting direct data collection and exposure on ad‑driven social platforms.
- Teens (13–16) — they gain protection from algorithmically personalized recommendation feeds driven by behavioral profiling, reducing exposure to algorithmic amplification of potentially harmful content.
- Parents and privacy advocates — clearer statutory limits on data use for minors, data deletion obligations, and the portability window improve transparency and control over children’s online footprints.
- Schools and school communities — while imposing requirements, the law clarifies expectations around blocking social media on school networks and creates a public database of internet safety policies, giving districts a common compliance framework.
- Non‑ad‑supported platforms and subscription services — firms that don’t rely on behavioral ad targeting face fewer competitive disruptions from the restrictions and may find a relative advantage.
Who Bears the Cost
- Ad‑supported social media platforms and adtech vendors — they must redesign recommendation architectures, suppress behavioral signals for minors, add logging to show compliance, and potentially lose ad revenue derived from targeting minors.
- School districts and IT departments — districts must procure, configure, and monitor technology protection measures, revise internet safety policies, submit certifications to the FCC, and manage procurement timelines; underfunded districts risk losing E‑rate discounts if they can’t comply.
- Compliance, privacy, and engineering teams at platforms — implementation will require policy, product, and technical work across identity, data retention, algorithm, and legal teams, creating material operational costs.
- Researchers and educators using personalized delivery — bans on personalization may limit certain adaptive educational or research applications that rely on behavioral signals, requiring alternative approaches.
- State attorneys general and the FTC — enforcing subjective standards like 'knowledge fairly implied' and auditing algorithmic behavior will demand investigative and technical resources.
Key Issues
The Core Tension
The bill balances two legitimate goals that pull in opposite directions: protecting minors from data collection and opaque algorithmic influence, and avoiding heavy compliance and operational burdens that can lead to overblocking, lost educational functionality, and displaced user behavior. Resolving that tension requires choices about verification, acceptable technical complexity, and how far regulation should reshape the economic model that funds many social platforms.
The bill confronts a difficult implementation problem: age is inherently hard to determine online. Because the statute stops short of requiring universal age verification, platforms will face a practical choice between conservative account removal (risking wrongful exclusion of teens) or permissive allowances that raise enforcement exposure.
That trade‑off risks both under‑inclusion (platforms failing to block minors) and over‑inclusion (blocking older users or removing content erroneously). The portability and deletion rules mitigate some harms but create operational workstreams — verification, export formatting, data deletion audits — that platforms must build quickly.
On the algorithm side, prohibiting recommendation systems for under‑17s while preserving allowed metadata forces algorithmic segmentation and new data governance boundaries. Platforms may respond by moving personalization out of the product (e.g., pushing discovery to direct search, off‑platform messaging, or third‑party apps), or by altering product definitions to avoid the 'social media platform' label.
Tying school broadband subsidies to active blocking shifts compliance costs to districts; underfunded schools may struggle with procurement cycles or the capability to monitor and certify filtering tools, potentially widening digital equity gaps.

Finally, enforcement requires technical proofs: regulators will need access to model inputs, logs, and feature pipelines to show prohibited personalization occurred or that operators 'knew' a user's age, a fact‑intensive undertaking that raises evidentiary and transparency questions.