AB 1709 targets social media and other services that deliver personalized, engagement‑maximizing feeds. It defines “addictive feature” and “addictive feed,” makes platforms verify users’ ages under California’s Digital Age Assurance Act, and requires deletion of accounts and associated personal information for users under 16.
The bill also tightly limits how age‑assurance data may be used (only for eligibility checks, retained minimally, and barred from advertising, profiling, or recommendation systems), authorizes the Attorney General to write implementing regulations and to expand which services are covered, and confines enforcement to public prosecutors with civil penalties assessed per affected minor (penalty amounts are left blank in the draft). For compliance officers and platform operators, the measure creates immediate operational decisions about verification methods, data handling, and product design for minors in California.
At a Glance
What It Does
It requires covered platforms to verify users’ ages under the Digital Age Assurance Act and to delete accounts and personal data for anyone under 16. It defines covered platforms by whether they offer an “addictive feed” and bars using age‑assurance data for advertising, profiling, or algorithmic recommendations.
Who It Affects
Social media companies and any website or app that uses personalized recommendation feeds as a significant part of the service (excluding commerce/review sites and primary cloud‑storage feeds). It also affects third‑party age‑verification vendors, advertisers that rely on behavioral targeting, and California’s Attorney General and local prosecutors.
Why It Matters
The bill ties age verification to a state privacy regime, limits downstream uses of verification data, and gives prosecutors sole enforcement authority — creating a mix of technical, privacy, and enforcement obligations that will require product redesigns, new vendor contracts, and regulator guidance.
What This Bill Actually Does
AB 1709 begins by defining the problem it targets: features and feeds designed to maximize engagement in ways that can lead to compulsive use. The bill calls out notifications, endless scrolls, autoplay, and any feature that learns from a user to prolong engagement as “addictive features.” It then describes an “addictive feed” as a sequence of user‑generated media recommended or prioritized for a user based on information associated with that user or their device, but it lists several narrow exceptions — for example, private messages, explicit user requests for specific posts, pure search results that are not persistently associated with a user, or the next item in an existing sequence from the same author.
The operative compliance obligations are straightforward but consequential. Platforms that meet the statute’s definition of “covered platform” must verify the age of users under the state’s Digital Age Assurance Act and delete accounts and all personal information for any user under 16.
The bill ties age verification to California’s emerging age‑assurance rules rather than inventing a new verification regime; it also subjects the verification process to any additional Attorney General regulations adopted to implement the chapter.

To limit secondary harms, the bill restricts how age‑assurance data may be handled: platforms may use it only to determine age eligibility, must retain it only for the minimum time needed to verify age, may not use it for advertising, profiling, or algorithmic recommendations, and must secure it against unauthorized access. The Attorney General may adopt rules to implement these requirements and, if necessary, expand the definition of covered platforms.
Finally, the bill makes the Attorney General, district attorneys, and city attorneys the only parties that may sue to enforce the statute and contemplates civil penalties measured per affected minor; the draft leaves exact penalty amounts blank.
The Five Things You Need to Know
The bill requires covered platforms to verify age under California’s Digital Age Assurance Act and follow any additional Attorney General regulations for that process.
Platforms must delete the account and all personal information for any California resident user who is under 16 years old.
Age‑assurance data may be used only to determine age eligibility, must be retained only for the minimum necessary period, and is explicitly barred from use in advertising, profiling, or algorithmic recommendation systems.
The Attorney General can adopt implementing regulations and expand the definition of “covered platform” to capture services that make addictive features available to minors.
Enforcement is limited to the Attorney General, district attorneys, and city attorneys, with civil penalties assessed per affected minor — the draft leaves the dollar amounts for negligent and knowing violations blank.
Section-by-Section Breakdown
Definitions: "Addictive feature" and "Addictive feed"
This section defines the core risk signals the bill targets. “Addictive feature” is framed as any psychologically exploitative tool — notifications, endless scrolls, autoplay, or features that learn from user behavior to prolong engagement. “Addictive feed” captures personalized streams of user‑generated media where selection or prioritization is based on information associated with the user or device. The statute includes a non‑exhaustive set of exceptions (private messages, explicit user requests, certain search results, next‑in‑sequence content, and legal compliance), which will matter when platforms map product features to compliance obligations.
Covered platform scope, exclusions, and user definition
A “covered platform” is any web or mobile service that offers an addictive feed as a significant part of its service, subject to Attorney General regulation. The bill carves out platforms limited to commercial transactions or consumer reviews and those where the feed’s primary purpose is cloud storage. “User” is limited to natural persons who reside in California — a residence‑based definition that will affect cross‑border account management and geo‑enforcement choices.
Age verification and deletion requirement
This provision mandates age verification in accordance with the Digital Age Assurance Act and authorizes the Attorney General to add rules. Crucially, it requires platforms to delete accounts and any personal information associated with users under 16. That deletion duty is operationally sweeping — it covers associated personal data and will force platforms to design deletion workflows, retention exceptions, and potential recordkeeping policies to comply without violating other legal obligations.
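To make the deletion duty concrete, here is a minimal, purely illustrative Python sketch of the kind of workflow the bill implies. Nothing here comes from the statute: the class names, the `legal_hold` flag, and the minimal deletion log are assumptions about how a platform might reconcile a sweeping deletion mandate with competing retention obligations.

```python
from dataclasses import dataclass, field

@dataclass
class Account:
    user_id: str
    age: int
    personal_data: dict = field(default_factory=dict)
    legal_hold: bool = False  # hypothetical flag: litigation or statutory retention duty

class AccountStore:
    def __init__(self):
        self.accounts: dict[str, Account] = {}
        self.deletion_log: list[str] = []  # minimal record that a deletion occurred

    def add(self, account: Account) -> None:
        self.accounts[account.user_id] = account

    def purge_under_16(self) -> list[str]:
        """Delete accounts and associated personal data for users under 16.

        Accounts under a legal hold are flagged for review instead of deleted,
        since a deletion duty can conflict with other legal obligations.
        """
        flagged = []
        for uid, acct in list(self.accounts.items()):
            if acct.age >= 16:
                continue
            if acct.legal_hold:
                flagged.append(uid)  # escalate rather than auto-delete
                continue
            del self.accounts[uid]       # account and personal data removed together
            self.deletion_log.append(uid)
        return flagged

store = AccountStore()
store.add(Account("a1", age=14, personal_data={"email": "x@example.com"}))
store.add(Account("a2", age=17))
store.add(Account("a3", age=15, legal_hold=True))

held = store.purge_under_16()
print(held)                    # the held minor account, escalated for review
print(sorted(store.accounts))  # remaining accounts after the purge
```

The design choice worth noting is the separation between automatic deletion and escalation: a real implementation would need product, legal, and data-engineering input on exactly which records fall inside “associated personal information.”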
Limits on use, retention, and security of age‑assurance data
The statute constrains how platforms may handle the personal information they collect for age assurance: use only for eligibility determination, retain only for the minimum time necessary, and prohibit using the data for advertising, profiling, or recommendation algorithms. It also requires reasonable security procedures to protect that data. These constraints will affect vendor contracts, logging practices, and how age‑verification metadata flows into analytics and adtech pipelines.
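One way to engineer those constraints is a purpose-limited wrapper around the verification data itself. The sketch below is an assumption-laden illustration, not anything prescribed by the bill: the `ALLOWED_PURPOSE` string, the TTL, and the purge method are all hypothetical names chosen to show how use limits, minimal retention, and access control can be enforced in code rather than by policy alone.

```python
import time

class AgeAssuranceRecord:
    """Purpose-limited holder for age-verification evidence.

    Readable only for the eligibility check, and purged once verification
    completes or a short TTL elapses. Illustrative only.
    """

    ALLOWED_PURPOSE = "age_eligibility"

    def __init__(self, evidence: dict, ttl_seconds: int = 300):
        self._evidence = dict(evidence)
        self._expires_at = time.monotonic() + ttl_seconds

    def read(self, purpose: str) -> dict:
        if purpose != self.ALLOWED_PURPOSE:
            # Blocks downstream uses such as advertising, profiling,
            # or feeding recommendation systems.
            raise PermissionError(f"use for {purpose!r} is prohibited")
        if time.monotonic() > self._expires_at:
            raise LookupError("record expired and was purged")
        return self._evidence

    def purge(self) -> None:
        """Delete the evidence once the eligibility decision is made."""
        self._evidence = {}

record = AgeAssuranceRecord({"birth_year": 2012, "method": "id_check"})
print(record.read("age_eligibility")["birth_year"])  # permitted use succeeds

try:
    record.read("advertising")  # prohibited use is rejected
except PermissionError as err:
    print(err)

record.purge()  # minimal retention: discard after the decision
```

In practice the same idea usually lives in infrastructure (separate data stores, access policies, audit logs) rather than a single class, but the wrapper makes the statutory triad — use limit, retention limit, security — easy to see in one place.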
Attorney General rulemaking and scope adjustments
The Attorney General may adopt regulations in consultation with the e‑Safety Advisory Commission to implement the chapter and may expand or alter what counts as a covered platform if needed to capture services that expose under‑16 users to addictive features. That delegated authority gives the AG substantial flexibility to address emergent product designs, but it also centralizes significant definitional choices in the executive branch rather than in statute.
Enforcement by public prosecutors and civil penalties
Enforcement is limited to civil actions brought by the Attorney General, a district attorney, or a city attorney. The bill contemplates per‑affected‑minor civil penalties for negligent and knowing violations, with the statute instructing courts to weigh platform size, severity and duration of the violation, and good‑faith compliance efforts when setting penalties. The current draft leaves the actual penalty amounts blank, creating a legal uncertainty that regulators or courts would need to fill in later.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- California minors (particularly under‑16 users) — by reducing exposure to personalized, engagement‑maximizing feeds and removing accounts of under‑16 users from covered platforms.
- Parents and guardians — who gain a statutory backstop requiring platforms to verify age and delete under‑16 accounts, supporting parental choice over minors' online exposure.
- Privacy and child‑safety advocates — because the bill limits secondary uses of age‑assurance data (no advertising, profiling, or recommendation), reducing data‑driven targeting of minors.
- Competing platforms that design non‑personalized or non‑addictive product experiences — they may gain a competitive edge if larger platforms restrict access or change feed mechanics.
- State enforcement agencies (AG and local prosecutors) — they gain sole authority to bring civil actions, positioning them centrally in oversight and enforcement.
Who Bears the Cost
- Large social media and recommendation platforms — they must implement or adapt age‑verification systems, design deletion workflows for under‑16 users, and segregate age‑assurance data from advertising and recommendation systems.
- Third‑party age‑verification providers — demand for verification services will rise, triggering new contracts, liability concerns, and compliance burdens to meet state rules tied to the Digital Age Assurance Act.
- Advertisers and adtech firms that rely on behavioral profiles — they lose a data signal (age‑assurance data) for targeting and must adjust audience modeling for California users.
- Small developers and startups with personalized feeds — the threshold “significant part of the service” is vague and may force product pivots or costly compliance even for smaller players.
- State and local prosecutors — enforcement responsibility imposes investigatory and litigation workload, especially given per‑minor penalty framing and the need to interpret new statutory definitions.
Key Issues
The Core Tension
The bill pits two legitimate goals against each other: protecting minors from algorithmic, engagement‑driven harms and protecting minors’ privacy and access to online services. Strong age verification and deletion reduce exposure to addictive feeds but require collecting and processing sensitive verification data and may push platforms to block or cripple features for all users, potentially reducing beneficial uses of online communities for teens.
Implementation Challenges
The bill bundles several hard choices into technical definitions and delegated rulemaking. “Addictive feature” and “addictive feed” are intentionally broad and behavioral — the statute relies on concepts like “foreseeably leads to compulsive use” and “significant part of the service,” which will require regulators or courts to draw bright lines about product design. Platforms will need guidance to decide whether a feature triggers coverage or fits an exception (for example, when a personalized feed becomes a next‑in‑sequence experience).
Without that guidance, companies may overcorrect by disabling features or denying service to avoid enforcement risk.
A second implementation challenge concerns age verification itself. The law ties verification to the Digital Age Assurance Act, but effective and privacy‑respecting age checks at scale are nontrivial.
Many robust verification methods (document checks, face recognition) raise their own privacy, security, and fairness concerns; lighter methods (self‑attestation, device signals) may not be sufficiently reliable. The bill attempts to limit harms by restricting use and retention of age‑assurance data, but platforms will still have to collect and temporarily process sensitive data — creating an attack surface for breaches and a need for strict contractual and technical safeguards.
Finally, the draft leaves the civil‑penalty dollar amounts blank and confines enforcement to public prosecutors. That combination creates two uncertainties: the deterrent effect depends on penalties that are unspecified, and private parties (including affected minors or consumer groups) cannot sue directly.
The Attorney General’s broad rulemaking authority mitigates some ambiguity but concentrates policymaking power and puts the onus on the AG to produce timely, detailed guidance or risk uneven enforcement and over‑compliance by industry.