Codify — Article

California AB 2 creates statutory damages for social media-caused injuries to children

Imposes a per-violation damage formula and treble-damage alternative against large social platforms that breach ordinary care and cause injury to minors, aiming to shift costs and incentives.

The Brief

AB 2 adds Section 1714.02 to the California Civil Code and makes qualifying social media platforms financially liable when their breach of the ordinary‑care duty under Section 1714(a) causes injury to a child. The statute offers plaintiffs the larger of (a) $5,000 per violation up to $1,000,000 per child, or (b) three times the child’s actual damages; it also voids any waiver and treats these remedies as cumulative.

The bill targets platforms defined by Business and Professions Code Section 22675 that generate more than $100 million in annual gross revenue. If enacted as written, AB 2 significantly raises the potential exposure for large platforms and changes the litigation calculus by creating a statutory damages path tied to traditional negligence elements (duty, breach, causation).

The measure aims to shift financial responsibility for child injuries from taxpayers and families back to large platforms, but it raises questions about causation, federal preemption (e.g., Section 230), and how courts will treat “per‑violation” calculations and corporate revenue attribution.

At a Glance

What It Does

Creates a new Civil Code provision (1714.02) that makes qualifying social media platforms liable for statutory damages when they breach the ordinary‑care duty under Section 1714(a) and cause injury to a child, offering a choice between a per‑violation statutory schedule and treble actual damages. Waivers of the statute are void and remedies are cumulative with other laws.

Who It Affects

Large social media platforms that meet the definition in Business & Professions Code §22675 and report over $100 million in annual gross revenue, plaintiffs (children and guardians) bringing tort claims, and insurers and compliance teams for covered platforms. Courts will also be affected as they interpret causation, the term “per violation,” and corporate revenue attribution.

Why It Matters

The bill converts what has been a garden‑variety negligence claim into a potentially high‑stakes statutory‑damages regime for major platforms, raising the economic incentives to redesign features, alter moderation policies, or change how services are offered to minors. It creates new leverage for plaintiffs while leaving open difficult questions about proof of causation and conflicts with federal immunities.


What This Bill Actually Does

AB 2 does not create a new cause of action; instead, it layers a statutory‑damages mechanism on top of California’s existing negligence rule in Civil Code Section 1714(a). To recover under 1714.02 a plaintiff must still show the basic negligence elements—duty of ordinary care, breach of that duty, and that the breach caused injury to a child—but if those elements are met the statute prescribes the monetary remedy.

The statute offers two alternative computations and awards the larger: a flat‑rate approach ($5,000 per violation capped at $1,000,000 per child) or an amount equal to three times the child’s actual damages.
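The election between the two measures can be sketched as simple arithmetic. The function below is a hypothetical illustration (the name, and the assumption that "violations" reduces to a countable integer, are mine, not the statute's — as discussed later, the bill does not define what counts as a single violation):

```python
def ab2_statutory_damages(violations: int, actual_damages: float) -> float:
    """Illustrative computation of the AB 2 damages election.

    Option A: $5,000 per violation, capped at $1,000,000 per child.
    Option B: three times the child's actual damages.
    The statute awards the larger of the two.
    """
    per_violation = min(5_000 * violations, 1_000_000)  # flat-rate schedule with cap
    treble = 3 * actual_damages                         # treble actual damages
    return max(per_violation, treble)

# Example: 150 alleged violations and $400,000 in proven actual damages.
# Option A = min(750,000, 1,000,000) = 750,000; Option B = 1,200,000.
# The award is the larger figure: 1,200,000.
```

Note how the cap bites only on the flat-rate branch: with enough violations, Option A plateaus at $1,000,000 per child, while the treble-damages branch is uncapped, so large proven actual damages will dominate.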

The bill confines its application to “social media platforms” as defined by Business and Professions Code Section 22675 and adds an explicit revenue threshold—platforms that generate more than $100 million per year in gross revenue. It also includes standard implementation clauses: waivers are void as against public policy; the remedies are cumulative (so plaintiffs can pursue other statutory or common‑law relief); the act is severable; and it applies only prospectively, excluding cases pending on or before January 1, 2026.

What the text leaves to courts is substantial.

It ties liability to a breach of Section 1714(a), so litigants will litigate what ordinary care looks like for product design, algorithmic recommendations, age gating, and content‑moderation systems. The statute does not define “per violation,” which invites disputes about whether a violation is a single algorithmic decision, an individual post or message, a continuing course of conduct, or a discrete design choice.

The revenue threshold is likewise silent on whether to measure the platform’s standalone revenues or the parent company’s consolidated figures, a question with major exposure implications.

Practically, the statute pressures large platforms to address features and business practices that plaintiffs can plausibly tie to child harm, because the available damages can dwarf typical negligence awards and increase litigation leverage. Defendants will respond with causation defenses, motions to dismiss on federal preemption or Section 230 grounds, and contests over how violations are counted and revenues attributed.

For compliance officers and product teams, the law—if enacted—forces a reassessment of risk models for features that affect minors and may trigger product changes, insurance renegotiations, or additional recordkeeping to document safety efforts.

The Five Things You Need to Know

1

The statute attaches liability to a showing that a platform breached the existing negligence standard in Civil Code §1714(a) and that the breach caused injury to a child.

2

A qualifying platform faces the larger of (a) $5,000 per violation with a $1,000,000 cap per child, or (b) three times the child’s actual damages.

3

The definition of covered platforms pulls in the Business & Professions Code §22675 definition and adds a bright‑line revenue threshold: more than $100,000,000 in annual gross revenues.

4

Any contractual waiver of the statute is void, and the statute’s remedies are expressly cumulative with other legal remedies or obligations.

5

The law is prospective only and does not apply to cases pending on or before January 1, 2026; it also contains a severability clause.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 1 (Findings)

Legislative findings and policy purpose

The bill opens with findings that restate the general negligence principle in Civil Code §1714 and assert that children are especially vulnerable on large platforms, that some platform features knowingly harm children, and that those harms’ costs are currently borne by parents, schools, and taxpayers. Those findings frame the statute’s remedial purpose—deterring harmful design and shifting costs back to large platforms—but they have no independent legal effect on proof standards; courts will still require traditional causation and breach proofs.

Section 2(a) — New §1714.02(a)

Statutory damages tied to negligence and causation

Subsection (a) is the operative text: it makes a qualifying social media platform liable for statutory damages when it violates §1714(a) and “causes injury” to a child. The statute prescribes two alternative calculations and awards the larger. Practically, this forces plaintiffs to plead negligence and causation, but if they prevail the monetary remedy can be set by statute rather than purely by proof of actual damages.

Section 2(b) — Waiver

Contractual waivers unenforceable

Subsection (b) declares any waiver of the statute void as contrary to public policy. This prevents platforms from using terms of service to disclaim liability or extract exculpatory releases from users to avoid the statute’s damages. However, some procedural mechanisms, most notably mandatory arbitration clauses, may still survive under federal preemption principles, and their enforceability will likely be litigated alongside unconscionability challenges.

Section 2(c) — Definitions

Who counts as a child and which platforms are covered

Subsection (c) defines “child” as a minor under 18 and imports the statutory definition of “social media platform” from Business & Professions Code §22675, adding a revenue cutoff: the platform must generate more than $100 million per year in gross revenues. That language raises implementation questions about whether revenues are platform‑level or consolidated at a corporate group level and how to treat revenue from mixed‑use services that include but are not limited to social functions.

Section 2(d), Section 3, Section 4

Cumulative remedies, severability, and prospective application

Subsection (d) makes the duties and remedies in 1714.02 cumulative—plaintiffs can seek these damages in addition to relief under other laws. Section 3 establishes severability, preserving the remainder if any part is invalidated. Section 4 limits application to claims that arise after January 1, 2026 (cases pending on or before that date are excluded), an explicit temporal boundary that affects ongoing litigation and settlement strategy.


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Children and their guardians — gain an enhanced, statutory damages pathway that increases potential recovery and leverage in settlement negotiations when a platform’s negligence causes injury.
  • Schools and taxpayers — may benefit indirectly if damages and deterrence shift more of the fiscal burden for prevention and treatment from public institutions to covered platforms.
  • Child‑safety and public‑health advocacy groups — receive a stronger enforcement incentive: the statute increases platforms’ financial exposure and could motivate policy or feature changes that align with advocacy goals.
  • Plaintiffs’ attorneys representing minors — stand to see larger recoveries and greater leverage to settle or triage expensive litigation based on predictable statutory formulas.
  • State and local agencies charged with child welfare — may find private enforcement complements regulatory oversight and reduces some fiscal pressure on public services.

Who Bears the Cost

  • Large social media platforms meeting the §22675 definition and $100M revenue threshold — face heightened liability exposure, increased defense costs, potential higher settlement payouts, and pressure to reengineer features or age‑related safeguards.
  • Platform insurers and reinsurers — will likely face higher claims exposure and may increase premiums or narrow coverage for negligent design and moderation failures affecting minors.
  • Product, compliance, and legal teams at covered platforms — will incur compliance and documentation costs, including redesigning algorithms, implementing enhanced age verification, and maintaining records to defend against negligence claims.
  • Courts and the judicial system — may experience an influx of complex fact‑intensive cases about causation, per‑violation calculations, and revenue attribution that increase litigation time and resource demands.
  • Smaller platforms and startups just under the revenue threshold — could bear indirect costs if platforms restructure services, consolidate, or restrict interoperability to avoid triggering the statute, altering competitive dynamics.

Key Issues

The Core Tension

AB 2 confronts a classic policy trade‑off. It aims to protect children by imposing large, predictable financial penalties on major platforms for negligent conduct, thereby incentivizing safer design. But it does so by shifting risk onto platforms in contexts where proving causation is difficult, potentially chilling innovation or moderation choices and inviting federal preemption and Section 230 conflicts. Courts will be left to balance child protection against these doctrinal and constitutional limits.

The bill solves one practical political problem—creating a clear financial stick to deter risky platform features—but it leaves several thorny implementation questions unresolved. First, tying liability to Section 1714(a) preserves traditional negligence proof requirements, so plaintiffs must still prove that a platform’s conduct proximately caused the child’s injury.

Establishing proximate cause between an algorithmic design choice or recommendation system and an individual child’s harm is factually complex and will invite expert battles about counterfactuals and product causation. The statute’s “per violation” language compounds that complexity: is a violation a single recommendation, a series of exposures, an ongoing design choice, or a single manifestly dangerous feature?

Courts will have to choose how to translate that language into per‑unit damages calculations, which directly affects caps and aggregate exposure.

Second, the interplay with federal law is unresolved. The statute targets third‑party content ecosystems and platform design choices; defendants are likely to raise preemption and Section 230 defenses that argue federal law limits state tort liability for interactive computer services.

The bill does not amend or reference federal immunity, so litigants and courts will squarely face whether the California statutory damages regime can coexist with, or is displaced by, federal protections. Finally, the revenue threshold and silence about attribution of corporate revenues create practical disputes: should the $100 million test apply to the platform legal entity, a business unit, or a consolidated parent company?

These definitional gaps matter enormously for exposure and will likely generate early discovery fights and motion practice, delaying substantive rulings on the statute’s merits.
