SB 771 makes qualifying social media platforms subject to civil penalties when their algorithms or coordinated conduct facilitate violations of California’s civil‑rights laws prohibiting violence, intimidation, and coercion (Civil Code §§ 51.7, 51.9, 52, 52.1). The bill targets platforms that generate more than $100 million in annual gross revenue and allows a prevailing plaintiff to recover penalties scaled to the defendant’s culpability.
The measure treats algorithmic distribution as an act by the platform and establishes a presumption that platforms have actual knowledge of how their algorithms operate. Penalties run up to $1 million for intentional violations and $500,000 for reckless conduct, with courts authorized to double awards when the victim is a minor.
The statute becomes operative January 1, 2027, and declares waivers void and provisions severable.
At a Glance
What It Does
SB 771 adds a private‑enforcement civil‑penalty cause of action to the Civil Code for social media platforms whose algorithms, aiding or abetting, or concerted actions bring about conduct that violates specified civil‑rights statutes. It treats algorithmic relay as an act of the platform and presumes the platform has knowledge of its own algorithms.
Who It Affects
The law applies to platforms defined under Business and Professions Code §22675 that earn over $100 million annually — in practice, U.S. and multinational social networks and major platform subsidiaries. Plaintiffs alleging violations of §§51.7, 51.9, 52, or 52.1 would be able to seek civil penalties in state court.
Why It Matters
The bill shifts liability focus from individual users to platform design and distribution decisions: algorithms that amplify or target harmful content can trigger direct platform penalties. For legal and compliance teams the statute creates new exposure tied to product design, documentation, and discovery of algorithmic systems.
What This Bill Actually Does
SB 771 creates a statutory path for private plaintiffs to seek civil penalties from large social media platforms when platform conduct — including algorithmic delivery — contributes to civil‑rights violations listed in the Civil Code. The law does not amend the underlying civil‑rights provisions; instead it adds a supplemental remedy: a court‑awarded penalty, beyond other available relief, intended to deter future violations.
The statute applies only to platforms meeting the bill’s size threshold; it borrows the platform definition in the Business and Professions Code and adds a $100 million gross‑revenue floor. Liability is triggered in several ways: when the platform’s algorithm “relays” content in a way that effects prohibited conduct, when the platform aids or abets or conspires in such conduct, or when it is a joint tortfeasor.
The bill distinguishes mental states for penalty sizing (intentional or knowing versus reckless) and authorizes a court to increase penalties when the plaintiff is a minor.

Two drafting choices stand out and will shape litigation and compliance. First, the bill explicitly treats algorithmic distribution as an act of the platform independent of the user message: that frames causation around design and deployment choices rather than only individual speaker intent.
Second, the statute contains a built‑in evidentiary presumption that a platform has “actual knowledge” of how its algorithms operate, which shifts the practical burden onto platforms to explain and defend algorithmic systems during litigation. Those features will drive discovery fights over source code, logs, models, and internal decision‑making.

Procedurally, the remedy is a civil penalty awarded to a prevailing plaintiff; the bill does not create a new private right to damages per se, nor does it specify criminal sanctions.
The law also contains conventional severability and anti‑waiver clauses and is slated to take effect January 1, 2027, giving platforms and litigators time to prepare for a new category of state civil exposure tied to product behavior and algorithmic outcomes.
The Five Things You Need to Know
The bill applies only to social media platforms as described in Business & Professions Code §22675 that generate more than $100,000,000 in annual gross revenues.
It creates civil‑penalty liability for platforms whose algorithms or coordinated actions cause violations of Civil Code §§51.7, 51.9, 52, or 52.1 — covering violence, intimidation, and coercion directed at protected classes.
Penalties are tiered by culpability: up to $1,000,000 for intentional/knowing violations and up to $500,000 for reckless violations.
The statute treats algorithmic content delivery as an act by the platform and presumes the platform has actual knowledge of how its algorithms operate; courts may double penalties if the plaintiff was a minor.
The law becomes operative January 1, 2027, and contains severability and a provision declaring any waiver of its protections void.
Section-by-Section Breakdown
Findings and purpose: extend civil‑rights protections to digital platforms
This prefatory section summarizes legislative findings about the rise in online hate and frames the bill as a tool to ensure civil‑rights protections apply in the digital sphere. Practically, those findings are not operative law but indicate legislative intent: courts will read the statute as aimed at platform conduct that contributes to targeted threats and harassment, especially against historically marginalized groups. The section signals the Legislature’s motivation, which can guide statutory interpretation when courts weigh ambiguous terms like “aids” or “acts in concert.”
Scope: which platforms are covered
This section imports the Business & Professions Code’s definition of a social media platform and adds a $100 million annual gross‑revenue threshold. The combined test narrows the statute to large, commercially significant platforms and excludes smaller services and most niche forums. For compliance teams this creates an immediate revenue‑based bright line for coverage, but organizations near the threshold will need revenue tracking and legal review to determine applicability.
Substantive liability, standards, and penalties
This is the operative provision: it establishes that platforms are liable — in addition to any other remedies — to a prevailing plaintiff for civil penalties when they violate specified civil‑rights statutes through algorithms, aiding/abetting, concerted action, conspiracy, or joint tortfeasorship. The section sets penalty caps tied to mental state, authorizes doubling when the plaintiff is a minor, and explicitly allows courts to consider algorithm deployment an independent act. Practically, plaintiffs will plead algorithmic amplification or targeting as part of causation and rely on the knowledge presumption to require platforms to disclose internal algorithmic mechanics and decision logs in discovery.
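For readers modeling exposure, the penalty tiering described above reduces to a simple decision rule. The sketch below is purely illustrative (the statute sets court-awarded maxima, not fixed amounts, and the function name and structure are this article's invention, not anything in the bill text):

```python
def penalty_cap(mental_state: str, plaintiff_is_minor: bool) -> int:
    """Illustrative model of SB 771's penalty-cap tiers as described above.

    Returns the maximum penalty a court could award under the described
    tiers; actual awards are discretionary up to these caps.
    """
    caps = {
        "intentional": 1_000_000,  # up to $1M for intentional/knowing violations
        "reckless": 500_000,       # up to $500K for reckless violations
    }
    cap = caps[mental_state]
    # Courts may double the award when the plaintiff is a minor.
    if plaintiff_is_minor:
        cap *= 2
    return cap

print(penalty_cap("reckless", plaintiff_is_minor=True))  # 1000000
```

Note that under this reading, a reckless violation against a minor carries the same ceiling as an undoubled intentional violation, which may shape how plaintiffs plead mental state.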
Operative date, severability, and anti‑waiver rule
The statute takes effect January 1, 2027, giving covered platforms time to assess exposure. The severability clause means invalidation of any part won’t necessarily sink the remainder. The anti‑waiver clause bars private contractual waivers of these rights as against public policy, which limits platforms’ ability to shift risk to users through terms of service. That clause will figure in any defense relying on user agreements, and it reduces reliance on private contracting as a compliance lever.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Members of protected classes targeted with threats or intimidation: The statute gives individuals who suffer violence‑or‑intimidation based on protected characteristics a path to seek civil penalties from platforms that materially contributed to the harm.
- Minors who are victims: Courts may double penalties when plaintiffs are minors, creating stronger deterrence where youth are targeted and signaling legislative prioritization of child safety online.
- Civil‑rights organizations and impact litigators: Groups that bring representative or test litigation gain a new statutory tool to press platform accountability and to obtain penalties that are expressly designed to deter future conduct.
Who Bears the Cost
- Large social media platforms (>$100M revenue): Companies meeting the revenue and platform definition face new civil exposure, potential multi‑million dollar penalties, increased discovery obligations, and the need for compliance programs documenting algorithmic decisions.
- Platform engineering and compliance teams: The presumption of platform knowledge and likely discovery burdens mean engineers and product managers will invest time and resources to produce documentation, explanations, and possibly alter models and ranking systems.
- State courts and judiciary: The statute invites technically complex civil litigation about algorithms, causation, and intent, increasing demands on court resources and expert testimony, and producing potentially prolonged discovery disputes over trade secrets.
Key Issues
The Core Tension
The central dilemma is straightforward. The Legislature aims to protect vulnerable Californians by treating platform distribution choices as actionable conduct, but holding platforms civilly liable for algorithmic outcomes risks imposing ambiguous duties, heavy discovery and compliance costs, and conflicts with protections that historically insulated platforms from responsibility for user speech. The trade‑off is between stronger deterrence and the practical and legal burdens of policing algorithmic information flow.
SB 771 leans heavily on two legal and practical levers: treating algorithmic delivery as a platform act and presuming a platform’s actual knowledge of its algorithms. Those choices simplify plaintiffs’ pleading burden but transfer a substantial evidentiary and compliance load onto platforms.
Expect contested discovery over model code, training data, logs, and internal decision‑making; platforms will push trade‑secret and proprietary‑information defenses, triggering protective orders and possibly special procedures for handling sensitive technical evidence.
Another unresolved implementation problem is causation. The statute allows liability for algorithms that “relay” content resulting in prohibited conduct, but proving the link from an abstract ranking signal to real‑world intimidation or violence will be fact‑intensive.
Plaintiffs will need to combine technical logs, user targeting data, and contemporaneous platform communications to show that algorithmic choices materially contributed to the harm. The bill also creates a strong incentive for plaintiffs to seek penalties rather than traditional damages, potentially encouraging filings aimed at corporate deterrence rather than individual remediation.
Finally, while the bill specifies state civil penalties and an anti‑waiver rule, it is silent on interaction with federal doctrines (e.g., platform immunities) and does not create new criminal liability — gaps that will be litigated and may affect enforceability.