Codify — Article

California AB 656 requires one‑click account deletion on major social platforms

Mandates a clear “Delete Account” control, limits dark patterns, and ties deletion requests to CCPA rights for large social media platforms.

The Brief

AB 656 forces qualifying social media companies to give users an unmistakable, easily reachable way to delete their accounts and walk them through completing that deletion. The measure also bars interference — including so‑called dark patterns — that make it hard for users to leave.

The goal is twofold: reduce friction that keeps people, especially adolescents, tied to platforms that may contribute to harm, and standardize deletion procedures so consumers can actually exercise control over their data. Compliance will matter for platform product, legal, and privacy teams because the bill imposes specific UI and operational requirements rather than high‑level principles.

At a Glance

What It Does

The bill requires a visible account‑deletion control in a platform’s settings and obliges the platform to present the steps necessary to complete deletion when a user clicks it. It forbids obstructive design — identified as dark patterns — that interferes with deletion.

Who It Affects

The rule applies to social media platforms defined under California law that generate over $100 million in annual gross revenue; that makes the obligation squarely targeted at large, commercially significant platforms. Product, engineering, compliance, and customer support teams will handle implementation.

Why It Matters

AB 656 moves UI design into statute, converting a consumer‑protection concern into a tech compliance issue. For privacy officers it also creates a direct operational intersection with California’s privacy law framework and removes some ambiguity about what a reasonable deletion pathway must look like.

What This Bill Actually Does

AB 656 adds a new Title 25 to the Civil Code focused on social media account cancellation. The operative definitions section imports existing statutory meanings for terms such as “clear and conspicuous,” “dark pattern,” and “personal information,” and limits application to platforms that meet the statute’s revenue screen.

The core requirement sits in Section 3273.91: platforms must place a clear and conspicuous button labeled “Delete Account” as an immediately visible option within the settings menu, and that settings menu must be reachable from any format a user uses to access the platform (application, browser, etc.). When a user clicks that control, the platform must present the steps needed to complete the deletion, including deletion of the user’s personal information. The bill permits platforms to seek verification of the deletion request but constrains the methods: verification must be cost‑effective and easy to use, and acceptable mechanisms include preestablished two‑factor authentication, email, text message, telephone call, or message.

The statute also explicitly forbids obstructive techniques — including those classified as dark patterns under California law — and declares that completing the on‑platform deletion process counts as a request to delete personal information under California’s consumer privacy statutes. Two operational rules remove ambiguity: logging back into an account after initiating deletion does not by itself cancel the deletion request, and contract terms that would waive the statute are void. The act also includes a severability clause, so one struck provision does not automatically sink the rest of the law.
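
A minimal sketch of those operational rules, using hypothetical names (`Account`, `DeletionRequest`) that do not appear in the bill: clicking the control files a request that also counts as a CCPA deletion request, and a later login leaves that request pending.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Illustrative model only; class and field names are assumptions,
# not statutory text.

@dataclass
class DeletionRequest:
    user_id: str
    submitted_at: datetime
    is_ccpa_deletion_request: bool = True  # the statute maps deletion to a CCPA request
    cancelled: bool = False                # only an explicit cancellation flips this

@dataclass
class Account:
    user_id: str
    pending_deletion: Optional[DeletionRequest] = None

    def request_deletion(self) -> DeletionRequest:
        """Clicking 'Delete Account' files a deletion request."""
        self.pending_deletion = DeletionRequest(
            user_id=self.user_id,
            submitted_at=datetime.now(timezone.utc),
        )
        return self.pending_deletion

    def login(self) -> None:
        """A later login, on its own, must NOT cancel a pending request."""
        pass  # deliberately leaves self.pending_deletion untouched

acct = Account(user_id="u1")
req = acct.request_deletion()
acct.login()
assert acct.pending_deletion is req and not req.cancelled
```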

The Five Things You Need to Know

1. The statute applies only to social media platforms that earn more than $100 million in annual gross revenue.

2. The required control must be a button labeled “Delete Account” placed as an immediately visible option in the platform’s settings menu.

3. When the Delete Account control is clicked, the platform must provide the procedural steps necessary to complete deletion and remove the user’s personal information.

4. If the platform verifies deletion requests, verification must be cost‑effective and may use preestablished two‑factor authentication, email, text, telephone call, or message.

5. Logging into an account does not, on its own, revoke a pending deletion request.

Section-by-Section Breakdown


Section 3273.90

Definitions and scope

This section borrows existing statutory definitions for key terms — clear and conspicuous, dark pattern, and personal information — and ties the statute’s reach to the social media platform definition in the Business and Professions Code and a $100 million revenue threshold. For implementers, that means reading the cross‑referenced definitions in Business and Professions Code Section 17601 and Civil Code Section 1798.140 to understand what the terms require.

Section 3273.91(a)

Delete control: label, placement, and accessibility

Subsection (a) specifies the UI mandate: a button with the words “Delete Account” must be immediately visible in the settings menu. The settings menu must be reachable in whatever formats the platform offers (mobile app, browser, etc.). The law moves a UX decision into statutory text, forcing platforms to make a design update rather than relying on voluntary UI conventions.

Section 3273.91(b)

Process when the button is clicked and verification limits

When a user clicks the delete button, the platform must present the steps necessary to complete the deletion, including deleting personal information. Verification is allowed but constrained: it must be cost‑effective and support user convenience. The statute lists acceptable verification channels (preestablished two‑factor authentication, email, text message, telephone call, or message), which narrows the universe of plausible verification flows and limits more burdensome proof requirements.
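
The channel constraint can be pictured as a simple allowlist check. This is a sketch under stated assumptions: the constant and function names are invented for illustration, and the statute itself does not prescribe any enforcement code.

```python
# Allowlist of the verification channels the statute names.
# Identifier strings are assumptions for this sketch, not statutory terms.
ALLOWED_VERIFICATION_METHODS = {
    "preestablished_2fa",
    "email",
    "text_message",
    "telephone_call",
    "message",
}

def is_permissible_verification(method: str) -> bool:
    """A compliant flow would reject verification demands outside the
    statute's list (e.g., a notarized letter or an ID upload)."""
    return method in ALLOWED_VERIFICATION_METHODS

assert is_permissible_verification("email")
assert not is_permissible_verification("id_upload")
```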

Section 3273.91(c)–(e)

Prohibition on obstruction and CCPA interaction

The statute forbids obstructive techniques — explicitly including dark patterns — that interfere with deletion. It also declares that an account‑deletion request under this statute constitutes a deletion request under the California Consumer Privacy Act and must be processed per the CCPA’s requirements. Finally, the text clarifies that a subsequent login alone does not cancel a submitted deletion request.

Section 3

Severability and unenforceable waivers

Section 3 makes the act severable and declares any waiver of its provisions void as contrary to public policy. Operationally, that prevents platforms from sidestepping obligations via contract terms and makes it likely that if one provision is invalidated in litigation, the rest will survive.

Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here’s who stands to gain and who bears the cost.

Who Benefits

  • Users seeking to leave platforms — The bill reduces procedural friction and ensures a clear control is available across access formats, making account termination predictable and achievable.
  • Adolescents and caregivers — By targeting design features that impede exit, the law aims to shorten the pathway out of potentially harmful use patterns that research links to adverse mental‑health outcomes.
  • Privacy advocates and consumer rights organizations — The statute creates an operational remedy that maps to privacy rights, giving advocates a concrete standard to evaluate platform behavior.
  • Compliance and product teams at covered platforms — While not a benefit in a traditional sense, these teams gain a single, statutory specification to implement rather than a moving set of best practices.

Who Bears the Cost

  • Large social media platforms (> $100M revenue) — They must modify UI/UX, back‑end account workflows, verification procedures, and data‑deletion pipelines to comply.
  • Engineering and customer‑support operations — Implementing reliable, cross‑platform deletion and safe verification systems and handling increased deletion requests will require development and staffing investment.
  • Advertisers and data‑driven units inside platforms — Faster and simpler deletion will likely raise churn and reduce the amount of retained personal data available for ad targeting and measurement.
  • Legal teams and regulators — Ambiguities in definitions and the CCPA interaction will produce dispute and interpretation costs, and businesses may face litigation over compliance and alleged dark patterns.

Key Issues

The Core Tension

AB 656 enshrines user autonomy and harm‑reduction into product design by forcing rapid, low‑friction deletion, but that same low‑friction approach raises legitimate operational and legal concerns: how do platforms stop fraud, preserve evidence for investigations, and honor other lawful retention needs while still making deletion simple and immediate?

The statute is precise on UI placement and permissible verification channels but leaves several operational questions unanswered. It does not specify a compliance deadline, record‑keeping obligations, deletion timelines, or penalties for noncompliance, so enforcement will turn on other California authorities and litigation.

The law also assumes platforms can fully delete personal information on demand; in practice, data is often replicated across backups, analytics stores, and third‑party processors, and the bill does not set rules for timing or exceptions for lawful retention (for example, legal holds or law‑enforcement requests).
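That replication problem can be made concrete by fanning a single request out to every store that may hold copies of a user’s data. The store names and task structure below are hypothetical; the bill prescribes neither this architecture nor any completion timeline.

```python
# Sketch: one deletion request becomes a tracked task per data store,
# so compliance teams can see which copies remain. Store names are
# illustrative assumptions, not requirements from the bill.

def fan_out_deletion(user_id: str, stores: list[str]) -> list[dict]:
    """Record one per-store deletion task so completion can be audited."""
    return [
        {"store": store, "user_id": user_id, "status": "queued"}
        for store in stores
    ]

tasks = fan_out_deletion(
    "u1", ["primary_db", "analytics", "backups", "third_party_processor"]
)
assert all(task["status"] == "queued" for task in tasks)
```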

Another implementation tension concerns verification and fraud risk. The bill limits acceptable verification methods to common channels to avoid burdensome hurdles, but those same channels can be weaker against account‑takeover or fraudulent deletion attempts.

Platforms must reconcile the statute’s consumer‑friendly verification standard with internal controls to prevent malicious deletions. Finally, the law relies on cross‑references (to the definitions of dark patterns and personal information) that themselves are subject to interpretation, meaning compliance will require careful legal review of related statutes and any implementing guidance.
