Codify — Article

California bill would limit use and require deletion of neural data from brain–computer interfaces

SB 44 would force companies that make brain–computer interfaces available in California to restrict use to the original purpose and delete neural data once that purpose is fulfilled.

The Brief

SB 44 adds Civil Code §1798.122 and targets neural data generated by brain–computer interfaces (BCIs). The statute defines a covered business as any person who makes a BCI available to someone in California, requires that neural data be used only for the purpose it was collected, and mandates deletion when that purpose is accomplished.

The bill extends the California privacy framework to a new category of highly sensitive data. For companies building or offering BCIs, the statute would force concrete changes to product design, retention policies, contracts with processors and researchers, and operational controls — while leaving several key implementation questions (like what counts as neural data and when a purpose is “accomplished”) unresolved.

At a Glance

What It Does

The bill creates a new Civil Code section that (1) defines a brain–computer interface and a covered business, (2) restricts covered businesses to using neural data only for the purpose it was collected, and (3) requires deletion of neural data once that purpose is accomplished.

Who It Affects

Any entity that 'makes available' a BCI to a person in California — manufacturers, device platforms, app providers, and cloud services that package or market BCIs to Californians. It also affects downstream analytics vendors and research partners that receive neural data from those businesses.

Why It Matters

Neural data is treated as a discrete, highly sensitive category under California law; SB 44 imposes a strict purpose-limitation and an affirmative deletion duty that could reshape product lifecycles, data retention architectures, and contractual relationships across the neurotechnology ecosystem.


What This Bill Actually Does

SB 44 inserts a short, targeted rule into California’s privacy law: if you make a brain–computer interface available to someone in California, you must limit how you use the neural signals it collects and remove those signals once the reason for collecting them is done. The statute accomplishes this by defining two terms — what counts as a BCI and who counts as a covered business — and then imposing two duties: limit use to the stated purpose and delete when that purpose is completed.

Those duties are operational: covered businesses will need to identify and document the specific purpose tied to each data-collection flow, build retention and deletion mechanisms keyed to purpose, and ensure that sharing and vendor relationships do not produce new, incompatible uses. That is a departure from simple notice-and-consent models: the law demands active purpose-mapping and lifecycle controls rather than relying solely on user permission or generalized privacy policies. The text is short and leaves several practical gaps.

It does not define the term "neural data," it sets no timelines or exceptions for backups, research, public safety, or legal holds, and it specifies no technical standards for deletion. Because the statute is framed as furthering the California Privacy Rights Act, enforcement would occur within California’s existing privacy regime, but the statute itself focuses on the substantive duty rather than procedural detail.

For product teams and compliance officers, SB 44 means rethinking data architecture: segregate neural data flows, apply purpose-bound access controls, add deletion triggers and attestations, and revise contracts with processors and researchers to prevent downstream repurposing.
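To make the purpose-mapping and lifecycle controls concrete, here is a minimal sketch of a purpose-bound data store. All names (`NeuralRecord`, `PurposeBoundStore`, the purpose strings) are hypothetical illustrations, not anything drawn from the bill; a real system would also have to reach backups and derived data.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class NeuralRecord:
    """A neural-data record tagged with the purpose it was collected for."""
    record_id: str
    purpose: str  # the documented collection purpose
    payload: bytes
    collected_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


class PurposeBoundStore:
    """Toy store enforcing purpose limitation and purpose-keyed deletion."""

    def __init__(self):
        self._records = {}  # record_id -> NeuralRecord

    def collect(self, record: NeuralRecord) -> None:
        self._records[record.record_id] = record

    def use(self, record_id: str, requested_purpose: str) -> bytes:
        record = self._records[record_id]
        if record.purpose != requested_purpose:
            # SB 44-style purpose limitation: refuse incompatible uses.
            raise PermissionError(
                f"use '{requested_purpose}' falls outside collection "
                f"purpose '{record.purpose}'"
            )
        return record.payload

    def purpose_accomplished(self, purpose: str) -> int:
        """Delete every record tied to a completed purpose; return the count."""
        doomed = [rid for rid, r in self._records.items()
                  if r.purpose == purpose]
        for rid in doomed:
            del self._records[rid]
        return len(doomed)
```

The design choice worth noting: the purpose travels with the record itself, so both the use check and the deletion trigger key off the same documented field rather than a separate policy document that can drift out of sync.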

For researchers and developers, the law creates friction around longitudinal data sets: continuing research that depends on cumulative neural records will require new legal and operational workarounds or explicit exemptions from the covered business's deletion duty.

The Five Things You Need to Know

1

The bill adds Civil Code §1798.122, specifically targeting neural data collected through a brain–computer interface.

2

It defines 'covered business' as any person who makes a brain–computer interface available to a person in California, giving the rule broad reach to device makers, platform operators, and service providers.

3

The statute requires a covered business to use neural data only for the purpose for which it was collected, creating an affirmative purpose-limitation obligation.

4

The statute requires deletion of neural data 'when the purpose for which the neural data was collected is accomplished' but does not provide a timeline, exceptions, or a definition of 'accomplished.'

5

SB 44 does not define 'neural data' in the text, leaving open which raw signals, derived inferences, or processed summaries fall under the law.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 1798.122(a)(1)

Definition — brain–computer interface

This subsection defines a 'brain–computer interface' as a system that enables direct communication and control between a person's brain and an external device. Practically, the definition is functional and broad: it captures invasive and noninvasive systems so long as they provide direct brain-device channels. The broad phrasing creates uncertainty at the margins — for example, whether wearables that infer cognitive state from peripheral signals qualify — so businesses will need to map their products against the statutory language to determine coverage.

Section 1798.122(a)(2)

Definition — covered business

The bill defines a 'covered business' as anyone who 'makes available' a BCI to someone in California. That phrase pulls in manufacturers, distributors, software platforms, and possibly licensors who enable access to BCIs, including remote service providers who market or operate interfaces for Californians. The operative wording focuses on availability rather than traditional domicile or size thresholds, which could bring small or foreign providers into scope if they make BCIs available to Californians.

Section 1798.122(b)

Use limitation tied to collection purpose

This subsection requires covered businesses to use neural data only for the purpose for which it was collected. The mechanics implied are purpose articulation, purpose mapping across systems, and controls to prevent repurposing. The provision forces contractual and technical alignment with processors and downstream recipients: a covered business cannot lawfully permit uses that fall outside the original purpose unless the new use is compatible or the law provides an exception (the text provides none).

Section 1798.122(c)

Deletion when purpose is accomplished

This clause mandates deletion of neural data once the collection purpose is 'accomplished.' It imposes an affirmative lifecycle obligation but leaves the trigger vague: is a purpose accomplished at session end, after a set retention period, or when a user revokes consent? The subsection creates operational requirements (deletion flows, backup handling, attestations) and legal questions about exception handling — for example, whether data necessary for product safety investigations or regulatory compliance may be retained.
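The competing readings of 'accomplished' listed above can be sketched as configurable deletion policies. This is a hypothetical illustration of the interpretive choice a compliance team would face, not anything the statute specifies; the `Trigger` names and `deletion_due` function are invented for this example.

```python
from datetime import datetime, timedelta, timezone
from enum import Enum, auto
from typing import Optional


class Trigger(Enum):
    """Candidate readings of 'purpose accomplished' — the statute picks none."""
    SESSION_END = auto()        # purpose done when the BCI session closes
    RETENTION_ELAPSED = auto()  # purpose done after a set retention period
    CONSENT_REVOKED = auto()    # purpose done when the user revokes consent


def deletion_due(trigger: Trigger, *, session_open: bool,
                 collected_at: datetime, retention: timedelta,
                 consent_revoked: bool,
                 now: Optional[datetime] = None) -> bool:
    """Return True when the chosen policy says the record must be deleted."""
    now = now or datetime.now(timezone.utc)
    if trigger is Trigger.SESSION_END:
        return not session_open
    if trigger is Trigger.RETENTION_ELAPSED:
        return now - collected_at >= retention
    return consent_revoked  # Trigger.CONSENT_REVOKED
```

Until regulators or courts resolve the ambiguity, a covered business effectively has to pick one of these triggers (or a combination) and document why.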

Section 2

Legislative finding — ties to CPRA

The Legislature states the act furthers the California Privacy Rights Act, signaling that the new duties are intended to slot into California’s broader privacy enforcement framework. The finding does not itself create procedural rules, but it indicates enforcement and interpretation will be read through the CPRA/CPPA lens, meaning enforcement, penalties, and regulatory guidance could follow existing agency pathways.


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • California residents using BCIs — they gain a statutory limit on how companies may use and retain their neural signals, increasing control over a uniquely sensitive category of personal information.
  • Privacy and civil-society groups — the law creates a clear legislative stance treating neural data as requiring special handling, strengthening advocacy arguments for tight safeguards.
  • Users seeking ephemeral BCI interactions (e.g., short-term assistive communications) — the deletion mandate supports designs that minimize long-term storage and reduce reidentification risks.

Who Bears the Cost

  • BCI manufacturers and platform operators — they must implement purpose-mapping, build deletion mechanisms (including for backups and derived data), and revise contracts with vendors and researchers, raising engineering and legal costs.
  • Cloud and analytics vendors that process neural data — downstream processors face contractual limits on reuse and may need to implement deletion-on-demand capabilities tied to upstream triggers.
  • Startups and academic researchers relying on longitudinal neural datasets — the deletion requirement may disrupt research designs and product development that depend on cumulative records, forcing new consent models or data governance workarounds.
  • State enforcement bodies and compliance teams — regulators and internal compliance functions will need interpretive guidance and resources to adjudicate disputes about scope, purpose, and deletion triggers.

Key Issues

The Core Tension

The bill pits two legitimate objectives against each other: the need to protect uniquely intimate neural information by limiting use and forcing deletion, versus the need to preserve neural datasets for innovation, safety testing, diagnostics, and research. Tight deletion and purpose rules defend privacy but can fragment datasets essential for improving BCI functions, diagnosing neurological conditions, or training safe models — a trade-off with no easy technical or regulatory fix.

SB 44 takes a clean, minimalist approach: it imposes two bright-line duties but leaves the hard work of interpretation to implementers and regulators. The most immediate ambiguity is the statute's silence on what 'neural data' includes.

Without a textual definition, companies must decide whether to treat raw voltage time-series, feature vectors, behavioral inferences, model weights trained on neural inputs, and aggregated research outputs alike — and those decisions materially affect compliance costs and user privacy.

A second set of challenges concerns the deletion mandate. The statute provides no retention schedule and no exceptions for litigation holds, public-safety needs, product-liability investigations, or bona fide research conducted with appropriate safeguards.

It also does not address technical constraints: how to purge data from immutable ledgers, distributed backups, or models that retain information through weights and inferences. Businesses will need to build deletion workflows and document rationales for any retained data, but without statutory safe harbors or standards they face legal uncertainty.
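A deletion workflow that documents rationales for retained data could look something like the sketch below. The `purge_completed` function, the record schema, and the 'legal hold' exception are all assumptions for illustration; the statute itself provides no such exception or safe harbor.

```python
from datetime import datetime, timezone


def purge_completed(records: dict, completed_purposes: set,
                    legal_holds: set) -> list:
    """Delete records whose collection purpose is done, retaining (and
    documenting) only those under a legal hold. Returns an audit trail
    that can serve as a deletion attestation."""
    audit = []
    for rid in list(records):  # copy keys: we mutate the dict while iterating
        rec = records[rid]
        if rec["purpose"] not in completed_purposes:
            continue
        entry = {"record_id": rid,
                 "at": datetime.now(timezone.utc).isoformat()}
        if rid in legal_holds:
            entry["action"] = "retained"
            entry["rationale"] = "legal hold"  # documented exception
        else:
            del records[rid]
            entry["action"] = "deleted"
        audit.append(entry)
    return audit
```

The audit trail is the point: without statutory standards, a business's best defense is a contemporaneous record of what was deleted, what was kept, and why.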

Finally, SB 44 sits at the intersection of multiple regimes (CPRA, medical confidentiality laws, federal rules) without spelling out precedence or interaction. Neural data collected in clinical contexts may trigger medical confidentiality obligations that differ from the commercial-purpose limitations in this bill, raising coordination questions for health-care providers, device makers, and privacy regulators.

These unresolved implementation issues make compliance both technically and legally complex, and they increase the likelihood that the California Privacy Protection Agency or courts will be asked to fill in the operational details.
