Codify — Article

California bill expands CCPA definitions to add neural data and tighten contractor controls

SB 435 amends the CCPA’s definitions to broaden sensitive personal information, constrain service providers and contractors, and set specific deidentification, consent, and auditing rules that change compliance mechanics for data-driven businesses.

The Brief

SB 435 revises Section 1798.140 of the California Consumer Privacy Act by enlarging the statute’s core definitions. It explicitly adds neural data to the sensitive personal information category, expands the listed forms of biometric information, clarifies what counts as precise geolocation and probabilistic identifiers, and refines when information is “deidentified” or “aggregate.”

The bill also imposes tighter contractual rules on contractors and service providers — including express prohibitions on combining data, certification requirements, and a mandatory ability for businesses to monitor compliance (including audits at least once every 12 months). SB 435 further tightens consent language (disallowing dark patterns) and places specific conditions on research uses, pseudonymization, and transfer exceptions for mergers and acquisitions.

Those changes recalibrate compliance priorities for businesses, vendors, and research entities that handle California residents’ data.

At a Glance

What It Does

The bill updates many CCPA definitions: it adds neural data to sensitive personal information, broadens biometric categories, and defines precise geolocation as data locating a consumer within a circle with a 1,850-foot radius. It also imposes contract terms on contractors and service providers that forbid resale, use outside specified business purposes, and combining data across sources, and it requires compliance certifications and monitoring rights.

Who It Affects

Digital advertisers, ad-tech vendors, platforms that process biometric or sensor data, cloud and analytics service providers, health-tech and wearable companies, and any businesses that rely on combining datasets for profiling or cross-context targeting will see direct impact. Research organizations using commercial datasets and companies engaged in mergers or data transfers must also adjust practices.

Why It Matters

The changes close gaps that previously allowed some reuses, cross-context targeting, and chaining of identifiers; they create concrete operational obligations (contracts, audits, monitoring) and add new categories of especially sensitive data (neural data). For compliance officers, the bill changes both what data must be guarded and how vendor relationships must be documented and monitored.


What This Bill Actually Does

SB 435 rewrites and supplements the CCPA’s definitions section to make two kinds of changes: (1) expand what counts as sensitive or identifying data, and (2) specify controls and contractual mechanics for third parties that touch personal information. On the data side, the bill places neural data squarely into the sensitive personal information bucket and enlarges biometric information to explicitly include behavioral signals such as keystrokes, gait, and sleep or exercise data when those signals can identify someone.

It also defines precise geolocation by a fixed radius (1,850 feet) and recognizes probabilistic identifiers: persistent signals that, more likely than not, can be used to identify a person or device.

On the vendor-and-contract side, the bill tightens the definition of a contractor and service provider by spelling out contract terms that must prohibit resale, retention or use outside specified business purposes, and combining data from multiple clients. Contracts must include a certification from the vendor that it understands and will comply with these restrictions; businesses may require monitoring and must have the ability to perform ongoing reviews, automated scans, and regular technical testing, including audits at least once every 12 months.

If a vendor subcontracts processing, the vendor must notify the business and bind the subcontractor to the same obligations.

SB 435 also raises the bar for lawful research uses and deidentification. To rely on deidentification, a business must take reasonable technical measures to prevent reidentification, publicly commit to keeping the data deidentified, and contractually bind recipients to the same constraints.

Research uses must be compatible with the original business purpose, pseudonymized or deidentified, subject to technical and organizational safeguards that prevent reidentification, and limited to personnel necessary to carry out the research. The bill tightens consent standards, declares that dark patterns don't qualify as consent, and defines specific exceptions for transfers in corporate transactions while requiring prominent notice if a buyer materially changes data uses.

Finally, the bill clarifies how "sell" and "share" operate within the statute: both are broadly defined to capture many forms of disclosure, but the text lists several concrete exceptions (consumer-directed disclosures, mergers/acquisitions when use is consistent, and identifier exchanges for consumers who have opted out).

It also sets out the mechanics for verifiable consumer requests, delegating verification procedures to Attorney General regulations and allowing businesses to refuse requests they cannot verify by commercially reasonable means.

The Five Things You Need to Know

1

SB 435 adds “neural data” — defined as measurements of central or peripheral nervous system activity not inferred from other sources — to the statute’s definition of sensitive personal information.

2

Contractors and service providers must include a certification that they understand and will comply with contract prohibitions on selling, sharing, retaining, or combining personal information, and businesses may monitor compliance via audits or automated scans at least once every 12 months.

3

The bill defines precise geolocation as data that locates a consumer within a circle with a radius of 1,850 feet, treating that level of granularity as sensitive.

4

Deidentified data requires (1) reasonable measures to prevent reidentification, (2) a public commitment to keep the data deidentified, and (3) contractual obligations on recipients prohibiting reidentification and reuse.

5

Research uses must be compatible with the original business purpose, be pseudonymized or deidentified, be subject to technical safeguards that prohibit reidentification except as necessary to support the research, and limit access to personnel necessary to perform the research.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Subdivision (c) — Biometric information

Broadened biometric definition to include behavioral and physiological signals

This part expands biometric information beyond traditional templates (faceprints, fingerprints) to explicitly include gait, keystroke patterns, and sleep, health, or exercise data when those data contain identifying information. Practically, wearable-device telemetry and behavioral signatures used for authentication or profiling now sit clearly inside the biometric bucket and trigger any heightened handling associated with biometric data. Companies that previously treated such sensor data as low-risk will need to reassess classification, consent workflows, and data minimization.

Subdivision (d) — Business

Who counts as a regulated business and how joint ventures and voluntary certification work

The bill retains the CCPA’s existing economic thresholds (>$25M revenue, processing data on 100,000+ consumers/households, or deriving ≥50% revenue from selling data) but clarifies that entities controlled by a business and those sharing common branding are treated as the same business. It also treats joint ventures where each participant has at least 40% interest as separate businesses for most purposes, with a narrow exception on intra-venture disclosures. Finally, it preserves a route for smaller entities to opt in by voluntarily certifying compliance to the California Privacy Protection Agency, which creates a compliance pathway but also new regulatory oversight.

Subdivision (j) & (ag) — Contractor and service provider obligations

Tighter vendor contract rules and chain-of-contract obligations

The statute demands written contracts that bar vendors from selling or sharing data, using it beyond specified business purposes, retaining it outside the business relationship, or combining it with other sources — subject to narrowly defined regulatory exceptions. Contracts must include a certification of compliance and may allow the hiring business to monitor the vendor’s adherence through ongoing reviews, automated scans, and annual technical assessments or audits. If a vendor subcontracts processing, the vendor must notify the business and flow the same contractual terms down the chain, making vendor management and contract clauses central to compliance programs.

Subdivision (m) & (ab) — Deidentified and research rules

Concrete deidentification prerequisites and limited research reuse

To treat data as deidentified, a business must take reasonable technical steps to prevent association with an individual, publicly commit to maintaining the data as deidentified, and bind recipients contractually to the same constraints; reidentification attempts are prohibited except for limited internal testing of deidentification processes. Research that repurposes consumer data must be compatible with the original business purpose, be pseudonymized or deidentified and aggregated, implement technical and organizational safeguards prohibiting reidentification, and restrict access only to necessary personnel — creating operational checks for research teams and institutional review processes.
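The statute requires "reasonable technical measures" but does not prescribe a technique. One common approach that research teams use to satisfy pseudonymization requirements is keyed hashing, sketched below. This is an illustration only, not a method named in SB 435; the function name and key-handling choices are hypothetical, and the key must live in a separate key store, away from the dataset it protects.

```python
import hashlib
import hmac
import secrets

# Illustrative pseudonymization sketch (SB 435 names no specific technique):
# replace a direct identifier with a keyed pseudonym so records stay linkable
# for research without exposing the identifier itself.

def pseudonymize(identifier: str, key: bytes) -> str:
    """Derive a stable, non-reversible pseudonym via HMAC-SHA256.

    Without the key, recovering the identifier requires brute force over the
    input space; rotating the key severs linkage to older datasets, which
    supports the bill's prohibition on reidentification.
    """
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

key = secrets.token_bytes(32)  # stored separately from the pseudonymized data
p1 = pseudonymize("user@example.com", key)
p2 = pseudonymize("user@example.com", key)  # same input, same key -> same pseudonym
```

Because the output is deterministic per key, research datasets keep their join structure; because it is keyed, an attacker with only the dataset cannot run a dictionary attack the way they could against a plain hash.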

Subdivision (ad), (ah), (w) — Sell, share, and precise geolocation

Definitions, exceptions, and a fixed geolocation threshold

SB 435 keeps broad definitions of selling and sharing that capture many nonmonetary transfers but outlines several exceptions: consumer-directed interactions, identifier use for opted-out consumers, and transfers incident to mergers or acquisitions when usage remains consistent. The statute also fixes a specific precision threshold for geolocation (≤1,850-foot radius) to determine sensitivity, which provides a clear technical point for developers and privacy engineers to target when designing location data flows.
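That fixed threshold lends itself to a mechanical check. As a rough sketch (the helper name and grid approach are my own, and it uses spherical-earth approximations a production system would need to validate), coordinates can be snapped to a grid whose cells are too large to fit inside any 1,850-foot-radius circle, so the disclosed value alone cannot qualify as precise geolocation:

```python
import math

RADIUS_FT = 1_850            # SB 435's precision threshold
FEET_PER_DEGREE_LAT = 364_000.0  # approximate feet per degree of latitude

def coarsen(lat: float, lon: float, radius_ft: float = RADIUS_FT) -> tuple:
    """Snap (lat, lon) to the center of a grid cell with side 2 * radius_ft.

    A square with side 2r has diagonal 2*sqrt(2)*r, which exceeds the 2r
    diameter of a radius-r circle, so no 1,850-foot-radius circle can cover
    the whole cell: knowing the cell does not locate the consumer inside one.
    """
    side_lat = (2 * radius_ft) / FEET_PER_DEGREE_LAT
    # Longitude degrees shrink with latitude; scale by cos(latitude).
    feet_per_degree_lon = FEET_PER_DEGREE_LAT * math.cos(math.radians(lat))
    side_lon = (2 * radius_ft) / feet_per_degree_lon

    def snap(value: float, step: float) -> float:
        return round((math.floor(value / step) + 0.5) * step, 6)

    return snap(lat, side_lat), snap(lon, side_lon)

coarse = coarsen(34.052235, -118.243683)  # downtown Los Angeles, coarsened
```

The design choice here is deliberately conservative: reporting a cell center rather than a jittered point makes the disclosed precision auditable, since the grid spacing is a fixed, documented parameter.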

Subdivision (h), (l), (ak), (am) — Consent, dark patterns, and verifiable requests

Higher consent standards, dark pattern ban, and verifiability rules

Consent must be freely given, specific, informed, and unambiguous; the bill explicitly rules out consent via dark patterns and clarifies that passive UI behaviors like hovering or muting are insufficient. It also sets the standard for verifiable consumer requests: businesses must be able to verify a requester using commercially reasonable methods under Attorney General regulations and may refuse to comply with access, deletion, or correction requests they cannot verify. These mechanics affect UI design, authentication flows, and customer-service procedures.


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Consumers — especially those whose neural, biometric, or fine-grained location data might otherwise be collected and repurposed; the bill treats these as sensitive and limits reuse and sharing.
  • Privacy and civil-rights advocates — the addition of neural data and expanded biometric categories closes previously ambiguous gaps and narrows options for cross-context profiling and covert identification.
  • Research institutions that implement strong pseudonymization and governance — the bill preserves research uses but makes the rules explicit, enabling compliant projects with clear technical and process guardrails.
  • Businesses that already maintain robust vendor management and deidentification practices — these firms gain legal clarity and a competitive advantage because their existing contracts and controls already satisfy many new obligations.

Who Bears the Cost

  • Ad-tech companies and data brokers — tighter definitions of share/sell and explicit prohibitions on combining data will constrain cross-context behavioral advertising models and require reengineering of data pipelines.
  • Service providers and contractors — must adopt new contractual certifications, submit to monitoring/audits, and avoid combining client data, increasing operational and legal compliance overhead.
  • Wearable-device and health-tech vendors — expanded biometric and sensitive data categories (sleep, exercise, health telemetry) will trigger higher handling standards and, in some cases, require new consent and data minimization design.
  • Businesses that lack mature deidentification processes — must invest in technical measures, public commitments, and contractual frameworks to rely on deidentification as a compliance strategy, which can be expensive and complex.

Key Issues

The Core Tension

SB 435 pits stronger protections for highly sensitive and hard-to-observe personal data (neural signals, behavioral biometrics, fine-grained location) against increased operational complexity and costs for businesses and vendors. The central dilemma is how to block abusive profiling and covert identification without imposing compliance rules that are technically ambiguous, expensive to operationalize, or likely to unintentionally hinder legitimate research and product innovation.

The bill tightens many definitional levers the CCPA uses to allocate risk and responsibility, but those same levers create implementation friction. For example, adding neural data and broad behavioral biometrics to sensitive categories protects consumers but raises immediate questions about scope: many devices and services generate continuous sensor streams that may or may not be "identifying" depending on downstream processing.

Firms will need to operationalize the line between raw telemetry and identifying biometric templates, and that line is fact-sensitive and technical.

Contractual monitoring and annual audits give businesses tools to police vendors, but they also shift compliance burdens onto both buyers and sellers. Smaller vendors may find the required certifications, monitoring windows, and prohibitions on combining data economically unsustainable, which could shrink vendor choice or push businesses to bring processing in-house.

The deidentification regime depends on "reasonable measures" and public commitments — language that leaves room for dispute and regulatory interpretation and will likely generate litigation and administrative guidance about acceptable technical standards (e.g., what constitutes sufficient pseudonymization or resistance to modern reidentification techniques).
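Reidentification resistance is often assessed in practice with measures like k-anonymity: every combination of quasi-identifiers (ZIP prefix, age band, and the like) must be shared by at least k records. The bill names no such standard, so the sketch below is purely illustrative, and the function names and column choices are hypothetical:

```python
from collections import Counter

# Illustrative k-anonymity check (not a standard mandated by SB 435):
# rare quasi-identifier combinations can single individuals out, so a
# dataset is k-anonymous only if every combination appears >= k times.

def min_group_size(records: list, quasi_ids: list) -> int:
    """Smallest equivalence-class size over the given quasi-identifier columns."""
    counts = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return min(counts.values())

def is_k_anonymous(records: list, quasi_ids: list, k: int = 5) -> bool:
    """True if no quasi-identifier combination is shared by fewer than k records."""
    return min_group_size(records, quasi_ids) >= k

rows = [
    {"zip3": "940", "age_band": "30-39", "dx": "A"},
    {"zip3": "940", "age_band": "30-39", "dx": "B"},
    {"zip3": "941", "age_band": "40-49", "dx": "C"},
]
```

Checks like this are a floor, not a ceiling: modern linkage attacks can defeat k-anonymity, which is precisely why the statute's open-ended "reasonable measures" language is expected to draw regulatory guidance.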
