AB 1883 defines a broad category of "workplace surveillance tools" and forbids employers from using several high‑risk technologies — including emotion‑recognition systems, gait recognition, neural data collection, and most facial recognition — as part of workplace monitoring or automated decisionmaking. The bill also bars employers from using surveillance to identify or profile workers who are engaging in legally protected activity or to infer protected characteristics under California Government Code Section 12940.
Enforcement rests primarily with the Labor Commissioner (including investigatory powers, citations, temporary relief, and civil actions), but public prosecutors and affected workers or their exclusive representatives may also sue. The statute authorizes injunctive relief, punitive damages, attorney fees, and a statutory penalty of up to $500 per employee per violation, while preserving stronger local ordinances.
At a Glance
What It Does
AB 1883 creates statutory definitions for surveillance and biometric categories, then bars employers from deploying tools that infer emotions, identify gait, collect neural data, or use facial recognition except in very narrow device/access contexts. It also prohibits using surveillance to infer protected activity or protected class status.
Who It Affects
All California employers — private and public, including labor contractors and state or local agencies — plus vendors that supply workplace monitoring systems; workers covered include employees and independent contractors. Enforcement implicates the Labor Commissioner, local prosecutors, and civil courts.
Why It Matters
The bill is one of the first California employment statutes to single out emotion‑recognition and neural technologies and to limit facial recognition in the workplace, shifting legal risk onto employers and vendors and creating new compliance obligations and litigation exposure.
What This Bill Actually Does
AB 1883 starts by casting a wide net: it defines "workplace surveillance tools" to include any system that passively collects worker data by means other than a person’s direct observation—video, audio, continuous time‑tracking, geolocation, biometric sensing, and similar technologies. "Worker" is defined broadly to include employees and independent contractors and reaches public employers and their political subdivisions. The bill then supplies targeted technical definitions — emotion recognition, facial recognition, gait recognition, and neural data — so the prohibitions are tied to specific technological capabilities rather than generic privacy concepts.
Substantively, the bill forbids employers from using surveillance tools that (1) violate or prevent compliance with labor, safety, employment, or civil rights laws, (2) identify or profile workers engaged in protected activity, or (3) incorporate any of the enumerated technologies. Facial recognition remains allowable only when used strictly to unlock a locked device or to grant access to locked or secured areas.
The statute also bars employers from using surveillance to infer a worker’s protected status under Government Code Section 12940, which covers categories like race, sex, religion, disability, and other protected classes.

On enforcement, AB 1883 places primary responsibility with the Labor Commissioner, giving that office investigatory powers, authority to issue citations, and the ability to seek temporary relief under existing Labor Code procedures. Public prosecutors may enforce the law as well, and affected workers or their exclusive representatives may sue in civil court for damages — including punitive damages — and seek injunctive relief and attorney fees. Monetary relief includes a statutory penalty of up to $500 per employee per violation, with a carveout: a plaintiff cannot recover both a statutory penalty and a civil penalty for the same violation.

The bill also preserves the ability of cities and counties to enact local ordinances that provide equal or greater protections. Notably absent are affirmative compliance requirements such as mandated audits, transparency disclosures, consent regimes, or data‑retention limits; the statute focuses on categorical bans and enforcement remedies rather than procedural safeguards or certification mechanisms for permissible technologies.
The Five Things You Need to Know
The bill expressly bans employer use of "emotion recognition technology," "gait recognition technology," and collection of "neural data."
Facial recognition is prohibited for most workplace uses but is permitted when strictly limited to unlocking a locked device or granting access to locked or secured areas.
Employers may not use surveillance to identify or profile workers engaged in protected activity or to infer protected characteristics under Government Code Section 12940.
Enforcement is available via the Labor Commissioner (investigations, citations, temporary relief), public prosecutors, and private civil suits that may seek injunctive relief, punitive damages, and attorney fees.
Penalties include up to $500 per employee for each violation, but a plaintiff cannot recover both a statutory penalty and a civil penalty for the same violation.
Section-by-Section Breakdown
Definitions — who and what the law covers
This section creates the statutory vocabulary the rest of the bill uses: a broad definition of "workplace surveillance tool" (covering passive sensing and automated collection), explicit definitions for emotion recognition, facial and gait recognition, neural data, and a broad worker definition that includes independent contractors and public employees. The practical effect is to make the prohibitions technology‑specific while capturing most contemporary forms of automated monitoring used in workplaces.
Prohibitions on specific surveillance practices and inferences
Section 1581 imposes three core limits: employers may not deploy surveillance that (A) prevents compliance with labor, safety, or civil rights law, (B) identifies or profiles workers engaged in protected activity, or (C) incorporates specified technologies (emotion recognition, gait recognition, neural data, and most facial recognition). It also makes it unlawful to infer a worker’s protected status under California’s anti‑discrimination code. The facial recognition carve‑out is narrow and strictly limited to device unlocking and secure‑area access.
Enforcement, remedies, and penalties
Section 1582 assigns enforcement primarily to the Labor Commissioner, authorizing investigations, citations, temporary relief, and civil actions under existing Labor Code procedures. It permits public prosecutors to bring actions and gives individual workers or their representatives a private right of action including damages and punitive damages. The statute sets a penalty of up to $500 per employee per violation, clarifies recovery cannot duplicate statutory and civil penalties for the same violation, and preserves stronger local ordinances.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Frontline workers and gig workers — the bill protects employees and independent contractors from intrusive biometric and affective monitoring and from being profiled for engaging in protected activity.
- Workers asserting protected activity (e.g., organizing, whistleblowing, taking protected leave) — the statute blocks surveillance uses that identify or single out protected actions, reducing the risk of retaliation based on monitored behavior.
- Civil‑liberties and privacy advocates — the law curtails emerging biometric and affective technologies in employment settings that raise systemic discrimination and privacy risks.
- Municipalities with stronger local rules — the preservation of local ordinances benefits cities and counties that already provide greater worker protections because those rules remain enforceable.
Who Bears the Cost
- Employers across sectors (private and public) — must cease or retrofit surveillance programs that incorporate banned technologies, face litigation risk, and may incur costs for alternative security or productivity tools.
- Vendors of emotion‑recognition, gait, and neural‑data systems — these companies will likely lose California workplace customers or need to redesign products to avoid prohibited capabilities.
- Labor contractors and staffing agencies — explicitly included as employers, they bear the same compliance burden and potential per‑employee penalties.
- The Labor Commissioner and local prosecutors — enforcement will require technical expertise, investigative capacity, and resources to pursue sophisticated surveillance vendors and employer systems.
Key Issues
The Core Tension
The central dilemma is balancing worker privacy, safety, and anti‑discrimination protections against legitimate employer needs for security, access control, and productivity measurement. AB 1883 decisively protects workers by banning certain technologies, but it leaves technical definitions ambiguous and omits procedural safeguards, forcing employers, vendors, and enforcers to navigate hard technical and evidentiary questions without clear certification or compliance pathways.
AB 1883 takes a categorical, technology‑centered approach rather than prescribing operational controls. That makes the statute easy to apply to a clear list of high‑risk technologies, but it also creates line‑drawing challenges.
For example, the definition of "neural data" excludes information inferred from nonneural sources, which raises questions about whether certain wearables or analytics that infer stress or cognitive state will fall inside or outside the ban. Similarly, the emotion‑recognition prohibition hinges on whether a supplier’s model is characterized as inferring an emotional state versus classifying surface biometric signals — a technical distinction that will likely be litigated.
The facial recognition exception for device unlocking and secured‑area access is narrowly stated but ambiguous in practice: court‑access systems, shared‑device scenarios, and access control integrations with enterprise directories could all test the limits of "strictly" permitted uses. Enforcement design also creates tradeoffs: the per‑employee per‑violation penalty could produce large damages exposure for employers that deployed systemically across a workforce, but the $500 cap per employee may be modest relative to systemic privacy harms or the costs of remediation.
Finally, the bill does not create affirmative transparency, auditing, or data‑minimization rules (no mandated impact assessments, vendor attestations, or retention limits), so regulators and courts will need to develop implementation practices and discovery approaches to prove violations.