Codify — Article

California SB 947 defines ‘automated decision systems’ and key terms for employment use

A single section of definitions broadly scopes which employers, workers, and technologies fall under future ADS rules — and leaves several questions about coverage and enforcement.

The Brief

SB 947’s text (Section 1520) provides a set of definitions that will determine what counts as an “automated decision system” (ADS) in the employment context and who is treated as an employer or worker. The measure defines ADS broadly to include machine learning, statistical modeling, data analytics, and AI that issue scores, classifications, or recommendations that materially impact people, while explicitly excluding basic infrastructure tools like firewalls and spam filters.

Those definitional choices matter because they shape the reach of any substantive obligations that follow elsewhere in the bill or future regulations. The employer definition is unusually broad — it expressly folds in state and many local public entities, school districts, contractors, and higher-education institutions (with a caveat about the University of California) — and the worker and worker-data definitions extend coverage to independent contractors and any information that “identifies, relates to, or describes” a worker.

In short: the section maps the battlefield for automated hiring, evaluation, and surveillance tools, even though it does not itself impose duties or penalties.

At a Glance

What It Does

Section 1520 supplies precise definitions for terms that determine ADS coverage in employment settings: what counts as an ADS, what outputs are covered, who is an employer, who is a worker, and what constitutes worker data. It also lists express exclusions for basic cybersecurity and infrastructure tools.

Who It Affects

The language targets any entity that controls employment terms — including private employers, labor contractors, state and local agencies, school districts, and certain public universities — plus vendors of HR technologies whose outputs materially affect individuals. It also treats independent contractors as workers and treats virtually any worker-related information as worker data.

Why It Matters

Definitions establish the legal perimeter: a broad ADS definition and sweeping worker-data language will pull many HR analytics and recruiting tools under scrutiny, while the exclusions and undefined phrases like “materially impacts” create uncertainty about borderline systems. Compliance teams, HR-tech vendors, and public employers should treat this section as the baseline for future obligations.


What This Bill Actually Does

SB 947’s Section 1520 is a foundational, definitional section: it does not itself require audits, disclosure, or mitigation, but it sets the terms that will control what systems and actors fall within whatever substantive obligations the legislature imposes. The bill treats an ADS as any computational process from machine learning, statistical modeling, data analytics, or AI that issues simplified outputs — scores, recommendations, classifications — used to assist or replace human decisionmaking and that “materially impacts” natural persons.

That “materially impacts” trigger is the gatekeeper that will decide whether a tool used in hiring or evaluation is an ADS for legal purposes.

The definition package is deliberately expansive in some places. “Worker” covers not only employees but independent contractors who provide services, which means gig workers or contractors monitored or scored by platforms may be brought into scope. “Worker data” is defined to include any information that identifies, relates to, or describes a worker, without limiting how that information was collected or inferred — a formulation that captures inferred attributes, behavioral signals, and derived inferences as well as raw personnel records.

At the same time, the drafters carved out several explicit exclusions: spam filters, firewalls, antivirus, identity and access management tools, calculators, and databases or datasets are not ADSs under this section. That exclusion narrows the field for basic infrastructure, but it leaves open gray areas — for example, whether an identity system that also scores access risk or a dataset with embedded scoring logic counts as excluded.

The bill also defines “predictive behavior analysis” and “individualized,” signaling legislative attention to systems that infer or modify behavior and to systems that operate at both individual and class levels.

Finally, the employer definition enumerates a broad list of public entities — cities, counties, special districts, transit districts, school districts, community colleges, and, conditionally, the University of California — plus labor contractors, explicitly bringing government employers and contracting intermediaries within the same definitional frame as private-sector employers. That inclusion matters because many public agencies use vendor-supplied analytics for recruiting, scheduling, or safety, and this text would make such systems subject to whatever restrictions and transparency requirements appear later in the bill or its implementing regulations.

The Five Things You Need to Know

1

Section 1520(c) defines an “automated decision system” to include machine learning, statistical modeling, data analytics, or AI that issues scores, classifications, or recommendations used to assist or replace human decisionmaking and that materially impacts natural persons.

2

Section 1520(c) explicitly excludes spam filters, firewalls, antivirus software, identity and access management tools, calculators, databases, datasets, or other compilations of data from the ADS definition.

3

Section 1520(d)(1) defines “employer” very broadly to include private employers and a long list of public entities (cities, counties, special districts, transit districts, school districts, community college districts) and references the University of California conditionally.

4

Section 1520(d)(2) expands “employer” to include labor contractors, which pulls intermediaries that place or manage workers into the employer definition.

5

Section 1520(j) defines “worker data” as any information that identifies, relates to, or describes a worker regardless of how it was collected or inferred, thereby capturing inferred attributes and derived analytics as well as direct records.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 1520(a)

ADS outputs — what counts as an output

Subsection (a) lists the kinds of outputs an ADS produces: information, data, assumptions, predictions, scoring, recommendations, decisions, or conclusions. Practically, that phrasing ensures that both raw numerical scores and higher-level recommendations are treated the same way; vendors that provide ranked candidate lists, risk scores, or automatic recommendations fall into the definition if the rest of the ADS criteria apply.

Section 1520(b)

Artificial intelligence — a broad, autonomy-focused definition

Subsection (b) defines AI as a machine-based system that varies in autonomy and infers how to generate outputs to influence environments. That description is technology-agnostic and captures systems with differing degrees of human oversight, which prevents vendors from sidestepping the definition by arguing a system is low-autonomy. The emphasis on influencing environments signals the drafters’ intent to capture systems that have real-world effects on workers.

Section 1520(c)

Automated decision system — scope and exclusions

Subsection (c) is the central definitional gate: an ADS must be derived from machine learning, statistical modeling, data analytics, or AI and it must issue simplified outputs used to assist or replace human discretionary decisionmaking that materially impacts people. The subsection also lists express exclusions for basic cybersecurity, identity tools, calculators, and data compilations. The combination — wide inclusion criteria plus narrow, enumerated exclusions — will focus disputes on borderline technologies (for example, a dataset enriched with predictive scores) and on what qualifies as a “material” impact.
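The multi-prong structure of subsection (c) can be read as a checklist. As a rough illustration only — the statutory test is not mechanical, "materially impacts" is undefined, and every field name below is hypothetical rather than a statutory term of art — the gate might be sketched like this:

```python
from dataclasses import dataclass, field

# Hypothetical exclusion list mirroring the tools enumerated in 1520(c).
EXCLUDED_TOOL_TYPES = {
    "spam filter", "firewall", "antivirus",
    "identity and access management", "calculator", "database", "dataset",
}

# Covered derivations named in the ADS definition.
COVERED_DERIVATIONS = {
    "machine learning", "statistical modeling",
    "data analytics", "artificial intelligence",
}

@dataclass
class SystemProfile:
    tool_type: str                                  # e.g., "resume screener"
    derivation: set = field(default_factory=set)    # e.g., {"machine learning"}
    issues_simplified_output: bool = False          # score, classification, recommendation
    assists_or_replaces_decision: bool = False      # role in human decisionmaking
    materially_impacts_persons: bool = False        # undefined threshold; a judgment call

def is_ads(system: SystemProfile) -> bool:
    """Rough reading of the 1520(c) test: covered derivation, plus simplified
    output, plus a decision-assisting role, plus material impact, minus the
    enumerated exclusions. Not legal advice."""
    if system.tool_type in EXCLUDED_TOOL_TYPES:
        return False
    return (bool(system.derivation & COVERED_DERIVATIONS)
            and system.issues_simplified_output
            and system.assists_or_replaces_decision
            and system.materially_impacts_persons)

# A vendor-supplied candidate-ranking tool would likely satisfy every prong:
ranker = SystemProfile(
    tool_type="resume screener",
    derivation={"machine learning"},
    issues_simplified_output=True,
    assists_or_replaces_decision=True,
    materially_impacts_persons=True,
)
print(is_ads(ranker))   # True under this sketch
```

The sketch also makes the dispute zones visible: a nominally excluded tool type short-circuits the test even when the remaining prongs would be satisfied, which is exactly the dataset-with-scores ambiguity the text identifies.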

Section 1520(d)(1)–(2)

Employer — expansive public and contractor coverage

Subsections (d)(1) and (d)(2) define “employer” to include direct and indirect employers, state and local government branches, school districts, special districts, and more; (d)(2) explicitly includes labor contractors. This language brings a wide range of entities — not only private companies but also public agencies and staffing intermediaries — within the ADS framework, so obligations tied to the ADS definition will apply across both sectors unless other text narrows them.

Section 1520(e)

Employment-related decision — what qualifies as material impact

Subsection (e) enumerates the types of decisions treated as employment-related: wages, benefits, hours, hiring, discipline, promotion, termination, assignment of work, training access, productivity requirements, and workplace health and safety. By listing these concrete employment outcomes, the section clarifies many of the contexts where an ADS’s output would be considered materially impactful — and therefore within scope — while leaving room for interpretation about borderline operational uses.

Section 1520(h)–(j)

Predictive behavior, individualized systems, and worker data

Subsections (h) through (j) define specialized concepts: predictive behavior analysis (systems that predict, infer, or modify behavior or emotional state), “individualized” (specific to an individual or to groups with shared characteristics), and “worker data” (any information that identifies, relates to, or describes a worker). Together these definitions expand coverage beyond explicit personnel files to inferred psychological states and grouped inferences, meaning analytics used to infer engagement, risk, or propensity could be treated as regulated outputs.


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Workers (employees and independent contractors) — The worker and worker-data definitions explicitly cover independent contractors and inferred data, which could extend protections or transparency to gig and contract workers who are monitored or scored.
  • Labor and civil-rights advocates — Clear definitions of predictive behavior analysis and individualized systems allow advocates to target specific types of intrusive surveillance or behavior-modifying tools in policy or litigation.
  • Regulators and compliance officers — Having enumerated definitions reduces ambiguity when deciding whether a system falls under regulatory obligations, enabling more consistent enforcement and guidance drafting.
  • Public-sector employees and unions — The explicit inclusion of state and local public entities, school districts, and other government bodies brings many public-sector tools into the same definitional framework as private-sector systems, supporting consistent worker protections across sectors.

Who Bears the Cost

  • HR-technology vendors and data-analytics firms — Vendors that supply scoring, ranking, or predictive systems to employers will face broader compliance exposure because their outputs are explicitly captured as ADS outputs and worker data can include inferred attributes.
  • Employers (private and public) — Organizations that use analytics for hiring, scheduling, evaluation, or discipline will need to inventory systems, assess whether they meet the ADS threshold, and prepare for compliance actions tied to subsequent substantive rules.
  • Labor contractors and staffing agencies — By including labor contractors in the employer definition, the bill assigns obligations to intermediaries that place or manage workers, potentially increasing contract and operational complexity.
  • IT and security teams — The enumerated exclusions for infrastructure tools create technical boundary issues; teams will need to segregate or document which components are pure infrastructure and which perform scoring or inference to avoid unexpected ADS classification.
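The boundary problem flagged for IT and security teams lends itself to a simple triage pass over a system inventory. The sketch below is illustrative only, not legal advice: the categories, field names, and bucketing rule are assumptions, and the point is simply that a nominally excluded tool that also scores or infers belongs in a review queue rather than in the excluded pile.

```python
# Hypothetical exclusion categories mirroring the tools enumerated in 1520(c).
EXCLUDED_CATEGORIES = {
    "spam filter", "firewall", "antivirus",
    "identity and access management", "calculator", "database", "dataset",
}

def triage(inventory: list[dict]) -> dict[str, list[str]]:
    """Bucket systems three ways: 'excluded' pure infrastructure, 'review'
    for gray areas (excluded category but scoring/inference present), and
    'in_scope' for everything else pending a full ADS analysis."""
    buckets = {"excluded": [], "review": [], "in_scope": []}
    for system in inventory:
        nominally_excluded = system["category"] in EXCLUDED_CATEGORIES
        does_inference = (system.get("produces_scores", False)
                          or system.get("infers_behavior", False))
        if nominally_excluded and does_inference:
            buckets["review"].append(system["name"])    # e.g., IAM tool with risk scores
        elif nominally_excluded:
            buckets["excluded"].append(system["name"])
        else:
            buckets["in_scope"].append(system["name"])
    return buckets

inventory = [
    {"name": "perimeter-fw", "category": "firewall"},
    {"name": "iam-risk", "category": "identity and access management",
     "produces_scores": True},
    {"name": "shift-optimizer", "category": "scheduling analytics",
     "infers_behavior": True},
]
print(triage(inventory))
# {'excluded': ['perimeter-fw'], 'review': ['iam-risk'], 'in_scope': ['shift-optimizer']}
```

Documenting which components land in which bucket, and why, is the kind of segregation record the bullet above anticipates.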

Key Issues

The Core Tension

The central tension is between broad, protective coverage for workers (including contractors and inferred data) and the legal and operational uncertainty that broad, technical definitions create for employers and vendors. Protecting workers from opaque, behavior-modifying systems argues for wide definitions, but wide definitions also risk ensnaring benign infrastructure and useful automation, producing compliance costs and legal ambiguity without clear bright-line standards.

The section is consequential precisely because it is only definitional: it will determine the reach of any substantive obligations or prohibitions placed on ADSs later in the code or via regulation. That raises two implementation challenges.

First, the phrase “materially impacts” is consequential but undefined here; regulators and courts will have to develop a workable threshold for materiality in employment settings, which could vary by outcome (hiring vs. scheduling) and by worker population. Second, the broad worker-data formulation — covering any information that identifies, relates to, or describes a worker “regardless of how…collected, inferred, or obtained” — sweeps in inferred signals and third-party enrichments, creating data-governance and minimization tensions for employers and vendors.

The explicit exclusions (spam filters, firewalls, identity and access management, etc.) narrow the field only if those excluded tools do not perform or feed into scoring or behavioral inference. In practice, many HR and security systems blur the line: an identity system that produces risk scores, or a dataset augmented with predictive variables, could be treated as an ADS despite the literal exclusion for “databases.” That ambiguity will push disputes into technical forensics about how systems are built and whether they “issue” outputs as defined.

Finally, the expansive employer definition that pulls in public entities and labor contractors raises administrative questions: which administrative body enforces ADS rules across this diverse set of actors, and will enforcement be uniform or fragmented across agencies?
