
California AB 1159 limits K–12 ed‑tech use of student data and AI training

Creates a standalone student‑data regime for operators of K–12 sites and apps, restricting commercial uses, requiring retention/deletion rules, and barring the use of pupil data to train generative AI.

The Brief

AB 1159 establishes a K–12‑specific privacy framework that governs “operators” of websites, online services, and apps designed or marketed for K–12 school purposes. The bill defines a broad category of “covered information” (everything from names and contact details to grades, health data, device identifiers, and behavioral/biometric data) and directs what operators may and may not do with that material.

Rather than amending CCPA or FERPA directly, AB 1159 imposes standalone prohibitions and duties: it curtails commercial exploitation of pupil data (including targeted advertising and many sales), requires data minimization and deletion practices, mandates written retention policies, and expressly bars using pupil covered information to train generative AI systems. The law includes limited exceptions for deidentified data, certain assessment uses, and legally compelled disclosures, while preserving obligations under federal education statutes.

At a Glance

What It Does

Creates prohibitions on targeted advertising, profiling for non‑school purposes, and most sales of pupil data; requires operators to implement reasonable security, written retention policies, and deletion practices; and bans using covered pupil information to train generative AI systems. It allows deidentified and aggregated uses for product improvement and demonstration, subject to contractual and technical safeguards.

Who It Affects

Ed‑tech vendors, cloud and software providers that knowingly provide services for K–12, national assessment providers, and local educational agencies that contract with those operators. Small developers of apps built for school use, and the marketplaces that host those apps, will also face compliance obligations and contractual scrutiny.

Why It Matters

AB 1159 shifts compliance from schools to operators and limits common commercial business models (ad targeting, data brokerage, AI training on student data). Vendors that rely on behavioral profiling or reuse of pupil data will need to change product design, contracts, and data governance to keep serving California schools.


What This Bill Actually Does

AB 1159 starts by setting the universe: an “operator” is an entity that knows its site, service, or app is used for K–12 purposes and was designed or marketed for that audience. “Covered information” is intentionally sweeping — it includes not only names and contact details but education records, test scores, special education data, health and biometric information, device identifiers, search activity, photographs, and geolocation. The bill also imports California Civil Code definitions for “artificial intelligence” and “generative artificial intelligence” to anchor later prohibitions.

The bill establishes bright‑line limits on how operators may exploit covered information gathered through K–12 products. It bars targeted advertising based on data collected through those products, forbids building pupil profiles for commercial or non‑school purposes, and restricts the sale of pupil information except in narrow circumstances (notably acquisitions and certain assessment transfers).

Crucially, the statute disallows using any covered information — including persistent identifiers — to train generative AI or to develop AI systems, removing a common downstream use of product logs and behavioral data.

Operational duties sit alongside the prohibitions. Operators must implement reasonable security practices tailored to the sensitivity of pupil data, adopt and publish a written data retention policy explaining purposes and deletion timeframes, and generally retain covered information only as long as needed for the stated purpose.

Schools or local educational agencies can request deletion of data under their control; parents or former pupils can request deletion of certain CCPA‑excluded covered information once the pupil has been unenrolled for 60 days (with operators allowed to require proof). The bill also preserves mandatory permanent pupil records and excludes from deletion the types of records that statute or regulation requires schools to keep.

The law contains carve‑outs that matter in practice. Deidentified and aggregated data may be used to improve products or demonstrate effectiveness, provided the operator follows specific deidentification commitments and contracts that prevent reidentification. National assessment providers get a limited pathway to disclose or receive data for assessment, admissions, or other school‑related purposes.

The statute also defers to federal or state law where disclosure is compelled and explicitly preserves rights under IDEA, FERPA, and the Rehabilitation Act, creating a layered compliance landscape operators must navigate.

The Five Things You Need to Know

1. Operators may not target advertising to pupils based on information collected through K–12 sites, services, or apps, whether on that platform or through other channels.

2. The bill expressly prohibits using any covered information or persistent identifiers collected via K–12 products to train generative AI systems or to develop AI systems.

3. A parent or a former pupil (18+) can request deletion of the operator’s CCPA‑excluded covered information if the pupil has not been enrolled in the local educational agency for at least 60 days; operators may require documentation verifying unenrollment.

4. Sale of pupil information is forbidden except for (a) transfers incident to an acquisition or merger where the successor remains bound by the statute, and (b) transfers by a national assessment provider to schools or higher‑education institutions when strictly for assessment, admissions, or related school/higher‑ed purposes.

5. Operators must publish a written data retention policy (available on request) stating collection purposes, retention rationale, and deletion timeframes, and generally must delete covered information when the purpose for collection is completed, subject to statutory permanent‑record exceptions.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 22584(a)

Definitions and scope — who and what the law covers

This section defines the statute’s key terms: “operator,” “covered information,” and “deidentified information.” It also imports California’s definitions of AI and generative AI. Practical consequence: a broad swath of product telemetry and education records falls within scope if the vendor knows the product is for K–12. The “operator” definition captures primary vendors and entities acting on their behalf; it also brings cloud services into scope when they act as K–12 operators.

Section 22584(b)

Primary prohibitions on commercial uses and AI training

Subdivision (b) lists forbidden activities: targeted advertising tied to data collected by K–12 products; building profiles about pupils for non‑school uses; selling pupil information except in narrow scenarios; disclosures not in service of K–12 purposes; and using covered information to train generative AI or develop AI systems. For vendors, these are operational blockers: telemetry collection, personalization logs, and advertising integrations must be re‑architected or segregated to comply.
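To make those blockers concrete, here is a minimal sketch of a purpose‑gating check a vendor might put in front of downstream data sinks. The enum values and the `authorize_use` helper are hypothetical illustrations, not anything the statute prescribes.

```python
from enum import Enum, auto

class Use(Enum):
    SCHOOL_PURPOSE = auto()        # instruction, assessment, school administration
    TARGETED_ADVERTISING = auto()  # barred by subdivision (b)
    NON_SCHOOL_PROFILING = auto()  # barred by subdivision (b)
    SALE = auto()                  # barred outside the narrow transfer scenarios
    AI_TRAINING = auto()           # barred: no generative AI training or AI development

# Uses that subdivision (b) forbids for covered pupil information.
PROHIBITED_USES = {Use.TARGETED_ADVERTISING, Use.NON_SCHOOL_PROFILING,
                   Use.SALE, Use.AI_TRAINING}

def authorize_use(is_covered_information: bool, use: Use) -> None:
    """Gate a data flow before it reaches a downstream sink.

    Raises if covered K-12 information is routed to a prohibited use.
    Data relying on the deidentification pathway should be cleared
    before this check, not waved through by it.
    """
    if is_covered_information and use in PROHIBITED_USES:
        raise PermissionError(
            f"AB 1159, Section 22584(b): covered pupil data may not be used for {use.name}")
```

The architectural takeaway is the single choke point: telemetry and personalization logs pass through one auditable gate instead of feeding advertising or training pipelines directly.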

Section 22584(c)–(d)

Permitted internal uses, security, deletion, and retention rules

The statute permits operators to use information to maintain and improve their own services but requires reasonable security measures calibrated to the sensitivity of pupil data. Operators must comply with deletion requests from schools and, under a separate path, from parents/former pupils for certain CCPA‑excluded data after 60 days of unenrollment. The law also mandates retention only as long as necessary for stated purposes and requires a written data retention policy; however, it carves out mandatory permanent records and standardized test records from deletion requirements.
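As a rough illustration of that deletion path, the sketch below encodes the 60‑day unenrollment threshold, the operator’s option to require proof, and the permanent‑record carve‑outs; the category labels and function signature are invented for the example.

```python
from datetime import date, timedelta
from typing import Optional

UNENROLLMENT_WAIT = timedelta(days=60)  # statutory minimum before a request ripens

# Invented labels for record types the statute excludes from deletion.
DELETION_EXEMPT_CATEGORIES = {"mandatory_permanent_record", "standardized_test_record"}

def deletion_eligible(unenrolled_on: Optional[date], category: str,
                      proof_of_unenrollment: bool,
                      today: Optional[date] = None) -> bool:
    """Return True if a parent/former-pupil deletion request can be honored."""
    today = today or date.today()
    if category in DELETION_EXEMPT_CATEGORIES:
        return False  # records schools are required to keep are not deleted
    if unenrolled_on is None or not proof_of_unenrollment:
        return False  # the operator may require documentation of unenrollment
    return today - unenrolled_on >= UNENROLLMENT_WAIT
```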

Section 22584(e)–(g)

Research, deidentified and aggregated uses, and assessment exceptions

Operators may disclose covered information where federal/state law requires it, and they may engage in legitimate research under state/federal rules if the data are not used for advertising or profiling. The bill allows deidentified and aggregated pupil data to be used to improve products or to demonstrate effectiveness — but sets contractual and technical expectations for deidentification and recipient obligations. National assessment providers have limited latitude to transfer data to schools and higher‑ed institutions for assessment and admissions purposes.
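To see what the deidentified‑and‑aggregated pathway can look like in practice, here is a minimal sketch: strip direct identifiers, replace the join key with a salted one‑way hash, and report only aggregates. The field names are invented, and hashing alone is not a complete deidentification method; the bill’s pathway also assumes contractual bans on reidentification.

```python
import hashlib
from collections import Counter

# Invented field names for direct identifiers stripped before analysis.
DIRECT_IDENTIFIERS = {"name", "email", "student_id", "device_id", "geolocation"}

def deidentify(record: dict, salt: str) -> dict:
    """Drop direct identifiers; replace the join key with a salted hash."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    digest = hashlib.sha256((salt + record["student_id"]).encode()).hexdigest()
    out["cohort_key"] = digest[:12]  # stable pseudonym for within-dataset joins
    return out

def feature_usage(records: list) -> Counter:
    """Aggregate deidentified records to demonstrate product effectiveness."""
    return Counter(r["feature"] for r in records)
```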

Section 22584(h)–(p)

Carve‑outs, preservation of federal rights, and non‑application

These clauses preserve law‑enforcement authority to obtain data pursuant to legal process, protect operators’ ability to support adaptive or customized learning, and exempt general‑audience sites not designed or marketed for K–12. The section explicitly leaves untouched obligations under IDEA, FERPA, and the Rehabilitation Act, and clarifies that app marketplaces and interactive computer service providers are not responsible for policing third‑party app compliance — limiting the bill’s reach against platform intermediaries.


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • K–12 pupils and their families — by limiting commercial profiling, targeted ads, sales, and the use of sensitive categories (health, special education, biometrics) for non‑school purposes, the bill reduces avenues for commercial exploitation and potential harms from reidentification.
  • Local educational agencies and schools — gain contractual leverage and clearer statutory backstops to demand deletion, retention limits, and security from vendors, simplifying procurement terms and vendor oversight.
  • Privacy‑focused ed‑tech providers and product teams — those that already minimize data collection and avoid advertising or resale models gain a competitive advantage because compliance will be less disruptive to their business models.

Who Bears the Cost

  • Ed‑tech operators and software vendors — must reengineer data flows, change product features that rely on profiling or logging, implement deletion and retention tooling, and rewrite contracts to bind downstream recipients and service providers.
  • Small developers and start‑ups targeting schools — face compliance and verification costs (e.g., deidentification methods, documentation of unenrollment) that may be disproportionate to their resources or customer base.
  • Local educational agencies and schools — while beneficiaries, they may incur administrative overhead to verify unenrollment, manage deletion requests, and audit vendors; they will also have to update procurement and contracting practices.

Key Issues

The Core Tension

The central dilemma is balancing strong, enforceable constraints on commercial exploitation and AI training on pupil data against the legitimate instructional benefits of data‑driven personalization and product improvement: protect privacy and you risk hobbling useful adaptive tools and product innovation; permit broader uses and you expose vulnerable students to profiling, commercial targeting, and reidentification risks.

AB 1159 threads a narrow needle between protecting pupils and preserving useful educational functionality, but it leaves several operational questions unresolved. The bill allows operators to use deidentified data to improve products, yet the deidentification standard hinges on an operator’s reasonable measures and contractual promises, which creates compliance and enforcement ambiguity.

The bill leaves implicit who validates deidentification practices and how regulators will assess claims that data cannot be reidentified, creating risk for vendors that rely on deidentification as a safe harbor.

The statute’s ban on using covered information to train generative AI is blunt and potentially disruptive. At the same time, the bill preserves adaptive learning and customized instruction (subdivision (i)), creating a tension between prohibiting model training on pupil data and allowing on‑platform personalization.

Vendors will need technical approaches (local models, differential privacy, synthetic data) and clear guidance on whether model updates performed on‑device or via deidentified aggregates comply. Finally, enforcement mechanics and penalties are not specified in this text; absent a designated enforcement mechanism or damages framework, compliance incentives will depend heavily on contract remedies and the diligence of schools during procurement.
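As one concrete instance of those technical approaches, here is a minimal sketch of the Laplace mechanism, the standard way to release a counting statistic with differential privacy so that aggregates used for product improvement or evaluation do not expose any individual pupil. The epsilon default is arbitrary, and a real deployment would also track a privacy budget across queries.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponential draws with mean `scale`
    # is distributed as Laplace(0, scale).
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(true_count: int, epsilon: float = 1.0) -> float:
    """Release a counting query (sensitivity 1) with epsilon-differential
    privacy; noise scale = sensitivity / epsilon, so a smaller epsilon
    means stronger privacy and a noisier answer."""
    return true_count + laplace_noise(1.0 / epsilon)

# Example: a feature-usage count of 10,000 released with epsilon = 0.5.
print(dp_count(10_000, epsilon=0.5))
```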
