SB 384 makes it unlawful in California to sell, license, provide, or use a “price‑setting algorithm” when the actor intends or reasonably expects it will be used by two or more competitors in the same market and the algorithm processes competitors’ confidential or nonpublic input data to set prices, supply, rent, or occupancy. The prohibition covers algorithm vendors and users and treats multiple users and repeated use as separate violations.
The bill creates a civil enforcement regime led by the Attorney General, district attorneys, and local city or county counsel that can seek injunctions, restitution, actual damages, and civil penalties up to $1,000 per violation, plus attorneys’ fees for the public enforcers. It also includes an affirmative defense for users who can prove they exercised due diligence and obtained written assurances that the algorithm does not process nonpublic input data, and it defines key terms such as “nonpublic input data,” “price‑setting algorithm,” and “artificial intelligence.”
At a Glance
What It Does
SB 384 prohibits offering or using software or AI that processes confidential competitor inputs to produce pricing, supply, rental, or occupancy strategies when it’s intended for use by multiple competitors in the same market. The statute counts each licensed user and each month of unlawful use as separate civil violations.
Who It Affects
Algorithm vendors and platforms that build or distribute dynamic‑pricing or rental‑management tools, commercial operators and landlords who use such tools, legal and compliance teams at firms that price at scale, and state and local prosecutors charged with enforcement.
Why It Matters
The bill targets algorithmic channels that can facilitate tacit collusion without requiring a traditional agreement; it arms public enforcers with a statutory vehicle to pursue algorithm‑driven coordination and forces vendors and buyers to document data provenance and compliance practices.
What This Bill Actually Does
SB 384 draws a legal line around algorithms that use confidential inputs from competitors to produce actionable pricing or rental recommendations. It makes no distinction between a bespoke machine‑learning model and a simpler algorithmic process: if the software ingests nonpublic competitor data and its purpose is to generate pricing, supply, rent, or occupancy strategies—and the provider intends or reasonably expects multiple competitors will use it—the statute bars selling, licensing, providing, or using that tool in California.
The bill builds in two enforcement levers. First, it multiplies liability exposure: for sellers and licensors, every authorized user is a separate violation; for users, each calendar month of unlawful use is a separate violation.
Second, it centralizes civil enforcement with state and local public prosecutors who can seek injunctive relief, restitution, actual damages, and up to $1,000 per violation; those public plaintiffs recover their attorneys’ fees if they prevail. Private plaintiffs do not receive the same fee-shifting remedy under this text.

Practically, the statute allows an affirmative defense for users who can prove—by the evidentiary standard stated in the text—that they performed reasonable due diligence and obtained written assurances from the provider that the algorithm does not process nonpublic input data.
The bill narrows the definition of nonpublic input data by excluding information collected more than one year before the algorithm’s use, and it expressly excludes multiple listing services from the definition of a price‑setting algorithm.

The text also clarifies that the new statute does not displace existing antitrust law and defines “artificial intelligence” and “competitors” for the statute’s purposes. Together, the definitions and remedies create compliance pressure on vendors to document data sources and on buyers to demand warranties and contractual protections—while leaving open tensions about scope, proof, and how prosecutors will prioritize cases.
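The one‑year lookback in the definition can be sketched as a simple date check. This is an illustrative reading, not legal advice: the bill text governs how “one year” is measured, and the 365‑day window, function name, and fields below are assumptions for the sketch.

```python
from datetime import date, timedelta

def is_nonpublic_input_data(collected_on: date, used_on: date,
                            is_confidential: bool) -> bool:
    """Illustrative check (an assumed reading of the statute): data counts
    as "nonpublic input data" only if it is confidential AND was collected
    no more than one year (assumed here to mean 365 days) before the
    algorithm's use."""
    if not is_confidential:
        # Public data is outside the definition regardless of age.
        return False
    return (used_on - collected_on) <= timedelta(days=365)

# A confidential record collected 14 months before use falls outside
# the definition; one collected two months before use falls inside it.
print(is_nonpublic_input_data(date(2024, 1, 1), date(2025, 3, 1), True))   # False
print(is_nonpublic_input_data(date(2025, 1, 1), date(2025, 3, 1), True))   # True
```

The sketch shows why the carve‑out matters operationally: a compliance team filtering an ingestion pipeline by collection date could, on this reading, retain older competitor data while excluding recent confidential inputs.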
The Five Things You Need to Know
The law treats each authorized user of a prohibited algorithm as a separate violation for sellers and licensors, multiplying exposure for vendors with broad customer bases.
For users, liability accrues by calendar month—each month of unlawful use counts as an independent violation, increasing potential cumulative penalties.
The statute excludes data that was collected more than one year prior to the algorithm’s use from the definition of “nonpublic input data,” effectively allowing algorithms to rely on older competitor data.
Multiple listing services (MLS) are explicitly carved out of the “price‑setting algorithm” definition, so MLS platforms are not swept into the ban.
Enforcement is civil and public—only the Attorney General, district attorneys, city attorneys, or county counsel may sue, and the court awards attorneys’ fees to those public enforcers if they prevail.
Section-by-Section Breakdown
Broad prohibition on sale, provision, or use of certain price‑setting algorithms
Subdivision (a) establishes the core offense: anyone who sells, licenses, provides, or uses a price‑setting algorithm with the intent or a reasonable expectation it will be used by multiple competitors in the same market is barred if the algorithm processes competitors’ confidential or nonpublic inputs to set prices, supply levels, rents, or occupancy. The provision combines an objective act (sale, license, provision, use) with a mental‑state element (intent or reasonable expectation) and ties liability to the algorithm’s data inputs rather than to any explicit agreement among competitors.
Affirmative defense tied to written assurances and due diligence
Subdivision (b) creates an affirmative defense for users who can show they performed reasonable due diligence and secured written assurances from the provider that the algorithm does not process nonpublic input data. The provision requires a user to shoulder an evidentiary showing—language in the bill conflates standards (see The Fine Print)—meaning buyers must document diligence steps and obtain supplier warranties to rely on the defense.
Violation counting: per user for vendors, per month for users
Subdivision (c) dictates how violations are quantified: for vendors, every authorized user or licensee counts as a distinct violation; for users, each calendar month of unlawful operation is a separate violation. This construct amplifies penalty exposure for vendors with many customers and for continued users, turning a single noncompliant product into a stream of violations.
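The multiplier effect of this counting rule can be illustrated with back‑of‑the‑envelope arithmetic. The customer and month counts below are hypothetical, and the $1,000 figure is a statutory ceiling a court may award up to, not a fixed fine:

```python
PER_VIOLATION_CAP = 1_000  # statutory ceiling in dollars per violation

def vendor_max_exposure(licensed_users: int) -> int:
    # For a vendor, each authorized user or licensee counts as a
    # distinct violation, so exposure scales with the customer base.
    return licensed_users * PER_VIOLATION_CAP

def user_max_exposure(months_of_use: int) -> int:
    # For a user, each calendar month of unlawful use is a separate
    # violation, so exposure scales with duration.
    return months_of_use * PER_VIOLATION_CAP

# Hypothetical figures: a vendor with 500 licensees versus one customer
# that ran the tool for 18 months.
print(vendor_max_exposure(500))  # 500000
print(user_max_exposure(18))     # 18000
```

Even with a modest per‑violation cap, a widely licensed product or a long period of use aggregates into a substantial ceiling, which is the deterrence mechanism the Key Issues section discusses.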
Enforcement remedies and contractual effect
Subdivision (d) authorizes civil actions by state and local public prosecutors seeking actual damages, injunctions, restitution, and civil penalties up to $1,000 per violation, and awards reasonable attorneys’ fees and costs to the prevailing public plaintiff. Subdivision (e) invalidates any contract provisions that conflict with the statute. Together those clauses mean vendors cannot contract around statutory restrictions and face public enforcement rather than private class actions under this text.
Non‑displacement of antitrust law
The bill explicitly states it does not limit the applicability of antitrust laws, signaling that prosecutors and private litigants can still pursue traditional antitrust claims. The new statute operates alongside, not instead of, existing competition remedies, which may include treble damages at the federal level.
Definitions and important carveouts
Subdivision (g) supplies the operative definitions: it defines artificial intelligence, adopts the Clayton Act definition for antitrust laws, defines nonpublic input data but excludes data older than one year, and defines price‑setting algorithms while excluding MLS systems. It also defines competitors for the statute’s application. These definitional choices will govern the reach of the prohibition and the scope of acceptable data sources.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Consumers facing higher prices: by targeting tools that can enable tacit coordination, the statute aims to reduce algorithmically facilitated price convergence that can raise consumer costs across markets.
- Small independent sellers and landlords: firms that do not use centralized algorithmic pricing tools may avoid being drawn into coordinated pricing dynamics and gain competitive breathing room against large platform‑led pricing services.
- State and local enforcers: the Attorney General, district attorneys, and city/county counsel gain a clear statutory tool for pursuing algorithmic coordination without relying solely on traditional antitrust pleadings.
Who Bears the Cost
- Algorithm vendors and SaaS platforms that provide dynamic‑pricing or rental‑management software: they must assess and possibly redesign data ingestion and processing practices, add compliance documentation, and face multiplied liability exposures for each customer.
- Commercial operators and landlords using third‑party pricing tools: buyers must perform due diligence, demand written assurances, and track monthly usage to avoid per‑month liability, increasing legal and operational burdens.
- Legal and compliance teams at affected firms: these departments must draft contractual warranties, maintain provenance records for training data, and manage potential litigation risk, creating ongoing compliance costs.
Key Issues
The Core Tension
SB 384 confronts a classic trade‑off: it seeks to block algorithmic channels that can facilitate tacit collusion and to protect market competition, but in doing so it risks suppressing legitimate, efficiency‑driven pricing tools and imposing compliance and litigation costs on vendors and buyers—forcing regulators and courts to balance preventing covert coordination against preserving lawful algorithmic innovation.
The statute packs several implementation challenges that will matter in practice. First, the mens rea framework—prohibiting conduct when a person has “intent or reasonable expectation” that multiple competitors will use the tool—invites fact‑intensive inquiries into what vendors and buyers knew or should have anticipated.
That is a flexible standard for prosecutors but creates uncertainty for developers of multi‑tenant platforms. Second, the definition of “nonpublic input data” excludes data older than one year, which eases compliance by allowing older competitor information but also encourages reliance on stale datasets that may still enable coordinated outcomes.
Third, the affirmative defense requires written assurances and documented due diligence, but the bill’s evidentiary language is internally inconsistent; parties and courts will likely litigate the exact burden and what qualifies as sufficient due diligence.
Enforcement design creates other tradeoffs. Multiplying violations by user and by month sharply increases potential exposure, yet the statutory per‑violation cap of $1,000 is modest on its face; the real deterrent will come from aggregation, injunctive relief, and reputational harm, not per‑unit fines.
The exclusivity of public enforcement—only state and local prosecutors may sue under this section—concentrates decision‑making about priorities and could leave many harms to be pursued under traditional antitrust law if prosecutors decline to act. Finally, the MLS carve‑out and the statute’s narrow focus on nonpublic competitor inputs leave open many borderline products and services where lawful personalization or publicly sourced dynamic pricing intersects with competitive risk.