California bill lets former child influencers force removal of paid family posts

SB 1247 creates a platform-based request process and a private right of action for adults featured as minors in compensated social media posts.

The Brief

SB 1247 requires social media platforms to offer a clear mechanism for adults who were featured as minors in monetized family posts to request that the family member who posted the material delete or edit it. The bill limits coverage to “paid content” posted by a parent, legal guardian, or family member who received compensation for sharing the post.

If the family poster does not comply, the former child influencer may sue for actual and statutory damages, injunctive relief, and fees. The proposal focuses enforcement on the person who posted the content rather than on the platform, while obligating platforms to operate the request-and-notify system.

At a Glance

What It Does

Establishes a statutory process allowing adults who were featured as minors in monetized family social posts to ask the poster to delete or edit that content, and creates a civil remedy if the poster refuses. The obligation to provide the request mechanism sits with the social media platform; the liability for noncompliance sits with the vlogger (parent/guardian/family member) who posted the paid content.

Who It Affects

Former child influencers now aged 18+, parents or family members who monetize posts that include minors, and social media platforms that must implement a visible request mechanism. Privacy and child-advocacy organizations and attorneys enforcing the law will also be directly involved.

Why It Matters

The bill creates a narrow statutory right for a cohort of adults to reclaim control over monetized childhood images and videos, sets a compliance timeline for removal or editing, and attaches daily statutory damages—introducing both operational requirements for platforms and significant liability risk for family creators.

What This Bill Actually Does

SB 1247 builds a three-party process: the person who wants content removed (the child influencer, now an adult), the person who posted the content (a “vlogger” who is a parent, guardian, or family member and who received compensation), and the social media platform that must provide the request channel. The law applies only to image or video content that qualifies as paid content: posts where the vlogger received compensation for sharing material that features the claimant as a minor.

Under the bill, platforms must offer a clear, conspicuous mechanism through which a qualifying adult can identify specific paid content and submit a request to the poster to delete or edit it. Once the platform receives that request, it has an operational duty to notify the vlogger so the poster knows a removal or edit has been requested; the statute prescribes short, fixed windows for notification and for the poster’s compliance. If the vlogger refuses or fails to delete or edit the content so that the claimant is no longer featured, the former child influencer may sue the vlogger in state court.
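To make that sequencing concrete, the sketch below models the request lifecycle in Python. Every name in it (RemovalRequest, RequestStatus, notify_vlogger) is a hypothetical illustration of how a platform might track the process, not anything drawn from the bill text.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum, auto

class RequestStatus(Enum):
    SUBMITTED = auto()         # claimant filed via the platform's mechanism
    VLOGGER_NOTIFIED = auto()  # platform forwarded notice to the poster
    RESOLVED = auto()          # content deleted or edited as requested
    NONCOMPLIANT = auto()      # compliance window lapsed; suit becomes possible

@dataclass
class RemovalRequest:
    claimant_id: str                # adult who was featured as a minor
    vlogger_id: str                 # parent, guardian, or family member who posted
    content_ids: list[str]          # the specific paid posts the claimant identified
    submitted_on: date
    status: RequestStatus = RequestStatus.SUBMITTED
    notified_on: date | None = None

def notify_vlogger(request: RemovalRequest, today: date) -> None:
    """Platform duty: alert the poster that deletion or editing has
    been requested, which starts the compliance clock."""
    request.notified_on = today
    request.status = RequestStatus.VLOGGER_NOTIFIED
```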

The available remedies include actual damages, statutory per-day damages, injunctive relief to force removal, and recovery of attorney’s fees and costs. When a court evaluates a case, the statute directs it to consider emotional harm, safety risks, loss of control over personal information, and damage to future opportunities. The bill does not create a private right of action against platforms and does not broadly ban family-produced content; it narrowly targets compensated posts that feature the claimant as a minor.

That focus shapes both how platforms will design intake systems and how litigants will frame their proof, e.g., demonstrating that the poster received compensation and that the claimant was featured as a minor at the time of posting.

The Five Things You Need to Know

1. Defines “child influencer” as someone 18 or older who was featured as a minor in paid image or video content on a social media platform.

2. Requires platforms to notify the vlogger of a removal/edit request after the claimant identifies the paid content through the platform’s mechanism.

3. Obligates the vlogger, within ten business days of receiving the platform’s notice, to delete or edit the paid content so the claimant is no longer featured (the deadline is worked through in the sketch after this list).

4. Creates a private right of action against the vlogger that allows recovery of actual damages, injunctive relief, reasonable attorney’s fees and costs, and statutory damages of $3,000 for each day the vlogger remains in violation.

5. Directs courts to weigh emotional harm, safety risk, loss of control over personal information, and harm to future opportunities when determining relief.
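Counting ten business days across weekends is easy to get wrong, so here is a small Python sketch of one plausible reading of item 3’s deadline. It assumes “business day” means any weekday; the bill may define the term differently (for example, by excluding state holidays).

```python
from datetime import date, timedelta

def compliance_deadline(notice_date: date, business_days: int = 10) -> date:
    """Walk forward from the platform's notice date, counting only
    weekdays. Assumes a business day is Monday through Friday; the
    statute may define it differently."""
    current = notice_date
    remaining = business_days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 ... Friday=4
            remaining -= 1
    return current

# Example: a notice received on Friday, March 7, 2025 would give the
# vlogger until Friday, March 21, 2025 to delete or edit the content.
print(compliance_deadline(date(2025, 3, 7)))  # 2025-03-21
```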

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 22583

Definitions – who and what the law covers

This section defines the core terms: “child influencer” (an adult who was featured as a minor in paid content), “paid content” (image or video shared by a vlogger for which the vlogger received compensation), “vlogger” (poster who is a parent, legal guardian, or family member), and adopts the statute’s definition of social media platform by cross-reference. These narrow definitions limit the law’s reach to monetized family posts rather than every photo or video of a minor.

Section 22583.1

Platform duties and takedown/edit workflow

This provision requires platforms to provide a clear, conspicuous mechanism for submitting requests and to notify the vlogger after receipt. It conditions the process on the claimant adequately identifying the paid content so the platform can locate and alert the poster. Practically, platforms must build intake, tracking, and notice systems and design UX that prevents misuse while enabling claimants to point to specific items in large feeds.

Section 22583.2

Civil remedy against vloggers and judicial considerations

This section establishes the private right of action and the relief available: actual damages, statutory damages of $3,000 per day, injunctive relief, and attorney’s fees and costs. It also lists factors a court should consider—emotional harm, safety risk, loss of control, and harm to future opportunities—framing how plaintiffs should plead damages and how judges should assess remedies. The statute targets the poster, not the platform, for statutory liability.
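Because the $3,000 accrues for each day of continued violation, exposure grows linearly and fast. The sketch below runs the arithmetic using the figure as reported in this article; whether accrual counts calendar or business days, and exactly when it starts and stops, is assumed here and would be for courts to settle.

```python
from datetime import date

DAILY_STATUTORY_DAMAGES = 3_000  # dollars per day in violation, as described above

def statutory_damages(deadline: date, resolved_on: date) -> int:
    """Accrue $3,000 for each day past the compliance deadline that the
    paid content stays up unedited. Calendar days are assumed purely
    for illustration."""
    days_in_violation = max(0, (resolved_on - deadline).days)
    return days_in_violation * DAILY_STATUTORY_DAMAGES

# Sixty days past the deadline already reaches $180,000, before any
# actual damages, attorney's fees, or costs are added.
print(statutory_damages(date(2025, 3, 21), date(2025, 5, 20)))  # 180000
```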

Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Former child influencers (now adults) who were featured in monetized family posts — they gain a statutory mechanism to ask for deletion or modification and a path to damages if the poster refuses, restoring control over childhood images that may harm their privacy or prospects.
  • Privacy and child-advocacy organizations — the law supplies a concrete enforcement tool and standards (e.g., the court’s listed harms) to support campaigns and individual representation.
  • Colleges, employers, and professional advisers — indirectly benefit because problematic monetized childhood content becomes easier to remove or neutralize, reducing reputational risk when evaluating candidates.

Who Bears the Cost

  • Vloggers (the parents, guardians, or family members who monetize family content) — they face an obligation to delete or edit paid posts within a short window, and exposure to daily statutory damages and litigation costs if they decline.
  • Social media platforms — while not primary targets for statutory damages, platforms must design, implement, and operate a conspicuous intake and notification system, track requests, and potentially moderate disputes, which raises compliance and engineering costs.
  • Small family-run creator channels — many monetize modestly; the requirement to remove compensated posts can erase long-term content assets and revenue streams and may chill publishing of family-focused material.
  • State courts and defense counsel — the law’s daily-damages structure and subjective harms list may increase litigation volume and complexity, requiring judicial resources to resolve factual disputes about compensation, who qualifies as a family member, and whether plaintiffs were “featured” as minors.

Key Issues

The Core Tension

The central dilemma pits an adult’s interest in erasing or limiting monetized records of their childhood against a parent or family member’s control over content they created and monetized, and against platforms’ role as neutral operators expected to offer a functional intake system without bearing direct damages liability. The law favors reclaiming privacy, but it creates uncertain, potentially heavy legal exposure for family creators and leaves vague operational questions for platforms.

The bill’s narrow scope—limited to paid content posted by parents, guardians, or family members—resolves some overbreadth concerns but creates multiple implementation questions. The statute does not define “compensation” beyond saying the vlogger received it; that leaves open whether nonmonetary benefits (gifts, product discounts, fame, channel growth) count.

The claimant must adequately identify the content so the platform can notify the poster, but the statute offers no verification process to prevent fraudulent claims or disputes over whether the identified item is the same content or a reposted copy held by third parties.

Liability and operational burden are bifurcated: platforms must run the intake/notice system but are not the target of statutory damages, while individual vloggers carry the legal risk. That split reduces direct platform exposure but creates friction where content is widely reshared, hosted off-platform, or embedded elsewhere.

The editing remedy is functionally ambiguous: does “no longer featured” permit cropping, blurring, or audio removal, and who decides whether an edit is adequate? Finally, the statutory $3,000-per-day damages are potentially large and could produce liability far out of proportion to the vlogger’s revenue, raising proportionality questions and likely incentivizing early settlement and aggressive litigation.

Several unresolved conflicts could arise in practice: interplay with copyright (the poster may own the recording), parental rights and family autonomy claims, and cross-border issues when platforms or reposters are outside California. The statute lists harms for the court to consider but gives little procedural guidance on proof standards for those harms.

Regulators and courts will need to interpret those elements and craft evidence rules, and platforms will have to decide how much verification to demand before forwarding notices—too light and they risk misuse, too strict and claimants face procedural barriers.
