Codify — Article

SB626 (SOCIAL MEDIA Act) requires platforms to run 24/7 law‑enforcement portals and standardized reporting

Creates an FTC Platform Safety Advisory Committee to set uniform metrics, mandates public platform reports, and makes noncompliance an FTC unfair-or-deceptive violation.

The Brief

The bill requires every social media platform to publish a dedicated law‑enforcement portal (including a 24/7 U.S.‑based call line and a homepage link), disclose whether it outsources law‑enforcement compliance, and explain its notice practices for investigations. It also creates a Federal Trade Commission Platform Safety Advisory Committee to design uniform reporting metrics on platforms’ handling of illegal content and their responsiveness to law enforcement, and requires platforms to submit annual public reports under those metrics.

Those reports feed enforcement: the FTC will treat failures to maintain the portal or to submit required reports as violations of the FTC Act’s unfair or deceptive acts and practices provisions. For compliance officers and platform operators, SB626 replaces ad hoc information flows with binding disclosure and reporting obligations and sets a timetable for FTC guidance, rulemaking, and public comparative reporting that could materially change platform operational and transparency requirements.

At a Glance

What It Does

SB626 obligates platforms to establish a publicly linked law‑enforcement portal with specified contact details and a U.S. 24/7 call center, and it tasks an 11‑member FTC advisory committee with developing uniform metrics that platforms must report on annually. The FTC will adopt guidance and can promulgate implementing rules; noncompliance is treated as an unfair or deceptive practice.

Who It Affects

Large and small social media platforms (as the bill defines them), third‑party vendors that handle law‑enforcement compliance, federal and state investigative agencies that rely on platform data, and organizations that track platform transparency. The FTC and law‑enforcement offices will receive standardized data for comparative analysis.

Why It Matters

The bill creates a formal, enforceable channel and data standard for platform–law‑enforcement interactions, potentially changing how platforms route investigations, document referrals for counterfeit or fentanyl‑related activity, and measure legal process response times. Standardized public reporting enables cross‑platform comparisons and regulatory scrutiny.


What This Bill Actually Does

SB626 forces social media platforms to stop relying on informal contacts and hidden procedures when law enforcement seeks assistance. Within 90 days of enactment platforms must publish a law‑enforcement portal and place a link on their homepages; the portal must list the lead contact, disclose outsourced compliance relationships, and provide policies about user notice during investigations.

The bill requires that the portal’s phone number connect to a U.S.‑based call center staffed around the clock, and that the portal provide an email address for inquiries from agencies.

To standardize what gets reported, the Federal Trade Commission must stand up a Platform Safety Advisory Committee of 11 members appointed by the FTC Chair, including representatives from multiple federal law‑enforcement agencies, state and local investigators, a platform representative, transparency and victim‑advocacy representatives, and others named in the text. The committee must recommend reporting metrics — and produce a single, public comparative report one year after enactment and annually thereafter — so agencies and the public can compare how platforms detect, remove, and refer illegal content.

The text lists sample metrics the committee must consider: counts of accounts promoting counterfeit substances or fentanyl (split by user reports vs. platform detection), procedures for referring accounts to law enforcement (including which agencies receive referrals and whether account holders are notified), average monthly volume of legal process received (2703(d) orders, subpoenas, warrants), and average non‑automated response time to subpoenas.

After the committee submits recommendations, the Commission has 30 days to report to Congress which recommendations it accepts and 90 days to issue guidance to platforms; platforms then have 180 days after that guidance to file the mandated public reports.

Enforcement falls to the FTC: violations of the portal or reporting requirements (or of regulations under the Act) are treated as violations of the FTC Act’s prohibitions on unfair or deceptive acts or practices, subjecting platforms — including common carriers and nonprofits in certain respects — to the Commission’s full investigatory and remedial authority. The Act also directs the Commission to adopt any necessary rules under the Administrative Procedure Act framework.
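The staged deadlines above chain off three trigger dates: enactment, the committee's recommendations, and the FTC's guidance. A minimal sketch of how the timetable cascades, using hypothetical trigger dates (the bill fixes only the maximum intervals, not when each trigger occurs):

```python
from datetime import date, timedelta

def sb626_deadlines(enactment: date, recommendations: date, guidance: date) -> dict:
    """Illustrative SB626 compliance timetable. All input dates are
    hypothetical; the statute sets the intervals, not the dates."""
    return {
        # Portal and homepage link due 90 days after enactment (Sec. 2)
        "portal_live": enactment + timedelta(days=90),
        # FTC reports to Congress within 30 days of committee recommendations
        "congress_report": recommendations + timedelta(days=30),
        # FTC guidance to platforms due within 90 days of recommendations
        "ftc_guidance_due": recommendations + timedelta(days=90),
        # First public platform metric reports due 180 days after guidance
        "first_platform_report": guidance + timedelta(days=180),
    }

deadlines = sb626_deadlines(
    enactment=date(2025, 1, 1),        # hypothetical enactment date
    recommendations=date(2026, 1, 1),  # hypothetical committee submission
    guidance=date(2026, 3, 1),         # hypothetical guidance issuance
)
print(deadlines["portal_live"])  # 2025-04-01
```

The point the sketch makes is that only the 90‑day portal deadline runs from enactment itself; every reporting obligation floats downstream of when the committee and the Commission actually act.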

The Five Things You Need to Know

1

Platforms must create a law‑enforcement portal and publish a homepage link within 90 days of enactment.

2

The portal must provide a U.S.‑based, 24/7 call center phone line and an email for law‑enforcement contact.

3

The FTC Chair will appoint an 11‑member Platform Safety Advisory Committee with multiagency, platform, NGO, and victim‑advocate representation to recommend uniform reporting metrics.

4

The Advisory Committee must produce a publicly available, cross‑platform comparative report one year after enactment and annually thereafter.

5

The FTC will treat failures to maintain the portal or submit required reports (or to follow implementing regulations) as violations of the FTC Act’s unfair or deceptive acts or practices provisions.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 1

Short title

Establishes the Act's formal name (SOCIAL MEDIA Act) and sets the stage for the subsequent operative provisions. This is purely nominal but required for citation and drafting consistency.

Section 2(a)

Law‑enforcement portal and homepage link

Mandates that each 'social media platform' create a law‑enforcement portal and put a link to it on the platform homepage within 90 days. Practically, this forces platforms to centralize contact points, making ad hoc email addresses or buried contacts insufficient for statutory compliance.

Section 2(b)

Portal content and availability requirements

Specifies the portal contents: a named lead contact, contact information for outsourced third‑party compliance providers (if any), and a phone number connecting to a U.S.‑based call center staffed 24/7 plus an email. It also requires clear publication of policies such as whether and when users receive notice of law‑enforcement investigations — collapsing several operational choices into discrete disclosures the platform must make publicly available.

Section 3(a)

FTC Platform Safety Advisory Committee: composition and duties

Creates an 11‑member advisory committee appointed by the FTC Chair with specified seats (FTC, DEA, FBI, USMS, DOJ Criminal Division, ICE HSI, state/local investigative reps, a platform rep, a transparency NGO rep, and a victims' advocate). The committee must recommend uniform reporting metrics, advise on updates, and publish an annual comparative report enabling cross‑platform benchmarking.

Section 3(b)–(c)

Metric adoption, guidance, and platform reporting timetable

Requires the Commission to tell Congress within 30 days which committee recommendations it will implement and to issue guidance to platforms not later than 90 days after receiving recommendations. Platforms then must deliver public metric reports within 180 days of that guidance and annually afterward. The metrics the committee must consider include account counts promoting counterfeit substances/fentanyl (by source of detection), referral practices to law enforcement, counts of legal process received, and average non‑automated subpoena response times.

Section 4

Enforcement by the Federal Trade Commission

Treats breaches of the portal or reporting requirements (or regulations issued under the Act) as violations of an FTC rule defining unfair or deceptive acts or practices, granting the FTC standard investigatory and remedial tools. The section explicitly extends enforcement reach to common carriers and some nonprofit organizations and preserves the Commission's other authorities.

Section 5

Definitions

Defines key terms used in the Act, including 'social media platform' (a broad definition covering sites/apps with accounts, user‑generated content, and advertising delivery), 'illicit activity', 'counterfeit substance', and references to controlled substances under federal law. The breadth of the 'social media platform' definition matters because it can sweep in marketplaces and other services that deliver ads or user content.


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Federal, state, and local law‑enforcement agencies — they gain a single, discoverable contact point, standardized metrics for platform responsiveness, and annual comparative reports that improve situational awareness and cross‑platform comparability.
  • Victims and victim‑advocacy organizations — the statute guarantees a seat at the advisory table and public reporting that can surface platform behavior related to trafficking, child exploitation, and drug distribution, potentially improving referral and support pathways.
  • The Federal Trade Commission — receives structured data and statutory authority to enforce portal and reporting requirements, which strengthens its ability to hold platforms to consistent transparency and responsiveness standards.
  • Transparency and research organizations — public, comparable platform metrics enable independent analysis, benchmarking, and advocacy around platform practices for detecting and handling illicit content.

Who Bears the Cost

  • Social media platforms — must implement portals, operate or contract for a U.S. 24/7 call center, prepare and publish annual metric reports, and adjust operational processes to capture required data (development, staffing, legal, and compliance expenses).
  • Third‑party compliance vendors — platforms that outsource law‑enforcement functions must disclose vendor identities and could face contractual and reputational pressure, increasing diligence and contractual compliance costs.
  • FTC and member agencies — the Commission must run an advisory committee, review recommendations, produce guidance, and undertake rulemaking and enforcement actions, creating administrative and budgetary demands that may not be fully funded by the statute.
  • Smaller or specialized platforms and online marketplaces — the broad statutory definition could pull in services without existing law‑enforcement pathways, imposing disproportionate compliance and reporting burdens on entities with limited resources.

Key Issues

The Core Tension

The bill pits the public‑safety interest in standardizing and exposing platform responsiveness to illegal content against operational, privacy, and resource constraints. Better data for law enforcement and for public scrutiny demands intrusive operational transparency and sustained compliance spending, and platforms and regulators must deliver both without creating new security or privacy risks.

SB626 centralizes contact and data but leaves open significant implementation questions. The bill prescribes a high‑level set of metrics and a timetable for committee recommendations and Commission guidance, but it does not specify technical standards, data formats, or evidentiary thresholds; that gap puts heavy weight on the advisory committee and the FTC’s rulemaking to define usable, interoperable measures.

Without careful metric design, platforms could report inconsistent or noncomparable figures (for example, differing methods for counting an 'account' or classifying a 'referral'), producing the appearance of comparability without meaningful substance.

The Act also creates privacy and operational trade‑offs. Public comparative reporting improves transparency but risks revealing law‑enforcement workflows, internal detection algorithms, or operational cadence that could be exploited by bad actors.

Mandating a U.S.‑based, 24/7 call line and disclosure of third‑party compliance vendors increases accountability but also raises security and confidentiality concerns about how sensitive investigative contacts and referrals are handled. Finally, the wide definition of 'social media platform' could extend obligations to marketplaces and niche services, imposing compliance costs significant enough to influence product design and business decisions.
