AWARE Act directs FTC to produce AI chatbot safety resources for minors

Requires the Federal Trade Commission to publish consumer-facing guidance within 180 days to help parents, educators, and children spot unsafe AI chatbot use and understand privacy risks.

The Brief

The AWARE Act requires the Federal Trade Commission to develop and publish educational resources for parents, educators, and minors about the safe and responsible use of AI chatbots. The statute sets a firm 180-day deadline, asks the FTC to consult relevant federal agencies, and prescribes three topic areas: identifying safe versus unsafe chatbot use, privacy and data collection practices, and supervisory best practices for parents.

This bill is narrowly focused and non‑regulatory: it creates a federal consumer‑education mandate rather than new prohibitions or liabilities for companies. It matters because it produces a single, federal set of materials that schools, districts, and parents may rely on — and because the bill’s definitions and deadline shape how broadly those materials will reach and how quickly they must be produced.

At a Glance

What It Does

The bill directs the Federal Trade Commission to develop and make publicly available educational resources about AI chatbots aimed at parents, educators, and minors. It requires those resources to address how to identify safe and unsafe use, privacy and data collection practices, and parental best practices, and it sets a 180‑day delivery deadline after enactment.

Who It Affects

Directly affects the FTC (which must produce the materials), parents and legal guardians, K–12 educators and school administrators, and minors under 18. AI developers are not assigned new legal duties by the text, but schools and advocacy groups are likely to adopt the FTC materials.

Why It Matters

This creates a federal, standardized source of guidance on a high‑visibility technology for children, which can quickly become the practical baseline for schools and families. Because the measure is educational rather than regulatory, its real-world effect will depend on how the FTC designs the materials, whom it consults, and how widely institutions use them.

What This Bill Actually Does

The AWARE Act imposes a short, specific task on the Federal Trade Commission: within 180 days of enactment, the FTC must produce and post educational resources for parents, educators, and minors about interacting safely with AI chatbots. The bill requires the agency to consult “relevant Federal agencies” while developing the materials, but it does not specify which agencies, how consultation must occur, or whether public input is required.

The statute sets three required subject areas for the resources: (1) how to identify safe and unsafe AI chatbot use; (2) privacy and data collection practices; and (3) best practices for parents supervising minors. The bill leaves format and distribution open: the FTC could produce webpages, printable handouts, classroom lesson plans, videos, or multi‑language guides, but the agency must decide what mix fits the target audiences and the 180‑day timeline.

The bill narrows what counts as an AI chatbot by tying the term to the National Artificial Intelligence Initiative Act definition of artificial intelligence and adding a functional test: a system marketed to consumers that engages in interactive, natural‑language communication and generates or selects content in response to user inputs using conversational context.

That wording excludes non‑consumer, non‑conversational, or narrowly task‑specific systems. The bill also defines minors as anyone under 18 and parents to include legal guardians.

Finally, the AWARE Act instructs the FTC to model the materials on the Commission’s “Youville” program referenced in the bill. The statute is an education mandate only: it does not create new enforcement authority against companies, impose labeling requirements, or appropriate funding for the FTC’s work.

How useful the final materials become will hinge on choices the FTC makes about format, distribution partners, accessibility, and whether the agency treats these resources as a starting point for stronger regulatory work or as a stand‑alone public service.

The Five Things You Need to Know

1

The FTC must develop and publish educational resources for parents, educators, and minors about AI chatbots within 180 days of enactment.

2

The agency must consult ‘relevant Federal agencies’ during development, but the bill does not list which agencies or require public comment.

3

Required content areas are limited to: identifying safe vs. unsafe chatbot use; privacy and data collection practices; and parental best practices for supervision.

4

The resources must be modeled on the Commission’s ‘Youville’ program, which serves as the template the FTC is to follow.

5

The statute defines ‘AI chatbot’ as a consumer‑marketed system that uses interactive, natural‑language conversation and generates or selects content in response to user inputs; ‘minor’ is anyone under 18 and ‘parent’ includes legal guardians.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 1

Short title: AWARE Act

This short provision simply names the law the ‘AI Warnings And Resources for Education Act,’ or ‘AWARE Act.’ It has no operative effect beyond signaling the bill’s purpose and providing a concise way to cite the act in implementing guidance.

Section 2(a)

FTC must develop and publish educational resources (180‑day deadline)

This subsection is the core operative command: the Federal Trade Commission must develop and make available to the public educational resources for parents, educators, and minors on safe and responsible AI chatbot use, and it sets a 180‑day deadline from enactment. Practically, the FTC will need to scope the project, decide formats and languages, coordinate with partners (if any), and prioritize delivery to meet the statutory clock.

Section 2(b)

Required contents: safety, privacy, parental supervision

The bill prescribes three topic buckets the FTC must cover: (1) how to identify safe and unsafe chatbot use; (2) privacy and data collection practices; and (3) best practices for parents supervising minors. That constrained list narrows the agency’s obligation to education on behavior and privacy rather than broader topics such as misinformation mitigation, age‑gating technology, or platform practices — though the FTC could choose to contextualize those issues within the required topics.

Section 2(c) and 2(d)

Modeling and definitions

Subsection (c) mandates that the FTC model its materials on the Commission’s ‘Youville’ program named in the bill, directing the agency to follow a specific in‑house example rather than invent an entirely new approach. Subsection (d) supplies definitions: it imports the NAII Act definition of ‘artificial intelligence,’ sets a functional definition of ‘AI chatbot’ tied to consumer‑marketed, conversational systems, defines ‘minor’ as under 18, and makes ‘parent’ include legal guardians. Those definitional choices control which systems and which audiences fall under the guidance.

Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Parents and legal guardians — receive a federally produced, centralized set of talking points and supervisory strategies to manage children’s interactions with conversational AI, reducing the need to vet materials themselves.
  • K–12 educators and school administrators — gain a ready‑made resource set they can adopt or adapt for classroom digital‑literacy lessons and district policies without building materials from scratch.
  • Minors (students) — stand to benefit from age‑appropriate guidance on privacy and safe use that could improve digital literacy and awareness of data collection risks.
  • Child‑safety and consumer advocacy groups — receive an authoritative federal reference they can amplify and integrate into outreach, making coordinated messaging easier.
  • The FTC — gains a clear statutory mandate to produce consumer education on this technology, which can strengthen its public‑education portfolio and shape the baseline discourse.

Who Bears the Cost

  • Federal Trade Commission — must allocate staff time and resources to research, produce, and distribute materials within a tight 180‑day window, with no appropriation specified in the bill.
  • Other federal agencies consulted — agencies asked to consult may need to dedicate subject‑matter experts and time without additional funding or a defined consultation process.
  • School districts and educators — while not mandated, they will likely need to invest time to review, adapt, and incorporate the FTC materials into curricula or parent communications.
  • Parents and guardians — will need to spend time learning and possibly implementing the recommended supervisory practices the materials promote, which could be burdensome for families with limited bandwidth.

Key Issues

The Core Tension

The central tension is between delivering fast, practical guidance that parents and schools can use immediately and the reality that education alone cannot substitute for structural safeguards. The bill empowers the FTC to teach families about risks, but it does not require platform changes or fund comprehensive educational rollouts, so the statute risks producing useful advice without addressing the systemic practices that create those risks.

The bill creates only an educational requirement, not regulatory or enforcement tools. That limits direct concrete protections: the FTC can educate about privacy and unsafe behavior, but it cannot, under this text, require companies to change data practices or impose penalties.

The effect of the statute therefore depends heavily on the quality, reach, and uptake of the materials the FTC produces.

Key implementation questions remain unresolved. The bill requires consultation with “relevant Federal agencies” but does not identify them or set a process, leaving open whether the Department of Education, HHS, FCC, or others will meaningfully shape content.

The 180‑day timeline pressures the FTC to move quickly, which risks favoring broadly framed materials over nuanced, evidence‑based curricula. The statutory instruction to model the materials on the Commission’s ‘Youville’ program gives the agency a template but offers no detail about what characteristics of that program to replicate.

Finally, the definitions — particularly limiting an “AI chatbot” to systems marketed to consumers that use conversational context — exclude many backend AI tools and could leave gaps between the guidance and the full range of AI interactions minors experience.
