The bill inserts a new clause into section 503(b) of the Federal Food, Drug, and Cosmetic Act clarifying that “practitioner licensed by law to administer such drug” can include artificial intelligence and machine‑learning (AI/ML) technologies when two conditions are met: (1) state law authorizes the AI to prescribe and (2) the AI/ML product is approved, cleared, or authorized under specified FDA device pathways (510(k), 513, 515, or 564).
This is a focused statutory change with outsized operational consequences. It creates a federal recognition pathway that could enable software and autonomous systems to generate legally valid prescriptions — but only where state scope‑of‑practice rules permit and the product has passed a specified FDA review route.
That mix shifts the regulatory questions from “can the FDA regulate prescribing AI?” to “how will states, the FDA, prescribers, pharmacies, and controlled‑substances rules interact when non‑human systems write prescriptions?”
At a Glance
What It Does
The bill amends FD&C Act §503(b) to state explicitly that an AI or machine‑learning technology counts as a “practitioner licensed by law to administer such drug” if the State has authorized it to prescribe and the technology has FDA clearance, approval, or emergency authorization under section 510(k), 513, 515, or 564.
Who It Affects
SaMD and digital therapeutics developers seeking to have their AI issue prescriptions; state medical licensing boards that set scope‑of‑practice rules; the FDA’s Center for Devices and Radiological Health, which reviews software devices; pharmacies that must validate and dispense prescriptions; and clinicians who may supervise, or be sidelined by, AI systems.
Why It Matters
By tying legal prescribing status to both state authorization and FDA device pathways, the bill creates a two‑track gate: states control who may prescribe, while the FDA determines which AI products meet safety and performance standards — potentially enabling autonomous prescribing in some jurisdictions and complicating cross‑state practice, liability, and controlled‑substances compliance.
What This Bill Actually Does
The Healthy Technology Act of 2025 makes a narrow but consequential change to the Federal Food, Drug, and Cosmetic Act. It adds a new subsection to §503(b) — the provision that governs when a drug can be dispensed only by prescription — to say that “practitioner licensed by law to administer such drug” can include artificial intelligence and machine‑learning technologies.
That inclusion is conditional: the technology must be authorized by the relevant State to prescribe, and it must have received FDA review through one of four device authorities listed in the amendment.
Practically, the bill ties two separate regulatory systems together. First, whether an AI can act as a prescriber depends on state law: medical boards, legislatures, or other state authorities must explicitly authorize non‑human prescribers under their scope‑of‑practice regimes.
Second, the AI product itself must have the appropriate federal device clearance, approval, or emergency authorization: the bill cites the common device pathways (510(k) premarket notification, device classification under section 513, premarket approval under section 515, and emergency use authorization under section 564). In other words, federal recognition of AI as a prescriber is limited to AI products that the FDA has accepted through its existing device review frameworks.

The statute does not define “artificial intelligence” or “machine learning,” nor does it spell out operational safeguards such as required human oversight, logging, or explainability. It also does not amend law outside §503(b): it leaves intact the other legal requirements that bear on prescribing, such as DEA registration for controlled substances, state pharmacy laws, malpractice standards, reimbursement rules, and patient consent requirements. Those other frameworks will determine whether a prescription produced by an AI can actually be filled, billed, or used in regulated contexts like controlled‑substance prescribing.

Implementation will be messy.
States will vary in whether and how they authorize non‑human prescribers; the FDA will need to decide how its device review and postmarket surveillance frameworks apply to adaptive AI systems used to prescribe drugs; and providers, pharmacies, insurers, and law enforcement will have to reconcile practical questions about authentication, liability, and cross‑state use. The bill creates a legal opening for autonomous prescribing in places where both state authorization and FDA review align, but leaves many critical operational and legal details to other actors and future rulemaking.
The Five Things You Need to Know
The bill adds a new paragraph to FD&C Act §503(b) (codified at 21 U.S.C. §353(b)) explicitly including AI/ML technologies within the phrase “practitioner licensed by law to administer such drug” when state law authorizes prescribing by such systems.
An AI/ML technology qualifies only if it is authorized under the State’s statute to prescribe and also has been approved, cleared, or authorized by the FDA under one of these device pathways: 510(k), 513, 515, or 564.
The text does not define “artificial intelligence” or “machine learning,” leaving the scope, capabilities, and limits of qualifying systems unspecified.
The amendment addresses the legal source of a valid prescription under the FD&C Act but does not change other federal or state rules that affect prescribing practice — for example, DEA controlled‑substance authority, pharmacy dispensing standards, or malpractice law.
By limiting federal recognition to FDA‑reviewed devices, the bill forces manufacturers to pursue device pathways (including PMA for high‑risk systems) if they want nationwide legal recognition — but state authorization remains a gating requirement for local use.
Section-by-Section Breakdown
Short title — Healthy Technology Act of 2025
This is a conventional short‑title provision; it has no regulatory effect. It signals congressional intent to frame the change as enabling health‑technology integration, but the operative substance appears in the next section.
Adds AI/ML to the statutory definition of an eligible prescriber
The bill inserts paragraph (6) into §503(b), stating that a “practitioner licensed by law to administer such drug” includes AI and machine‑learning technology so long as two conditions are satisfied: (A) the State has a statute authorizing the AI to prescribe, and (B) the technology is “approved, cleared, or authorized” under the specified FDA device authorities. Mechanically, this makes prescriptions issued by qualifying AI systems valid under the FD&C Act’s prescription requirement.
Specifies which FDA device authorities suffice
The amendment lists four authorities: section 510(k) (premarket notification/clearance), section 513 (device classification), section 515 (premarket approval/PMA), and section 564 (emergency use authorization). That selection covers the common device review routes, from lower‑risk substantial‑equivalence clearances to high‑risk PMAs and EUAs, but it does not reference FDA guidance on Software as a Medical Device (SaMD) or the agency’s ongoing work on adaptive algorithms, leaving interpretation and review standards to the FDA and its reviewers.
Leaves open unanswered operational and legal questions
The amendment is narrow: it sets qualifying conditions but does not define key terms, require human oversight, set authentication standards for prescriptions, address interstate practice, or modify controlled‑substance law. Those gaps mean that state licensure regimes, DEA regulations, pharmacy practice acts, and malpractice frameworks will largely determine whether an AI‑issued prescription can be used in practice.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- AI/SaMD developers and digital therapeutics companies: the bill creates a clear federal recognition pathway that ties legal prescribing status to FDA device review, reducing legal ambiguity for manufacturers whose products receive FDA clearance/approval.
- Health systems and telehealth companies seeking automation: organizations that deploy FDA‑reviewed AI could use these systems to expand access to prescription care where states permit, reducing clinician workload for routine prescribing.
- Patients in underserved or rural areas (conditional): where state law allows and an AI product is FDA‑cleared, patients could gain faster access to prescriptions through automated triage or decision support systems.
- Investors and device consultants: the statutory tie to FDA pathways clarifies the commercial route to market for prescriptive AI, making regulatory strategy and PMA/510(k) planning more central to product valuation.
Who Bears the Cost
- State medical boards and legislatures: they must develop or amend scope‑of‑practice laws, licensing rules, and enforcement mechanisms to authorize and oversee non‑human prescribers.
- FDA and its reviewers: the agency will face additional review and postmarket surveillance demands for AI products intended to prescribe, including questions about adaptive algorithms and lifecycle change control.
- Pharmacies and pharmacists: retail and institutional pharmacies will need processes to authenticate, validate, and document prescriptions issued by non‑human systems, increasing operational and compliance burdens.
- Clinicians and health‑care organizations: providers may face new liability risks and supervision obligations when delegating prescribing to AI, and must manage integration, oversight, and patient consent processes.
- Insurers and payers: payers will need to decide whether and how to reimburse for prescriptions initiated by AI, and to adjust utilization management and fraud‑prevention controls.
Key Issues
The Core Tension
The central dilemma pits faster, scalable access to prescription care through AI against patient safety, accountability, and consistent regulatory control. The bill lowers one barrier (federal recognition of FDA‑reviewed AI as a prescriber) while leaving intact, and in some respects intensifying, the practical and legal uncertainties that arise from state licensure variation, controlled‑substance regimes, liability allocation, and the technical challenge of regulating adaptive software.
The bill threads federal and state authority together but leaves the hard governance questions unresolved. By making an AI’s status as a “practitioner” conditional on state authorization plus FDA review, it creates a patchwork: manufacturers must both secure federal device review and track a shifting array of state scope‑of‑practice rules.
That combination increases regulatory complexity and could fragment markets — a company with FDA clearance may be legally barred from enabling prescribing in states that do not explicitly authorize non‑human prescribers.
The amendment also punts on critical safety and accountability mechanisms. It does not require human supervision, define acceptable levels of explainability, establish audit or logging standards, or specify how continuously learning models should be regulated post‑clearance.
These omissions raise real risks: adaptive algorithms can change behavior after clearance, pharmacies and payers need reliable authentication to prevent fraud, and malpractice standards will struggle to incorporate non‑human decision‑makers. Separate federal frameworks — notably DEA rules for controlled substances, Medicare/Medicaid reimbursement rules, and state pharmacy statutes — will determine whether an AI‑issued prescription can be filled or paid for, which may yield inconsistent outcomes across jurisdictions.