The bill tasks the National Institute of Standards and Technology (NIST) with developing standards and guidelines to make open Government data assets “artificial intelligence-ready.” NIST must consult Commerce, OSTP, OMB, and other agencies, publish proposed standards for at least 60 days of public comment, and finalize them within one year of enactment. The standards must cover data quality, stewardship, metadata, documentation, intellectual-property management when federal data is combined with proprietary data, machine-readable formats, tools to parse datasets, and privacy protections.
Once NIST issues the standards, the Director must submit them to OSTP; the President, acting through OSTP and in consultation with OMB, must then require each agency head to adopt the standards. The bill also amends the National Science and Technology Policy Act to force agency adoption, require interoperability across agency systems, and require agencies to account for the standards in major IT and high-performance computing procurements.
A targeted NOAA provision directs the agency to prepare its operational forecasting datasets for AI/ML integration and to brief Congress annually for five years.
At a Glance
What It Does
Requires NIST to develop baseline, adaptable standards that make open federal datasets usable for AI development, including formats, metadata, data-quality practices, IP guidance, and privacy safeguards. The standards must be published for public comment and updated at least every two years.
Who It Affects
Federal data stewards across executive agencies, NIST, OSTP and OMB, AI developers that rely on public datasets, vendors supplying data-processing tools, and NOAA’s operational forecasting programs.
Why It Matters
Establishes a government-wide baseline for making federal data consumable by AI systems, ties those standards into procurement and IT acquisition, and creates a specific mandate to ready NOAA forecasting data—shaping both public-sector data publishing and private-sector model training inputs.
What This Bill Actually Does
The bill inserts a new section into the NIST Act directing the NIST Director to develop standards and guidelines so that open Government data assets are usable by AI developers. NIST must consult Commerce, OSTP, OMB, and other relevant agencies and produce baseline standards that agencies can adapt to mission needs while preserving interoperability.
Those standards must address practical topics—data quality, stewardship, metadata, documentation, and handling intellectual property when federal and proprietary data are combined.
NIST’s output must define what “artificial intelligence-ready” means based on developer needs and public input gathered through notice-and-comment. The law requires datasets to be available for download via public websites, web-scraping, or other practicable methods; to be accurate as of publication; to be human-readable; to be provided in open, machine-readable formats with publicly available tooling to decode or process them; and to include privacy protections.
NIST must also recommend measurements to evaluate whether agencies are improving dataset utility for AI.

After NIST finalizes the standards, it must send them to OSTP; the President, through OSTP and coordinating with OMB, must issue a requirement that agency heads adopt the standards. Agencies must then adopt the standards, keep agency-specific adaptations interoperable across federal systems, and ensure that major IT and high-performance computing procurements explicitly account for the standards’ requirements.
NIST must publish initial standards within one year of enactment and review and consider revisions at least every two years.

Finally, the bill includes a domain-specific rule for the National Oceanic and Atmospheric Administration: once the government-wide standards are adopted, NOAA must ensure its datasets for analyses, forecasts, in-situ and satellite observations, and other critical environmental observations support AI/ML integration for operational forecasting. The NOAA Under Secretary must brief two congressional committees annually for five years on implementation progress.
The Five Things You Need to Know
NIST must publish proposed standards and a request for feedback in the Federal Register and provide at least 60 days for public comment before finalizing.
NIST must deliver final standards to OSTP within one year of enactment; the standards must be reviewed for revision at least every two years thereafter.
Standards must require that open Government data assets, to the greatest extent practicable, be downloadable (including by web-scraping), accurate on publication, human-readable, available in open machine-readable formats, and accompanied by publicly available tools to process them.
The President, acting through OSTP and in consultation with OMB, must issue a mandatory requirement that heads of federal agencies adopt the standards; agencies must keep any agency-specific adaptations interoperable and account for the standards in major IT and high-performance computing acquisitions.
NOAA must adapt its operational forecasting datasets (model outputs, in-situ and conventional observations, satellite datasets, and critical environmental observations) for AI/ML use and brief Congress annually for five years on progress.
Section-by-Section Breakdown
NIST tasked to develop AI-ready standards for open government data
This provision gives NIST the explicit mandate to create baseline standards and guidelines to make open Government data assets AI-ready. It prescribes consultation with Commerce, OSTP, OMB, and other agencies, and requires the standards to cover interoperability, data quality, stewardship, metadata, documentation, and intellectual-property management when federal and proprietary data are combined. Practically, this centralizes standard-setting at NIST while allowing agencies to tailor the baseline to mission-specific needs.
Minimum technical and access requirements for datasets
The bill lists minimum expectations for open datasets: availability for download (including by web-scraping), publication accuracy, human readability, open machine-readable formats, and accompanying public software tools. It also requires that datasets be secure and protective of individual privacy. These are functional obligations rather than prescriptive file formats, leaving room for technological evolution while setting clear publication behaviors agencies must adopt.
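To make these functional obligations concrete, here is a hypothetical sketch (not drawn from the bill's text) of how an agency publishing pipeline might check a dataset record against requirements of this kind. The field names, accepted formats, and checks are illustrative assumptions, not statutory criteria.

```python
# Hypothetical pre-publication checklist for an open dataset, illustrating
# the kinds of functional requirements the bill describes. Field names,
# formats, and checks are invented for illustration only.
import json

OPEN_MACHINE_READABLE = {"csv", "json", "parquet", "netcdf"}  # example formats

def ai_ready_issues(record: dict) -> list[str]:
    """Return a list of problems that would block publication."""
    issues = []
    if record.get("format", "").lower() not in OPEN_MACHINE_READABLE:
        issues.append("not in an open, machine-readable format")
    if not record.get("download_url"):
        issues.append("no public download endpoint")
    if not record.get("metadata", {}).get("description"):
        issues.append("missing human-readable description")
    if not record.get("tooling_url"):
        issues.append("no publicly available tool to decode/process the data")
    if record.get("contains_pii", True):  # assume PII until reviewed
        issues.append("privacy review not completed")
    return issues

record = json.loads("""{
  "format": "csv",
  "download_url": "https://example.gov/data/obs.csv",
  "metadata": {"description": "Hourly surface observations"},
  "tooling_url": "https://example.gov/tools/obs-reader",
  "contains_pii": false
}""")

print(ai_ready_issues(record))  # [] -> ready to publish
```

The point of the sketch is that the bill's obligations are behavioral (publish this way, with these artifacts), so compliance can be checked programmatically without mandating any specific file format.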
Notice-and-comment, timelines, and recurring reviews
NIST must publish proposed standards and take public comments for at least 60 days; comments must be publicly accessible and considered in finalizing standards. NIST must publish final standards in the Federal Register within one year of enactment and then consider revisions no less frequently than every two years, using the same notice-and-comment approach. This creates a recurring rulemaking rhythm and a formal feedback loop with stakeholders.
OSTP/OMB/President make agency adoption mandatory and link to procurement
After NIST submits standards to OSTP, the President (via OSTP and in consultation with OMB) must issue a requirement directing agency heads to adopt them. Agency heads must implement the standards, ensure agency-specific adaptations remain interoperable across federal systems, and make sure major IT and high-performance computing acquisitions explicitly account for the standards. This ties standard adoption to both executive direction and procurement decisions.
NOAA-specific AI/ML readiness for operational forecasting
Once government-wide standards are adopted, NOAA’s Under Secretary must ensure NOAA’s datasets (model analyses/forecasts/reanalyses, in-situ and conventional observations, satellite datasets, and environmental observations critical for forecasting) are AI/ML-ready for operational forecasting. NOAA must brief the Senate Commerce Committee and the House Science Committee annually for five years on implementation progress, creating a focused delivery and oversight mechanism for environmental data.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- AI developers and modelers: Gain higher-quality, machine-readable federal datasets and accompanying tooling that reduce preprocessing time and lower barriers to training and validating models.
- Researchers and academic institutions: Get clearer metadata and documented provenance, improving reproducibility and enabling more rigorous scientific analysis using federal datasets.
- NOAA operational forecasting programs and emergency managers: Receive datasets structured for AI/ML integration, which can accelerate development of improved forecasting tools and decision-support systems.
- Data-tool vendors and intermediaries: Gain new market opportunities to build ingest, validation, and conversion tools aligned to NIST standards and to offer services that help agencies comply.
Who Bears the Cost
- Federal agencies (data stewards): Face the direct costs of remediating, reformatting, documenting, and securing existing datasets to meet standards, plus staff time for interoperability work and procurement adjustments.
- NIST, OSTP, and OMB: Take on sustained rulemaking, coordination, and oversight responsibilities, requiring staff and technical resources to run the notice-and-comment cycles and interagency reviews.
- Procurement teams and IT contractors: Must adapt acquisition specifications and deliverables to explicitly include NIST standard requirements, which could increase procurement complexity and up-front costs.
- Taxpayers/legislative appropriators: May indirectly fund the remediation and tooling work if agencies request budget increases to comply with the new mandates.
Key Issues
The Core Tension
The central dilemma is maximizing the utility of federal data for AI (speeding innovation and interoperability) while protecting privacy, intellectual property, and mission-specific constraints, all without a clear funding or enforcement framework. Making data easier to use for models imposes real conversion and governance costs on agencies that the bill does not directly resolve.
The bill attempts to balance a uniform federal baseline with agency flexibility, but it leaves several operational questions open. Key terms — notably “artificial intelligence-ready” — are to be defined by NIST based on stakeholder input, which gives NIST discretion but creates uncertainty for agencies planning budgets and timelines.
The requirement that datasets be available via web-scraping and other methods improves accessibility but could clash with agency security practices, contractual restrictions, or third-party licensing arrangements attached to derivative or co-produced datasets. The mandate to provide publicly available software tools to decode or process data is practical for adoption but raises maintenance and support expectations that agencies or vendors must meet.
Another tension is fiscal: the bill imposes an unfunded compliance burden. Agencies with large legacy holdings (e.g., scientific archives, sensor networks) will need personnel and technical investment to convert formats, add metadata, and integrate privacy-preserving measures; the text does not create a funding stream or timeline for agencies to request appropriations.
Measuring success is also underspecified: the bill requires NIST to recommend measurements but leaves open who enforces, audits, or sanctions noncompliance and how progress will be scored across heterogeneous agencies. Lastly, public comment processes can advantage well-resourced stakeholders, potentially skewing definitions and technical choices toward incumbent private-sector preferences rather than smaller researchers or civil-society priorities.