The bill amends 38 U.S.C. §527 to require the Secretary of Veterans Affairs to prescribe regulations establishing standard practices for any grant or pilot program related to suicide prevention or mental health carried out through the Veterans Health Administration. Those practices must set measurable objectives, require a documented methodology for collecting and analyzing evaluation data, create criteria for scaling or ending programs, mandate stakeholder communication, and require post‑program evaluations and dissemination of results.
This is an operational bill, not an appropriation: it standardizes how the VA designs, evaluates, and shares findings from suicide‑prevention pilots and grants. For compliance officers, program managers, and community partners, the bill replaces ad hoc evaluation approaches with a single regulatory baseline—potentially improving comparability across projects but also shifting implementation and reporting responsibilities onto VA and grantees without new dedicated funding.
At a Glance
What It Does
The bill adds a new subsection to 38 U.S.C. §527 directing the Secretary to write regulations that establish standard practices for VHA suicide‑prevention and mental‑health grant and pilot programs. Required practices include objective setting, an evaluation methodology, criteria for expansion, stakeholder communication, end‑of‑program evaluation, and sharing of findings.
Who It Affects
Directly affects the Veterans Health Administration, VA program managers, federal grant recipients and pilot sites focused on veteran suicide prevention or mental health, and external partners the Secretary designates as relevant (for example, academic evaluators or community providers).
Why It Matters
It creates a single regulatory baseline for how VHA pilots and grants must be evaluated and communicated—improving the odds that programs can be compared, scaled, or terminated based on evidence. At the same time, it places new procedural and reporting obligations on VA and grantees and could change how quickly new pilots are stood up.
What This Bill Actually Does
The bill rewrites the substance of how the Department of Veterans Affairs runs, evaluates, and shares results from suicide‑prevention and mental‑health pilots and grants inside the VHA. Rather than leaving evaluation design and reporting to individual program offices or grantees, it forces the Secretary to create a standard set of practices that all such programs must follow unless another law blocks it.
Those practices are procedural: set measurable objectives up front, spell out what data are needed and how they will be collected, and describe how results will be analyzed and used.
The regulations must require communication with stakeholders the Secretary identifies at three points: during program development, at least 30 days before the program starts, and throughout the program’s life. At the end of each program, the VA must conduct an evaluation that validates lessons learned and assesses whether the program’s results generalize beyond the specific pilot context.
The VA must also share program results and best practices with relevant entities.

The bill sets two administrative constraints that shape compliance: the Secretary must issue the implementing regulations within 180 days of enactment, and those standards apply to any relevant grant or pilot regardless of when it began. The combination—tight regulatory timing plus retroactive applicability—means existing projects will face a near‑term compliance deadline to align with the new baseline.
The statute does not authorize new funds or outline enforcement mechanisms for noncompliance; it imposes procedural requirements through the regulatory process instead.
The Five Things You Need to Know
The bill adds a new subsection (b) to 38 U.S.C. §527 requiring the Secretary to establish standard practices for VHA suicide‑prevention and mental‑health grants and pilots.
The regulations must require clear, measurable objectives and a documented methodology covering what evaluation information to collect, how to collect it, and the analysis approach.
The rules must include criteria or standards for deciding whether a pilot should be expanded, extended, or made permanent.
The Secretary must promulgate the regulations within 180 days of enactment, and the standards apply to existing and future programs alike.
The statute mandates stakeholder communication during development, at least 30 days before program start, and throughout the program, plus an end‑of‑program evaluation assessing lessons learned and generalizability.
Section-by-Section Breakdown
Short title
Designates the Act as the "What Works for Preventing Veteran Suicide Act." This is purely nominal but signals the bill’s focus on evidence and replication rather than on funding or substantive clinical directives.
Adds regulatory duty and enumerates required practices
Substantively inserts a new subsection (b) into §527 making the Secretary responsible for prescribing regulations that set standard practices for any VHA grant or pilot program related to suicide prevention or mental health. The provision lists discrete obligations: measurable objectives; a methodology and plan covering what data to collect, sources/methods/timing/frequency, analysis approach, and criteria for expansion; stakeholder communication; end‑of‑program evaluation for lessons learned and generalizability; and sharing of results. Practically, this moves design and evaluation specifications from ad hoc program guidance into a single, binding regulatory framework across the VHA.
180‑day deadline for issuing regulations
Requires the Secretary to publish the regulations within 180 days of the Act's enactment. That timeline compresses the rulemaking and implementation planning cycle, forcing the VA to define the required evaluation standards quickly. The short deadline raises practical implementation questions: internal rule drafting, consultation with stakeholders, and the potential need for interim guidance to affected programs.
Retroactive application to existing programs
Specifies that the prescribed standard practices apply to a grant or pilot program without regard to when it was established. This makes the regulation effectively retroactive in application, compelling ongoing or completed pilots to meet the new baseline for documentation, evaluation, and reporting where feasible.
Who Benefits and Who Bears the Cost
Who Benefits
- Veterans at risk of suicide — Improved program design and mandatory evaluation increase the likelihood that effective interventions are identified and scaled, and ineffective ones are stopped.
- Program evaluators and researchers — Uniform data‑collection and analysis requirements create more comparable datasets and clearer research questions, easing meta‑analysis and replication studies.
- VA leadership and policymakers — A consistent regulatory framework supplies standardized evidence to inform resourcing and scaling decisions across the VHA and strengthens accountability to Congress and stakeholders.
- Community provider partners and grantees — When programs are evaluated with shared standards, proven practices can be adopted more quickly across local providers, improving coordination and reducing duplication.
Who Bears the Cost
- Veterans Health Administration and VA staff — Drafting, implementing, and enforcing regulations within 180 days will require staff time, process changes, and likely training; those are administratively costly and are not funded by the bill.
- Grant recipients and pilot sites — Programs will need to implement new data collection, analysis, and reporting procedures, which may require hiring evaluation capacity or contracting with third parties.
- Small community organizations and subcontractors — Smaller partners may struggle to meet new methodological and reporting standards without technical assistance or budget increases, potentially reducing their ability to participate.
- Program timelines and innovation speed — The need to meet standardized evaluation criteria and pre‑start communication windows may delay the launch of pilots and reduce the flexibility to make rapid, iterative changes during implementation.
Key Issues
The Core Tension
The bill tries to resolve a common policy dilemma: require rigorous, comparable evaluation to know "what works" versus preserve the speed and flexibility needed to deploy and iterate interventions for an urgent public‑health problem. Standardization improves evidence and scaling decisions but can slow pilots, increase costs for community partners, and constrain adaptive, localized solutions—especially when the statute imposes a tight regulatory timeline and applies rules to existing programs without funding support.
The bill demands standardized evaluation and transparency but leaves several implementation choices vague. "Entities determined relevant to the program by the Secretary" is an open‑ended designation that gives the Secretary discretion to include or exclude stakeholders; that preserves administrative flexibility but raises questions about inclusivity and about which external parties (researchers, community partners, state agencies) will see draft methodologies and results. The clause "unless otherwise prohibited by law" correctly preserves privacy and other legal limits, but it also creates uncertainty about what data can be shared or required for evaluation, since the bill provides no explicit crosswalks to HIPAA, the Privacy Act, or other federal confidentiality rules.
The statute requires a short, 180‑day regulatory timeline and applies standards retroactively to existing programs. Those two features prioritize quick normalization of evaluation practices but may impose significant compliance costs on ongoing pilots and grantees.
A central technical challenge the VA will face is outcome measurement: suicide is a low‑base‑rate event, so evaluations typically rely on intermediate outcomes (engagement, access, behavioral markers) and advanced designs to infer impact. The Act does not define acceptable metrics or analytic standards (e.g., minimum sample sizes, use of control groups), nor does it provide funding for capacity building, which risks uneven application or the exclusion of smaller providers who cannot meet rigorous evaluation requirements.