The Center for AI Safety is urging the U.S. government to require security-focused safety reviews for AI labs seeking federal contracts [1].

The proposal aims to reduce national security risks by ensuring advanced AI models cannot be misused or pose threats to the state [1]. By tying government funding and contracts to safety benchmarks, the advocacy group hopes to create a financial incentive for labs to prioritize security over speed.

Under the proposed framework, AI developers that spend more than $100 million annually on compute would be required to undergo the review [4]. Labs that fail to meet these safety standards would be barred from receiving lucrative government contracts [1].

The group said the measure is necessary to mitigate the risks posed by increasingly powerful AI systems. Because the federal government is one of the largest purchasers of technology, it argued, procurement rules are an effective lever for enforcing safety standards across the industry [1].

The proposal focuses specifically on security-focused reviews rather than general ethics guidelines. This approach targets the potential for models to be weaponized or used for cyberattacks, a critical concern for federal agencies managing sensitive infrastructure [2].

While the U.S. government has previously issued executive orders regarding AI safety, this proposal would move the requirement into the binding legal territory of procurement contracts [3]. This would shift the burden of proof to the developers to demonstrate their systems are safe before they can access public funds [1].
This move represents a shift toward using the U.S. government's purchasing power as a regulatory tool. By targeting labs with high compute spend, the Center for AI Safety is focusing on the 'frontier' models most likely to possess dangerous capabilities, potentially creating a tiered system where only the most secure labs can partner with the federal government.