Microsoft, Google, xAI give US access to AI models for security testing


The deal comes days after the Pentagon announces an agreement with seven tech giants to use AI in classified systems.

By AP and Reuters | Published On 5 May 2026

Tech giants Microsoft, Google and xAI say they will allow the United States federal government access to their new artificial intelligence models for national security testing.

The Center for AI Standards and Innovation (CAISI) at the Department of Commerce announced the agreement on Tuesday amid increasing concerns about the capabilities that Anthropic's newly unveiled Mythos model could give hackers.

Under the new agreement, the US government will be allowed to evaluate the models before deployment and conduct research to assess their capabilities and security risks.

The agreement fulfils a pledge the administration of US President Donald Trump made in July to partner with technology companies to vet their AI models for "national security risks".

Microsoft will work with US government scientists to test AI systems "in ways that probe unexpected behaviors", the company said in a statement. Together, they will develop shared data sets and workflows for testing the company's models, the statement said.

Microsoft signed a similar agreement with the United Kingdom's AI Security Institute, according to the statement.

Concern is growing in Washington over the national security risks posed by powerful AI systems. By securing early access to frontier models, US officials aim to identify threats ranging from cyberattacks to military misuse before the tools are widely deployed.

The development of advanced AI systems in recent weeks, including Anthropic's Mythos, has created a stir globally, including among US officials and corporate America, over their ability to supercharge hackers.

"Independent, rigorous measurement science is essential to understanding frontier AI …
