Google drops pledge on AI use for weapons

Feb 5, 2025

Alphabet, the parent company of technology giant Google, is no longer promising that it will never use artificial intelligence (AI) for purposes such as developing weapons and surveillance tools.

The firm has rewritten the principles guiding its use of AI, dropping a section which ruled out uses that were "likely to cause harm".

In a blog post, Google senior vice president James Manyika and Demis Hassabis, who leads the AI lab Google DeepMind, defended the move. They argue businesses and democratic governments need to work together on AI that "supports national security".

There is debate amongst AI experts and professionals over how the powerful new technology should be governed in broad terms, how far commercial gains should be allowed to determine its direction, and how best to guard against risks for humanity in general. There is also controversy around the use of AI on the battlefield and in surveillance technologies.

The blog said the company's original AI principles, published in 2018, needed to be updated as the technology had evolved. "Billions of people are using AI in their everyday lives. AI has become a general-purpose technology, and a platform which countless organisations and individuals use to build applications. It has moved from a niche research topic in the lab to a technology that is becoming as pervasive as mobile phones and the internet itself," the blog post said.

As a result, baseline AI principles were also being developed, which could guide common strategies, it said. However, Mr Hassabis and Mr Manyika said the geopolitical landscape was becoming increasingly complex.

"We believe democracies should lead in AI development, guided by core values like freedom, equality and respect for human rights," the blog post said. "And we believe that companies, governments and organisations sharing these values should work together to create AI that protects people, promotes global growth and supports national security."

The blog post was published ju …
