Amazon challenges competitors with on-premises Nvidia ‘AI Factories’

Dec 2, 2025 | Technology

Amazon announced a new product Tuesday called “AI Factories” that allows big corporations and governments to run AWS AI systems in their own data centers. Or as AWS puts it: Customers supply the power and the data center, and AWS plunks in the AI system, manages it, and can tie it into other AWS cloud services.

The idea is to cater to companies and governments concerned with data sovereignty, or absolute control over their data so it can’t wind up in a competitor’s or foreign adversary’s hands. An on-prem AI Factory means not sending their data to a model maker and not even sharing the hardware.

If that product name sounds familiar, it should. That’s what Nvidia calls its hardware systems that are chock-full of tools needed to run AI, from its GPU chips to its networking tech. This AWS AI Factory is, in fact, a collaboration with Nvidia, both companies say.

In this case, the AWS AI Factory will use a combination of AWS and Nvidia technology. Companies that deploy these systems can opt for Nvidia’s latest Blackwell GPUs or Amazon’s new Trainium3 chip. The system uses AWS’ homegrown networking, storage, databases, and security, and can tap into Amazon Bedrock, the AI model selection and management service, and Amazon SageMaker AI, the model-building and training tool.
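For a sense of what “tapping into Bedrock” looks like in practice, here is a minimal sketch using the public Bedrock runtime API via boto3. Whether an on-prem AI Factory exposes exactly this endpoint, region, and model catalog is an assumption on our part; AWS has not published those details, and the model ID below is only an illustrative example.

```python
import json
import boto3

# Assumption: the on-prem AI Factory fronts the same Bedrock runtime API
# as the public cloud; region/endpoint here are placeholders.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Example model ID; what is actually available on an AI Factory deployment
# would depend on the customer's installation.
response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [
            {"role": "user", "content": "Summarize our Q3 incident reports."}
        ],
    }),
)

# The response body is a streaming blob; parse it as JSON to get the text.
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```

The point of the data-sovereignty pitch is that a call like this, and the data in it, would stay inside the customer’s own facility rather than traveling to a shared public-cloud region.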

Interestingly, AWS is far from the only giant cloud provider installing Nvidia AI Factories. In October, Microsoft showed off the first of many planned AI Factories rolling out across its global data centers to run OpenAI workloads. Microsoft didn’t announce at the time that these extreme machines would be available for private clouds. Instead, Microsoft highlighted how it was leaning on a host of Nvidia AI Factory data center tech to build and connect its new “AI Superfactories,” aka the state-of-the-art data centers being built in Wisconsin and Georgia.

Last mo …
