UK-based chip designer Arm supplies the architecture for systems-on-a-chip (SoCs) used by some of the world’s largest tech brands, from Nvidia to Amazon to Google parent company Alphabet and beyond, all without ever manufacturing hardware of its own — though that’s reportedly due to change this year.
And you’d think that, with a record-setting $1.24 billion in total revenue last quarter, it might want to keep things steady and keep raking in the cash.
But Arm sees how fast AI has taken off in the enterprise, and with some of its customers delivering record revenue of their own by offering AI graphics processing units that incorporate Arm’s tech, Arm wants a piece of the action.
Today, the company announced a new product naming strategy that underscores its shift from a supplier of component IP to a platform-first company.
“It’s about showing customers that we have much more to offer than just hardware and chip designs, specifically — we have a whole ecosystem that can help them scale AI and do so at lower cost with greater efficiency,” said Arm’s chief marketing officer Ami Badani in an exclusive interview with VentureBeat over Zoom yesterday.
Indeed, as Arm CEO Rene Haas told the tech news outlet Next Platform back in February, Arm’s history of creating lower-power chips than the competition (cough cough, Intel) has set it up extremely well to serve as the basis for power-hungry AI training and inference jobs.
According to his comments in that article, today’s data centers consume approximately 460 terawatt-hours of electricity per year, a figure expected to triple by the end of this decade, and their share could jump from 4 percent of the world’s energy usage to 25 percent — unless more of Arm’s power-saving chip designs, along with their accompanying optimized software and firmware, are used in the infrastructure for these centers.
From IP to platform: a significant shift
As AI workloads scale in complexity and power requirements, Arm is reorganizing its offerings around complete compute platforms.
These platforms allow for faster integration, more efficient scaling, and lower complexity for partners building AI-capable chips.
To reflect this shift, Arm is retiring its prior naming conventions and introducing new product families that are organized by market:
Neoverse for infrastructure
Niva for PCs
Lumex for mobile …