Google announces Axion, its first Arm-based CPU for data centers

Google Cloud Next 2024 has begun, and the company is opening the event with some big announcements, including its new Axion processor. Axion is Google’s first Arm-based CPU designed specifically for data centers, built on Arm’s Neoverse V2 architecture.

According to Google, Axion performs 30 percent better than the fastest general-purpose Arm-based instances available in the cloud today and 50 percent better than the most recent comparable x86-based VMs. Google also claims it’s 60 percent more energy efficient than those same x86-based VMs. The company is already using Axion in services like Bigtable and Google Earth Engine, with plans to expand to more in the future.

The release of Axion could bring Google into competition with Amazon, which has led the field of Arm-based CPUs for data centers. Amazon’s cloud business, Amazon Web Services (AWS), released its Graviton processor back in 2018 and followed with second and third iterations over the next few years. Fellow chip developer NVIDIA released its first Arm-based CPU for data centers, named Grace, in 2021, and companies like Ampere have also been making gains in the area.

Google has been developing its own processors for several years now, but they’ve been primarily focused on consumer products. The original Arm-based Tensor chip first shipped in the Pixel 6 and 6 Pro smartphones, which were released in late 2021, and subsequent Pixel phones have all been powered by updated versions of the Tensor. Prior to that, Google developed the Tensor Processing Unit (TPU) for its data centers: the company started using TPUs internally in 2015, announced them publicly in 2016, and made them available to third parties in 2018.

Arm-based processors are often a lower-cost and more energy-efficient option. Google’s announcement came right after Arm CEO Rene Haas issued a warning about the energy usage of AI models, according to the Wall Street Journal. He called models such as ChatGPT “insatiable” in their need for electricity. “The more information they gather, the smarter they are, but the more information they gather to get smarter, the more power it takes,” Haas stated. “By the end of the decade, AI data centers could consume as much as 20 percent to 25 percent of US power requirements. Today that’s probably four percent or less. That’s hardly very sustainable, to be honest with you.” He stressed the need for greater efficiency in order to maintain the pace of breakthroughs.

This article contains affiliate links; if you click such a link and make a purchase, we may earn a commission.

