Google announces Axion, its first Arm-based CPU for data centers

Google Cloud Next 2024 has begun, and the company is kicking off the event with some big announcements, including its new Axion processor. It’s Google’s first Arm-based CPU designed specifically for data centers, built on Arm’s Neoverse V2 core.

According to Google, Axion performs 30 percent better than its fastest general-purpose Arm-based instances currently available in the cloud and 50 percent better than the most recent, comparable x86-based VMs. Google also claims it’s 60 percent more energy efficient than those same x86-based VMs. The company is already using Axion in services like Bigtable and Google Earth Engine, with plans to expand it to more services in the future.

The release of Axion could bring Google into direct competition with Amazon, which has led the field of Arm-based CPUs for data centers. Amazon’s cloud business, Amazon Web Services (AWS), released its first Graviton processor back in 2018 and followed it with second and third iterations in the years since. Fellow chip developer NVIDIA announced its first Arm-based data center CPU, Grace, in 2021, and companies like Ampere have also been making gains in the area.

Google has been developing its own processors for several years now, but they’ve been primarily focused on consumer products. The original Arm-based Tensor chip first shipped in the Pixel 6 and 6 Pro smartphones, which were released in late 2021. Subsequent Pixel phones have all been powered by updated versions of Tensor. Prior to that, Google developed the “Tensor Processing Unit” (TPU) for its data centers. The company started using TPUs internally in 2015, announced them publicly in 2016, and made them available to third parties in 2018.

Arm-based processors are often a lower-cost and more energy-efficient option. Google’s announcement came right after Arm CEO Rene Haas issued a warning about the energy usage of AI models, according to The Wall Street Journal. He called models such as ChatGPT “insatiable” in their need for electricity. “The more information they gather, the smarter they are, but the more information they gather to get smarter, the more power it takes,” Haas stated. “By the end of the decade, AI data centers could consume as much as 20 to 25 percent of US power requirements. Today that’s probably four percent or less. That’s hardly very sustainable, to be honest with you.” He stressed the need for greater efficiency in order to maintain the pace of breakthroughs.

This article contains affiliate links; if you click such a link and make a purchase, we may earn a commission.
