Google announces Axion, its first Arm-based CPU for data centers

Google Cloud Next 2024 has begun, and the company is kicking off the event with some big announcements, including its new Axion processor. It’s Google’s first Arm-based CPU designed specifically for data centers, built on Arm’s Neoverse V2 platform.

According to Google, Axion delivers 30 percent better performance than the fastest general-purpose Arm-based instances available in the cloud today and 50 percent better performance than the most recent, comparable x86-based VMs. The company also claims it’s 60 percent more energy efficient than those same x86-based VMs. Google is already using Axion in services like BigTable and Google Earth Engine, and plans to expand it to more in the future.

The release of Axion puts Google in competition with Amazon, which has led the field of Arm-based CPUs for data centers. The company’s cloud business, Amazon Web Services (AWS), released its first Graviton processor back in 2018 and has shipped second and third iterations in the years since. Fellow chip developer NVIDIA announced its first Arm-based CPU for data centers, named Grace, in 2021, and companies like Ampere have also been making gains in the area.

Google has been developing its own processors for several years now, but they’ve primarily been focused on consumer products. The original Arm-based Tensor chip first shipped in the Pixel 6 and 6 Pro smartphones, which were released in late 2021, and subsequent Pixel phones have all been powered by updated versions of the Tensor. Prior to that, Google developed the Tensor Processing Unit (TPU) for its data centers: the company began using them internally in 2015, announced them publicly in 2016, and made them available to third parties in 2018.

Arm-based processors are often a lower-cost and more energy-efficient option. Google’s announcement came shortly after Arm CEO Rene Haas issued a warning about the energy usage of AI models, according to the Wall Street Journal. He called models such as ChatGPT “insatiable” in their need for electricity. “The more information they gather, the smarter they are, but the more information they gather to get smarter, the more power it takes,” Haas stated. “By the end of the decade, AI data centers could consume as much as 20 to 25 percent of US power requirements. Today that’s probably four percent or less. That’s hardly very sustainable, to be honest with you.” He stressed the need for greater efficiency in order to maintain the pace of breakthroughs.

This article contains affiliate links; if you click such a link and make a purchase, we may earn a commission.
