Google announces Axion, its first Arm-based CPU for data centers

Google Cloud Next 2024 has begun, and the company is kicking off the event with some big announcements, including its new Axion processor. It’s Google’s first Arm-based CPU built specifically for data centers, and it’s designed around Arm’s Neoverse V2 CPU.

According to Google, Axion performs 30 percent better than the fastest general-purpose Arm-based instances currently available in the cloud and 50 percent better than the most recent, comparable x86-based VMs. Google also claims it’s 60 percent more energy efficient than those same x86-based VMs. The company is already using Axion in services like Bigtable and Google Earth Engine, and plans to expand it to more in the future.

The release of Axion could bring Google into competition with Amazon, which has led the field of Arm-based CPUs for data centers. Amazon’s cloud business, Amazon Web Services (AWS), released its Graviton processor back in 2018 and followed with second and third iterations in the years after. Fellow chip developer NVIDIA released its first Arm-based data center CPU, Grace, in 2021, and companies like Ampere have also been making gains in the area.

Google has been developing its own processors for several years now, but those efforts have primarily focused on consumer products. The original Arm-based Tensor chip first shipped in the Pixel 6 and 6 Pro smartphones, which were released in late 2021. Subsequent Pixel phones have all been powered by updated versions of the Tensor. Prior to that, Google developed the Tensor Processing Unit (TPU) for its data centers. The company started using TPUs internally in 2015, announced them publicly in 2016, and made them available to third parties in 2018.

Arm-based processors are often a lower-cost and more energy-efficient option. Google’s announcement came right after Arm CEO Rene Haas issued a warning about the energy usage of AI models, according to the Wall Street Journal. He called models such as ChatGPT “insatiable” in their need for electricity. “The more information they gather, the smarter they are, but the more information they gather to get smarter, the more power it takes,” Haas stated. “By the end of the decade, AI data centers could consume as much as 20 percent to 25 percent of US power requirements. Today that’s probably four percent or less. That’s hardly very sustainable, to be honest with you.” He stressed the need for greater efficiency in order to maintain the pace of breakthroughs.


