Researchers just solved AI’s biggest conundrum


The large language models that power today’s chatbots like ChatGPT, Gemini, and Claude are immensely powerful generative AI systems, and immensely power-hungry ones to boot.

They apparently don’t need to be. Recent research out of the University of California, Santa Cruz has shown that modern LLMs with billions of parameters can operate on just 13 watts of power without a loss in performance. That’s roughly the draw of a modern LED light bulb, and more than a 50x improvement over the 700W that an Nvidia H100 GPU consumes.

“We got the same performance at way less cost — all we had to do was fundamentally change how neural networks work,” said Jason Eshraghian, the paper’s lead author. “Then we took it a step further and built custom hardware.” They did so by doing away with the neural network’s matrix multiplication.

Matrix multiplication is a cornerstone of the algorithms that power today’s LLMs. Words are represented as vectors of numbers that are organized into matrices, which are weighted and multiplied against one another to produce language outputs reflecting the importance of certain words and their relationships to other words in the sentence or paragraph.
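To make that concrete, here is a minimal, illustrative sketch (not the researchers’ code) of scaled dot-product attention, the matrix-multiplication workhorse inside transformer LLMs. The toy vectors stand in for learned word representations; nearly every line of real work here is a matrix multiply.

```python
# An illustrative sketch (not the researchers' code): scaled dot-product
# attention, the matrix-multiplication workhorse inside transformer LLMs.
import numpy as np

def attention(Q, K, V):
    """Re-weight each token's vector by its similarity to every other token."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])          # matrix multiply #1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over tokens
    return weights @ V                               # matrix multiply #2

rng = np.random.default_rng(0)
tokens = rng.standard_normal((4, 8))  # 4 toy "words", 8 numbers each
out = attention(tokens, tokens, tokens)
print(out.shape)  # (4, 8): every word blended with its neighbors
```

Each token’s output is a blend of all the other tokens’ vectors, weighted by those multiplied-out similarity scores; that is exactly the workload the UC Santa Cruz team set out to eliminate.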

These matrices are stored across hundreds of physically separate GPUs and fetched with each new query or operation. Shuttling the data to be multiplied among that multitude of matrices costs a significant amount of electrical power, and therefore money.

To get around that issue, the UC Santa Cruz team forced the numbers within the matrices into a ternary state: every single number carried a value of either negative one, zero, or positive one. That lets the processors simply sum the numbers instead of multiplying them, a tweak that doesn’t hurt the algorithm’s performance but saves a huge amount of hardware cost. To maintain performance despite the reduction in the number of operations, the team introduced time-based computation to the system, effectively giving the network a “memory” that increased the speed at which it could process the reduced set of operations.
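Here is a hedged sketch of why ternary weights make multiplication unnecessary. This shows only the accumulation idea, not the team’s actual implementation (which also includes the time-based memory component): a weight of +1 adds its input, -1 subtracts it, and 0 skips it entirely.

```python
# A hedged sketch, not the paper's code: with weights constrained to
# {-1, 0, +1}, a dot product needs no multiplier at all. Each weight
# either adds its input, subtracts it, or skips it.
import numpy as np

def ternary_matvec(W, x):
    """Matrix-vector product over ternary weights using only add/subtract."""
    out = np.zeros(W.shape[0])
    for i, row in enumerate(W):
        for w, xi in zip(row, x):
            if w == 1:
                out[i] += xi   # +1: accumulate the input
            elif w == -1:
                out[i] -= xi   # -1: subtract the input
            # w == 0: skip entirely, no work performed
    return out

W = np.array([[1, 0, -1],
              [-1, 1, 0]])     # toy ternary weight matrix
x = np.array([2.0, 3.0, 4.0])  # toy activations
print(ternary_matvec(W, x))    # [-2.  1.]
print(W @ x)                   # same result from a full multiply: [-2.  1.]
```

In silicon, those three cases reduce to an adder and a sign flip, which costs far less chip area and energy than a full multiplier circuit.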

“From a circuit designer standpoint, you don’t need the overhead of multiplication, which carries a whole heap of cost,” Eshraghian said. And while the team implemented its new network on custom FPGA hardware, the researchers remain confident that many of the efficiency improvements can be retrofitted to existing models using open-source software and minor hardware tweaks. Even on standard GPUs, the team saw a 10x reduction in memory consumption while improving operational speed by 25%.

With chip manufacturers like Nvidia and AMD continually pushing the boundaries of GPU performance, electrical demands (and the associated financial costs) for the data centers housing these systems have soared in recent years. The increase in computing power brings a commensurate increase in waste heat from the chips, which now requires resource-intensive liquid cooling systems to fully dissipate.

Arm CEO Rene Haas warned The Register in April that AI data centers could consume as much as 20-25% of the entire U.S. electrical output by the end of the decade if corrective measures are not taken, and quickly.
