why the next big leap in a.i. isn't new math, but energy savings

We keep seeing artificial intelligence do more and more impressive things. But the most impressive thing it can do next is to fit in our hands.


While pundits and tech writers often talk about how we carry AI assistants and the knowledge of the entire world in our pockets, that's not really true. We carry devices that connect to vast server farms where the AI models and libraries we interact with actually live. And unless you're willing to spend tens of thousands of dollars and are ready to lug a server rack and a generator everywhere you go, that's where they'll remain for the near future. But a decade or two from now, this may change, and we'll see artificial intelligence do amazing things.

But wait a minute, how is that possible? Don't some of the most popular neural networks use billions of parameters and churn through gigabytes of data? Wouldn't that consume massive amounts of energy? Isn't that why NASA uses AI primarily by beaming down images from a destination, running the models on Earth, then sending back the outputs if necessary? Yes, all of the above is true, which is why engineers have been working on NeuRRAM, an architecture for a computer chip optimized to execute the kind of operations used by AI.

To be perfectly fair, NeuRRAM is not the first technology to try to bring AI to a specialized computer chip, and there are other architectures, known as FPGAs, trying to achieve the same results. But where the programmable logic in FPGAs is highly limited in how complex the AI models it runs can be and how many of them it can handle, NeuRRAM is more efficient, achieving results in line with Intel's flagship neuromorphic Loihi chips while using half the energy in the process, which may not sound like a big deal until you consider the full implications of this technology.
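To get a feel for what a compute-in-memory chip like NeuRRAM is actually doing, here's a minimal numerical sketch. The conductance values and layer sizes are made up for illustration and don't come from the paper; the idea they demonstrate is real, though: store a layer's weights as the conductances of an RRAM crossbar, apply inputs as voltages, and the physics of the circuit computes the matrix-vector product in place, with no shuttling of weights between memory and a processor.

```python
import numpy as np

# Toy sketch of analog compute-in-memory, the principle behind NeuRRAM.
# A neural-network layer's weights are stored as the conductances G of
# an RRAM crossbar (in siemens); inputs arrive as voltages V (in volts).
# By Ohm's law each cell passes a current I = G * V, and Kirchhoff's
# current law sums those currents down each column, so the crossbar
# performs the multiply-and-accumulate step "for free" in analog.

rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))  # crossbar conductances: a 3-input, 4-output layer
V = np.array([0.10, 0.20, 0.05])          # input voltages encoding one activation vector

# Column currents = weighted sums = the layer's pre-activations.
I = G @ V
print(I)
```

The key point of the sketch is the single line `I = G @ V`: on a conventional chip, that product means fetching every weight from memory, and those data transfers dominate the energy bill; in a crossbar, the weights never move.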

Humming alongside your typical cell phone CPU, it could run calculations right on the device that would otherwise require an internet connection. It may drain your battery a bit more at first, but it would have significant benefits when it comes to privacy and performance. Network lag wouldn't be an issue, and none of your data would need to be exposed to the world at large to run the computations. Reduce the energy requirements enough and robots on Earth and in space could take advantage of more powerful and sensitive models than ever before.

At this moment, there's no date by which you can expect NeuRRAM chips to be added to your phone or computer. Their designers are still trying to figure out how well the architecture can scale for mass production and want to drive its energy consumption down even further. But with demand for something very much like it growing every day from researchers, scientists, and consumer electronics companies, it seems like it's just a matter of time before NeuRRAM chips make their way into everyday devices, factory robots, and space probes.

See: Wan, W., et al. (2022). A compute-in-memory chip based on resistive random-access memory. Nature 608, 504–512. DOI: 10.1038/s41586-022-04992-8

  archived from wowt
# tech // artificial intelligence / computer chips / computer science
