As artificial intelligence usage explodes, so does its environmental footprint. A recent KARE 11 feature highlights how a standard ChatGPT request consumes roughly ten times as much electricity as a typical Google search, with experts predicting that AI data centers could soon consume as much electricity as the entire country of Japan. Addressing this critical challenge, University of Minnesota researchers led by Jian-Ping Wang are developing a revolutionary solution: computational random access memory (CRAM).
The core of the problem lies in the traditional "von Neumann" computer architecture, in which data must shuttle back and forth between separate memory and processing units, and that constant movement wastes vast amounts of energy. Wang, alongside associate professor Ulya Karpuzcu, is pioneering a way to process data directly within the memory itself. Built on magnetic tunnel junctions, nanoscale spintronic devices, this "in-memory" computing approach eliminates those energy-hungry data transfers, potentially cutting energy consumption by a factor of more than 1,000 for certain applications.
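To make the scale of that savings concrete, here is a minimal back-of-the-envelope sketch in Python. All of the per-operation energy figures are illustrative assumptions, not measurements from the U of M work; the point is only that when moving data costs far more than computing on it, removing the movement changes the total by orders of magnitude.

```python
# Toy energy model: von Neumann vs. in-memory computing.
# All numbers below are illustrative assumptions, NOT measured values
# from the CRAM research; they exist only to show how data movement
# can dominate the energy budget.

COMPUTE_ENERGY_PJ = 1.0      # assumed energy per arithmetic op (picojoules)
TRANSFER_ENERGY_PJ = 1000.0  # assumed energy to move one operand between
                             # memory and processor (often far costlier
                             # than the computation itself)

def von_neumann_energy(num_ops: int, operands_per_op: int = 2) -> float:
    """Energy when every operand must be shuttled to the processor."""
    return num_ops * (COMPUTE_ENERGY_PJ + operands_per_op * TRANSFER_ENERGY_PJ)

def in_memory_energy(num_ops: int) -> float:
    """Energy when operations happen inside the memory array itself,
    so the per-operand transfer cost disappears."""
    return num_ops * COMPUTE_ENERGY_PJ

if __name__ == "__main__":
    ops = 1_000_000  # e.g., a million multiply-accumulates in an AI workload
    vn = von_neumann_energy(ops)
    im = in_memory_energy(ops)
    print(f"von Neumann: {vn / 1e6:,.1f} microjoules")
    print(f"in-memory:   {im / 1e6:,.1f} microjoules")
    print(f"reduction factor: {vn / im:,.0f}x")
```

Under these assumed numbers the transfer term alone produces a roughly 2,000x gap, the same order of magnitude as the >1,000x figure the researchers cite for certain applications.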
This breakthrough technology offers a sustainable path forward for the booming AI industry. CRAM was originally a long-term research goal, but urgent market demand for energy-efficient computing has accelerated the timeline: Wang estimates the technology could see real-world application within the next five years. Supported by industry leaders like Intel, the innovation positions the University of Minnesota at the forefront of solving one of the tech world's most pressing sustainability challenges.
Watch the full story on KARE 11 to see inside the lab and learn more about this research.