High-bandwidth memory (HBM) has become the memory of choice for artificial intelligence, and as HBM3 goes into volume production, more attention is being paid to power consumption in the wake of 2023’s generative AI boom.
AI’s growing appetite for memory bandwidth translates directly into demand for higher HBM bandwidth. Performance needs, memory bandwidth and memory sizes are all growing exponentially, putting higher expectations and pressure on the next generation of HBM.
While bandwidth per watt is not a particularly new consideration for HBM, he said, energy consumption by data centers has been on the rise. Those rising energy costs mean bandwidth per watt is becoming a more important metric for enterprises that need to keep operational costs in check, even more so given the growing focus on sustainability initiatives.
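As a back-of-the-envelope illustration of why this metric matters, the sketch below computes bandwidth per watt for two hypothetical HBM stacks. The figures are invented for the example and are not vendor specifications.

```python
# Hypothetical illustration of bandwidth per watt as a comparison metric.
# All numbers below are made-up example values, not real product specs.

def bandwidth_per_watt(bandwidth_gbps: float, power_w: float) -> float:
    """Return delivered bandwidth (GB/s) per watt of memory power."""
    return bandwidth_gbps / power_w

# Comparing two hypothetical HBM stacks.
stack_a = bandwidth_per_watt(bandwidth_gbps=819.0, power_w=7.0)    # ~117 GB/s per watt
stack_b = bandwidth_per_watt(bandwidth_gbps=1024.0, power_w=10.0)  # ~102 GB/s per watt

print(f"Stack A: {stack_a:.1f} GB/s per watt")
print(f"Stack B: {stack_b:.1f} GB/s per watt")
```

In this toy comparison, the stack with the higher raw bandwidth is not the more efficient one, which is the kind of trade-off that operational cost and sustainability pressures push buyers to weigh.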
The high costs associated with HBM, including the price tag of the memory itself, mean that total cost of ownership becomes the deciding factor when determining whether this uber-powerful memory is necessary for a given application. For customers, the process of deciding which memory they need starts with technical requirements such as density, performance and power.
Read my full story for EE Times.
Gary Hilson is a freelance writer with a focus on B2B technology, including information technology, cybersecurity, and semiconductors.