Micron begins volume production of new chip for AI workloads


San Francisco, Feb 26 (IANS) Semiconductor leader Micron Technology on Monday announced it has begun volume production of its HBM3E (high bandwidth memory 3E) solution, which will help reduce data centre operating costs by consuming about 30 per cent less power than rival offerings.

The 24GB 8-high HBM3E will be part of Nvidia's H200 Tensor Core GPUs, which will begin shipping in the second calendar quarter of 2024, the US-based company said in a statement.

“AI workloads are heavily reliant on memory bandwidth and capacity, and Micron is very well-positioned to support the significant AI growth ahead through our industry-leading HBM3E and HBM4 roadmap, as well as our full portfolio of DRAM and NAND solutions for AI applications,” said Sumit Sadana, EVP and chief business officer at Micron Technology.

With a pin speed greater than 9.2 gigabits per second (Gb/s), the new Micron solution delivers more than 1.2 terabytes per second (TB/s) of memory bandwidth, enabling lightning-fast data access for AI accelerators, supercomputers, and data centres, the company said.
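As a rough sanity check of those figures (a minimal sketch, not taken from Micron's statement, and assuming the standard 1024-bit HBM interface width per stack), the aggregate bandwidth follows directly from the per-pin rate:

    # Back-of-the-envelope check of the quoted bandwidth figures.
    # BUS_WIDTH_BITS is an assumption (standard 1024-bit HBM interface
    # per stack), not a figure given in the article.
    PIN_RATE_GBPS = 9.2      # per-pin data rate, gigabits per second
    BUS_WIDTH_BITS = 1024    # assumed interface width per stack

    bandwidth_gbits = PIN_RATE_GBPS * BUS_WIDTH_BITS   # gigabits per second
    bandwidth_tbs = bandwidth_gbits / 8 / 1000         # terabytes per second

    print(f"~{bandwidth_tbs:.2f} TB/s per stack")      # ~1.18 TB/s at exactly 9.2 Gb/s

At exactly 9.2 Gb/s this works out to roughly 1.18 TB/s per stack, so the "greater than 9.2 Gb/s" pin speed is consistent with the "more than 1.2 TB/s" bandwidth figure quoted above.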

The solution leads the industry with 30 per cent lower power consumption compared to competitive offerings.

Micron is also extending its leadership with the sampling of its 36GB 12-high HBM3E in March this year, which is set to deliver greater than 1.2 TB/s of bandwidth and superior energy efficiency compared to competitive solutions.

–IANS
