Micron begins mass production of HBM3E memory
Produced on Micron’s 1β (1-beta) process, the 24GB 8-high HBM3E memory will be part of Nvidia’s “H200” Tensor Core GPUs, ending the sole-supplier position SK Hynix has held since the H100. With Micron’s new memory, the H200 will offer data transfer rates of 9.2 GT/s, more than 1.2 TB/s of memory bandwidth per stack, and 141GB of memory capacity. Compared to HBM3, that works out to roughly a 44 percent increase in bandwidth.
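The quoted figures are easy to sanity-check. Assuming the standard 1024-bit HBM interface per stack and HBM3’s 6.4 GT/s per-pin data rate as the baseline (both assumptions, not stated in the article), a quick calculation reproduces both the ~1.2 TB/s bandwidth and the ~44 percent uplift:

```python
# Sanity-check of the article's HBM3E figures.
# Assumptions (not from the article): a 1024-bit bus per HBM stack
# and a 6.4 GT/s per-pin data rate for baseline HBM3.

HBM_BUS_WIDTH_BITS = 1024   # pins per HBM stack (JEDEC standard width)
HBM3E_RATE_GTS = 9.2        # per-pin rate quoted for Micron's HBM3E
HBM3_RATE_GTS = 6.4         # assumed baseline HBM3 per-pin rate

# Per-stack bandwidth in GB/s: (Gb/s per pin) * pins / (8 bits per byte)
per_stack_gbs = HBM3E_RATE_GTS * HBM_BUS_WIDTH_BITS / 8
print(f"Per-stack bandwidth: {per_stack_gbs:.1f} GB/s")  # 1177.6 GB/s, i.e. >1.2 TB/s

# Bandwidth uplift over HBM3 at the same bus width
uplift_pct = (HBM3E_RATE_GTS / HBM3_RATE_GTS - 1) * 100
print(f"Uplift over HBM3: {uplift_pct:.2f}%")  # 43.75%, matching the quoted ~44%
```

The 43.75 percent result rounds to the 44 percent figure in the article, and the per-stack number matches Micron’s “more than 1.2 TB/s” claim.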
Micron’s AI memory roadmap is set to strengthen further with the sampling of 36GB 12-high HBM3E in March 2024. Competition remains fierce, however: Samsung has already begun sampling its own 12-stack 36GB HBM3E memory to customers.
Source link: https://www.donanimhaber.com/micron-nvidia-nin-yen-ai-gpu-lari-icin-hbm3e-uretimine-basladi–174658