Nvidia finally turns to Samsung
Nvidia has never before used Samsung’s HBM memory in its GPUs developed for artificial intelligence. Now, however, its H20 GPUs will use Samsung’s fourth-generation high-bandwidth memory (HBM3) chips. It is still unclear whether Samsung’s HBM3 chips will also be used in Nvidia’s other AI accelerators.
Only SK Hynix, Micron and Samsung produce HBM memory. Nvidia currently sources it from SK Hynix and Micron, but demand driven by generative artificial intelligence is enormous and the companies are struggling to keep up. Samsung, on the other hand, has far larger production capacity than its competitors. In addition, SK Hynix, the leader in HBM memory, plans to shift its production to HBM3E, so the need for HBM3 supply will grow even further.
Samsung, the world’s largest memory chip manufacturer, has been trying to pass Nvidia’s HBM3 and HBM3E qualification tests since last year, but has struggled due to heat and power-consumption problems. Even if only for the H20, the approval of its HBM3 is valuable for Samsung, which is reportedly set to begin HBM3 supply in August. The H20, meanwhile, is Nvidia’s H100 adapted to comply with US export restrictions on sales to China. On paper, the H20 is roughly 6.7 times slower (295 TFLOPS vs. 1,979 TFLOPS).
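As a quick sanity check of the quoted speed gap, the ratio of the two peak-compute figures cited above can be computed directly (the TFLOPS numbers are the ones from this article; which precision they refer to is not stated here):

```python
# Rough check of the H100 vs. H20 speed gap using the figures quoted in the article.
h100_tflops = 1979  # H100 peak throughput (per article)
h20_tflops = 295    # H20 peak throughput (per article)

ratio = h100_tflops / h20_tflops
print(f"H100 is roughly {ratio:.1f}x faster than H20")
```

This yields a factor of about 6.7, consistent with the "approximately 6 times slower" claim.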
Source link: https://www.donanimhaber.com/nvidia-samsung-un-hbm3-belleklerini-h20-gpu-sunda-kullanacak–179822