SK hynix begins HBM3E verification with Nvidia

Jin Eun-soo | Aug. 21, 2023, 5:57 p.m.

SK hynix' HBM3E memory chip [SK HYNIX]

SK hynix has started the verification process for its high-performance HBM3E memory chip with Nvidia, the world’s top AI chip designer.

The Korean memory chipmaker said Monday it started shipping out samples of its HBM3E, an extended version of the HBM3, to its clients including Nvidia.

HBM, short for high bandwidth memory, is a type of high-performance memory chip that stacks dynamic random-access memory (DRAM) dies for quicker, more energy-efficient data processing, which is becoming increasingly essential as demand for AI surges.

"On the back of our experience in supplying the largest amount of HBM and our maturity in mass production, [SK hynix] aims to begin mass producing HBM3E in the first half of next year and secure an unrivaled position in the memory chip industry for AI," the Korean chipmaker said in a release Monday.

"The latest HBM3E qualifies not only for speed, which is a prerequisite for AI memory chips, but also for heat control and user convenience."

The HBM3E can process at least 1.15 terabytes of data per second, according to the company, equivalent to downloading 230 Full-HD films in a single second.
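The two figures are consistent if a Full-HD film is taken to be about 5 GB, an assumption inferred here from the company's own numbers rather than stated in the article. A quick back-of-envelope check:

```python
# Back-of-envelope check of the bandwidth claim: does 1.15 TB/s equal
# roughly 230 Full-HD films per second?
# The ~5 GB-per-film figure is an assumption inferred from the article's numbers.
BANDWIDTH_BYTES_PER_SEC = 1.15e12  # 1.15 TB/s (decimal terabytes)
FILM_SIZE_BYTES = 5e9              # assumed size of one Full-HD film, ~5 GB

films_per_second = BANDWIDTH_BYTES_PER_SEC / FILM_SIZE_BYTES
print(films_per_second)  # 230.0
```

At 5 GB per film, the stated bandwidth works out to exactly 230 films per second, matching the company's comparison.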

The latest version of the high-performance chip uses the so-called MR-MUF (Mass Reflow Molded Underfill) technology, which improves heat dissipation by 10 percent compared with the previous version.

It is compatible with the previous HBM system without needing additional adjustments, the company said.

SK hynix's partnership with Nvidia continues in the HBM segment.

“We have a long history of working with SK hynix on High Bandwidth Memory for leading-edge accelerated computing solutions,” Nvidia Vice President of Hyperscale and HPC Computing Ian Buck said.

“We look forward to continuing our collaboration with HBM3E to deliver the next generation of AI computing.”

SK hynix is a leader in the high-end DRAM market, accounting for 53 percent of the market according to TrendForce, followed by Samsung Electronics at 38 percent and Micron at 9 percent.

The Korean memory chipmaker is reportedly the sole supplier of HBM chips for Nvidia's flagship H100 AI chip and is unwilling to cede its share to rivals, namely Samsung Electronics.

During its second-quarter conference call, SK hynix said it will prioritize HBM investment next year and double its production.

BY JIN EUN-SOO [jin.eunsoo@joongang.co.kr]

Copyright © Korea JoongAng Daily. Unauthorized reproduction and redistribution prohibited.
