Micron is first to mass-produce HBM3E memory, beating Samsung and SK

Lee Jae-lim · Feb. 27, 2024, 18:47

Micron Technology has begun mass-producing HBM3E chips, beating Samsung and SK hynix to the milestone. The chips will be integrated into Nvidia's top-of-the-line H200 GPU, set to ship in Q2 of 2024.
Samsung Electronics' fifth-generation HBM3E chips [SAMSUNG ELECTRONICS]

Micron Technology, the United States' leading producer of memory chips, has begun mass-producing high bandwidth memory 3E (HBM3E) for generative AI and high-performance computing, beating dominant Korean players Samsung Electronics and SK hynix to the milestone.

The firm unexpectedly announced Monday that its HBM3E chip will be integrated into Nvidia's top-of-the-line H200 GPU, which will begin shipping in the second quarter of 2024.

The Boise, Idaho-based latecomer is thus the first chipmaker to mass-produce chips built to the new HBM standard, an unanticipated feat given its modest share of the memory chip market.

Micron shares surged 4.02 percent on Tuesday while those of SK hynix plummeted by 4.94 percent to close at 153,800 won.

The move coincides with Samsung Electronics’ announcement of its successful development of HBM3E chips with the industry’s largest capacity of 36 gigabytes.

The Suwon, Gyeonggi-based company has already begun sending product samples to its clients, and the chips are slated for mass production in the first half of this year.

The HBM3E chips stack twelve 24-gigabit dynamic random access memory (DRAM) dies and deliver a peak memory bandwidth of 1.28 terabytes per second. Both capacity and bandwidth are 50 percent improvements over the predecessor, eight-stack HBM3.
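The capacity figures follow directly from the stack arithmetic; a minimal sketch, using the die count and per-die density reported in the article (the function name is illustrative):

```python
GBIT_PER_DIE = 24  # gigabits per DRAM die, per the article

def stack_capacity_gb(num_dies: int, gbit_per_die: int = GBIT_PER_DIE) -> float:
    """Total stack capacity in gigabytes (8 bits per byte)."""
    return num_dies * gbit_per_die / 8

hbm3e_12h = stack_capacity_gb(12)  # 12-stack HBM3E
hbm3_8h = stack_capacity_gb(8)     # 8-stack HBM3 baseline

print(f"HBM3E 12H capacity: {hbm3e_12h:.0f} GB")                       # 36 GB
print(f"Improvement over 8-stack HBM3: {hbm3e_12h / hbm3_8h - 1:.0%}")  # 50%
```

Twelve 24-gigabit dies give 288 gigabits, or 36 gigabytes, matching Samsung's "industry's largest capacity" claim, and the jump from eight to twelve dies is exactly the 50 percent capacity gain cited above.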

But the HBM3E chips are the same height as eight-layer ones, meeting current package requirements, which was made possible by applying advanced thermal compression nonconductive film (TC NCF) technology.

The chipmaker has also reduced the thickness of its NCF material, achieving the industry's smallest gap between chips at seven micrometers.

“The industry’s AI service providers are increasingly requiring HBM with higher capacity, and our new HBM3E 12H product has been designed to answer that need,” said Bae Yong-cheol, executive vice president of memory product planning at Samsung Electronics. “This new memory solution forms part of our drive toward developing core technologies for high-stack HBM and providing technological leadership for the high-capacity HBM market in the AI era.”

When applied in AI services, the latest chips will be able to increase the average speed of AI training by 34 percent compared to HBM3 products while expanding the number of simultaneous users of inference services by a factor of 11.5.

Meanwhile, SK hynix is on course to deliver HBM3E chips to Nvidia in the first half of the year as planned.

BY LEE JAE-LIM [lee.jaelim@joongang.co.kr]

Copyright © Korea JoongAng Daily. Unauthorized reproduction and redistribution prohibited.