Micron is first to mass-produce HBM3E memory, beating Samsung and SK hynix
Micron Technology, the United States' leading producer of memory chips, has begun mass-producing high bandwidth memory 3E (HBM3E) for generative AI and high-performance computing, beating dominant Korean players Samsung Electronics and SK hynix to the milestone.
The firm unexpectedly announced Monday that its HBM3E chip will be integrated into Nvidia's top-of-the-line H200 GPU, which will begin shipping in the second quarter of 2024.
The Boise, Idaho-based latecomer will become the first chipmaker to mass-produce the new HBM standard, an unanticipated feat given its modest market share in the memory chip segment.
Micron shares surged 4.02 percent on Tuesday while those of SK hynix plummeted by 4.94 percent to close at 153,800 won.
The move coincides with Samsung Electronics’ announcement of its successful development of HBM3E chips with the industry’s largest capacity of 36 gigabytes.
The Suwon, Gyeonggi-based company has already begun sending product samples to its clients, and the chips are slated for mass production by the first half of this year.
Samsung's HBM3E 12H chips stack twelve 24-gigabit dynamic random access memory (DRAM) dies, delivering a peak memory bandwidth of 1.28 terabytes per second. Both capacity and bandwidth are 50 percent higher than those of the predecessor, eight-stack HBM3.
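The capacity arithmetic behind those figures can be checked directly from the numbers quoted in the article; the sketch below assumes only the stated 24-gigabit die density and stack counts.

```python
# Capacity math for the 12-high HBM3E stack versus eight-stack HBM3,
# using only figures quoted in the article.
GBIT_PER_DIE = 24            # each DRAM die holds 24 gigabits

hbm3e_gb = 12 * GBIT_PER_DIE / 8   # twelve dies, gigabits -> gigabytes
hbm3_gb = 8 * GBIT_PER_DIE / 8     # eight-stack predecessor

print(hbm3e_gb)                    # 36.0 -- the stated 36 GB capacity
print(hbm3e_gb / hbm3_gb - 1)      # 0.5 -- the 50 percent capacity gain
```

The same 50 percent scaling applies to the quoted bandwidth step from HBM3 to the 1.28 terabytes per second of HBM3E.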
Despite the extra layers, the 12-stack chips are the same height as eight-layer ones to meet current package requirements, a feat enabled by advanced thermal compression non-conductive film (TC NCF) technology.
The chipmaker has also reduced the thickness of its NCF material, achieving the industry's smallest gap between chips at seven micrometers.
“The industry’s AI service providers are increasingly requiring HBM with higher capacity, and our new HBM3E 12H product has been designed to answer that need,” said Bae Yong-cheol, executive vice president of memory product planning at Samsung Electronics. “This new memory solution forms part of our drive toward developing core technologies for high-stack HBM and providing technological leadership for the high-capacity HBM market in the AI era.”
When applied to AI services, the latest chips can increase the average speed of AI training by 34 percent compared to HBM3 products while supporting 11.5 times as many simultaneous users of inference services.
Meanwhile, SK hynix is on course to deliver HBM3E chips to Nvidia in the first half of the year as planned.
BY LEE JAE-LIM [lee.jaelim@joongang.co.kr]
Copyright © Korea JoongAng Daily. Unauthorized reproduction and redistribution prohibited.