Chip industry defies gloomy forecasts with AI chips, HBM boom

Sept. 27, 2024, 15:15

SK hynix showcases its HBM3E 12-layer product alongside Nvidia’s H200 at the OIP forum hosted by Taiwan’s TSMC. On Sept. 26, SK hynix announced its plan to supply Nvidia with the world’s first mass-produced HBM3E 12-layer product. (SK hynix)
The global semiconductor industry is thriving, thanks to the intensifying competition for next-generation AI technology alongside the sustained demand for high-bandwidth memory (HBM). This is a stark contrast to Morgan Stanley’s recent gloomy forecast for the sector.

SK hynix Inc. announced on Thursday that it has started mass production of the 5th-generation HBM3E 12-layer memory, which will be used in the latest AI chips. Samsung Electronics Co., the first to develop the 12-layer memory, is also preparing for mass production by the end of 2024, and Nvidia Corp., a key player in AI chips, is set to begin mass production of its next-generation AI-specific chip, Blackwell, in the fourth quarter of the year. Micron Technology Inc. surprised the market on the same day with strong earnings, revealing that all of its HBM products slated for production in 2024 and 2025 are already sold out.

Industry insiders are predicting a boom rather than a downturn. SK hynix’s new HBM3E 12-layer product, whose 36GB capacity is the highest of any HBM currently available, is expected to be supplied to Nvidia later in 2024. SK hynix completed Nvidia’s quality tests ahead of Samsung Electronics and Micron, which are still developing their HBM3E 12-layer prototypes.

Mainstream AI chips currently use 4th-generation HBM3 and 5th-generation HBM3E 8-layer products, but the latest AI chips are expected to soon adopt the HBM3E 12-layer product as the standard. Nvidia, which dominates the global AI accelerator market, is reportedly planning to equip its Blackwell Ultra AI-specific chip, set for release in the first half of 2025, with eight stacks of HBM3E 12-layer memory. HBM dramatically improves data processing speed by vertically stacking multiple DRAMs; the HBM3E 12-layer product stacks up to 12 DRAM dies of 3GB each while maintaining the same thickness as the 8-layer product, increasing capacity by 50 percent.
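
The capacity arithmetic follows directly from the figures above; here is a minimal Python sketch of it (the 3GB-per-die and eight-stack figures are those reported in this article, not a vendor datasheet):

# Back-of-the-envelope HBM stack capacity, using the per-die figure
# cited in this article (3GB per DRAM die).

def stack_capacity_gb(layers: int, gb_per_die: int = 3) -> int:
    """Total capacity of one HBM stack of vertically stacked DRAM dies."""
    return layers * gb_per_die

cap_8 = stack_capacity_gb(8)     # 24GB per 8-layer stack
cap_12 = stack_capacity_gb(12)   # 36GB per 12-layer stack
print(f"8-layer: {cap_8}GB, 12-layer: {cap_12}GB "
      f"(+{(cap_12 - cap_8) / cap_8:.0%})")  # prints +50%

# Blackwell Ultra is reported to carry eight 12-layer stacks:
print(f"8 stacks x {cap_12}GB = {8 * cap_12}GB of HBM per chip")  # 288GB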

Samsung Electronics, which is expected to supply its HBM3E 12-layer product in the second half of 2024, is also accelerating its development of high-value HBM technology and expanding production. Because HBM is a custom-made product designed to meet clients’ performance and volume needs, it is less vulnerable to oversupply risks.

Micron’s earnings surprise was driven by increased sales of DRAM and HBM for AI data center servers. The company reported fiscal fourth-quarter sales of $7.75 billion, a 93 percent increase from the same period a year earlier.

Copyright © 매일경제 & mk.co.kr. Unauthorized reproduction, redistribution, and use for AI training are prohibited.
