Chip industry defies gloomy forecasts with AI chips, HBM boom
SK hynix Inc. announced on Thursday that it has started mass production of 5th-generation HBM3E 12-layer memory, which will be used in the latest AI chips. Samsung Electronics Co., the first to develop a 12-layer product, is also preparing for mass production by the end of 2024, and Nvidia Corp., a key player in AI chips, is set to begin mass production of its next-generation AI-specific chip, Blackwell, in the fourth quarter of the year. Micron Technology Inc. surprised the market on the same day with strong earnings, revealing that all its HBM products slated for production in 2024 and 2025 are already sold out.
Industry insiders are predicting a boom rather than a downturn. SK hynix's new HBM3E 12-layer product, whose 36GB capacity is the highest of any HBM currently available, is expected to be supplied to Nvidia later in 2024. SK hynix completed Nvidia's quality tests ahead of Samsung Electronics and Micron, which are still developing their own HBM3E 12-layer prototypes.
Mainstream AI chips currently use 4th-generation HBM3 and 5th-generation HBM3E 8-layer products, but the latest AI chips are expected to soon adopt the HBM3E 12-layer product as the standard. Nvidia, which dominates the global AI accelerator market, is reportedly planning to equip its Blackwell Ultra AI-specific chip, set to be released in the first half of 2025, with eight stacks of HBM3E 12-layer memory. HBM dramatically improves data processing speed by vertically stacking multiple DRAMs. The HBM3E 12-layer product stacks up to 12 DRAM dies of 3GB each while maintaining the same thickness as the 8-layer product, increasing capacity by 50 percent.
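The capacity figures above follow from simple arithmetic; a minimal sketch, assuming 3GB per DRAM die as the article states:

```python
# Capacity arithmetic behind the HBM3E figures quoted in the article.
# Assumption: 3 GB per stacked DRAM die (per the article's description).
GB_PER_DIE = 3

cap_8_layer = 8 * GB_PER_DIE    # 8-layer stack: 24 GB
cap_12_layer = 12 * GB_PER_DIE  # 12-layer stack: 36 GB, the cited capacity

# Relative increase from the 8-layer to the 12-layer product
increase = (cap_12_layer - cap_8_layer) / cap_8_layer

print(f"{cap_12_layer} GB, +{increase:.0%}")  # 36 GB, +50%
```

A chip carrying eight such stacks, as reported for Blackwell Ultra, would therefore total 8 × 36GB = 288GB of HBM.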
Samsung Electronics is also accelerating its development of high-value HBM technology and expanding production, and is expected to supply its HBM3E 12-layer product in the second half of 2024. Because HBM is a custom-made product designed to meet clients' performance and volume needs, it is less vulnerable to oversupply risks.
Micron's earnings surprise was driven by increased sales of DRAM and HBM for AI data center servers. The company reported fiscal fourth-quarter sales of $7.75 billion, a 93 percent increase from the same period a year earlier.
Copyright © 매일경제 (Maeil Business Newspaper) & mk.co.kr. Unauthorized reproduction, redistribution, and use for AI training prohibited.