Samsung, SK hynix jostle to lead AI memory at Semicon Taiwan

September 5, 2024, 10:51

At the Semicon Taiwan 2024 opening ceremony on September 4, Samsung’s Lee Jung-bae and SK hynix’s Kim Ju-seon delivered keynote speeches. [Photo by Yonhap]
South Korean chipmakers Samsung Electronics Co. and SK hynix Inc. competed for leadership in artificial intelligence (AI) memory technology by showcasing their respective innovative technologies at Semicon Taiwan 2024 in Taipei on Wednesday.

Lee Jung-bae, president of Samsung’s memory business, said in his keynote speech that growing power consumption in the AI era is limiting AI memory performance and capacity, and that Samsung is developing solutions specialized for on-device AI that offer high performance with low power consumption.

“Relying solely on memory processing has limitations in improving high-bandwidth memory (HBM) performance, and Samsung is in a very strong position as it also owns its own foundry and system LSI business unit,” he asserted.

Lee also underlined Samsung’s strength as a comprehensive semiconductor company that provides “turnkey” solutions.

“Combining memory processing with logic technology is essential to improving HBM performance,” he noted. “We now offer customized logic integration in HBM, allowing clients to meet specific requirements via turnkey solutions and intellectual property offerings.”

Samsung also unveiled its roadmap for double data rate (DDR) memory, announcing plans to introduce sub-10-nanometer nodes by 2027, ahead of industry expectations that had projected such advances by 2030. The company plans to launch 10nm-class 1d products in 2026 and single-digit-nanometer (0a) products in 2027.

For its part, SK hynix emphasized its strategic partnerships with Taiwan Semiconductor Manufacturing Co. (TSMC) and Nvidia Corp.

Kim Ju-seon, president of AI Infrastructure at SK hynix, said in his keynote speech that the company will begin mass production of its 12-layer HBM3E at the end of September 2024 and is preparing to introduce HBM4, which will incorporate logic technology, in collaboration with TSMC.

“To reach artificial general intelligence (AGI), we need to address the challenges of power consumption, heat dissipation, and memory bandwidth,” Kim said. “SK hynix is developing AI memory that minimizes power use and heat generation while maintaining high capacity and performance.”

He also highlighted SK hynix’s efforts to supply products such as dual in-line memory modules (DIMMs), quad-level cell (QLC)-based enterprise solid-state drives (eSSDs), and low-power DRAM (LPDDR5T) to the market.

“SK hynix is the only company mass-producing QLC-based eSSDs, and we plan to introduce a 120TB model moving forward,” he said.

This is the first time that the heads of both Samsung Electronics and SK hynix have shared the stage as keynote speakers at Semicon Taiwan, underscoring the intense competition between the two companies in the AI memory sector.

Copyright © Maeil Business Newspaper & mk.co.kr. Unauthorized reproduction, redistribution, and use for AI training prohibited.