SK chief vows to tackle AI bottleneck along with Nvidia, TSMC
SK Group Chairman Chey Tae-won renewed his commitment to building a robust artificial intelligence ecosystem and vowed to overcome bottlenecks in the process with AI pioneers and key partners Nvidia and TSMC.
Speaking at SK AI Summit 2024, held in Seoul on Monday, Chey listed a number of difficulties the burgeoning AI industry faces and reaffirmed his ties with tech leaders, including Nvidia CEO Jensen Huang, TSMC CEO C.C. Wei and Microsoft CEO Satya Nadella. In prerecorded videos, the CEOs delivered their congratulatory messages, emphasizing the role of SK hynix in their AI endeavors.
“The last time I met with Huang, he asked me if we could supply HBM4 six months earlier than the date we have agreed upon,” Chey said, opening the two-day event in Seoul. “I asked the SK hynix CEO whether it's possible, and he said he will try, so we are working to move the date up by six months."
SK hynix, the chipmaking affiliate of SK Group, dominates the global market for high bandwidth memory chips, the cutting-edge components used to boost the performance of graphics processing units in AI applications. As the major HBM supplier to Nvidia, the world's top GPU maker, SK hynix posted a record quarterly operating profit of 7.03 trillion won ($5.13 billion) in the July-September period this year.
In a prerecorded video message, Nvidia CEO Jensen Huang highlighted the importance of SK hynix's HBM chips, calling them the enabler of "super Moore's law." Moore's law holds that the number of transistors on a single chip doubles roughly every two years, and it is often used to describe the rapid pace of advances in the chip world.
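For context, an illustrative sketch (not from the article, with assumed notation): the doubling rule behind Moore's law can be written as a simple growth relation, where N_0 is the transistor count in a baseline year and t is the number of years elapsed:

N(t) \approx N_0 \cdot 2^{t/2}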
“When we moved from coding to machine learning, it changed the computer architecture fairly profoundly. And the work that we did with HBM memories has really made it possible for us to achieve what appears to be super Moore’s law,” Huang said in the prerecorded interview conducted by David Patterson, a renowned computer scientist.
“We wish that we got more bandwidth with lower energy. So the roadmap that SK Hynix is on is super aggressive and is super necessary.”
To strengthen ties with the global tech giants on the AI front, Chey has established a task force dedicated to AI within SK Group.
Chey then emphasized the conglomerate's collaboration with TSMC, the world's largest foundry, which produces Nvidia's cutting-edge GPUs. In response, TSMC CEO C.C. Wei delivered his congratulatory remarks via video.
"SK hynix has been at the forefront of delivering cutting-edge HBM technologies, and its dedication to innovation has significantly contributed to shaping the future of AI," C.C. Wei said.
"High-bandwidth, low-power memory solutions have transformed the AI workload, pushing the boundaries of what AI systems can achieve."
Describing TSMC as a company that genuinely cares about its customers, Chey recalled a conversation with TSMC founder Morris Chang, in which Chang supported his decision to acquire Hynix. SK Group acquired the memory chipmaker in 2012.
“I have known Morris Chang for 20 years. When I decided to take over Hynix, I had the chance to speak with him. He was very supportive, encouraging me by saying the chip industry has a bright future,” Chey said. “He welcomed me into the industry, and has stayed in touch as a valuable partner.”
At the conference, the SK Group chairman identified several bottlenecks that need to be addressed for AI to continue its growth: the lack of "killer use cases" and revenue models to recover AI investments; a shortage of AI accelerators and semiconductor supply; limited capacity in advanced manufacturing facilities; challenges in securing sufficient energy for AI infrastructure; and the need for high-quality data.
Emphasizing the massive amount of energy required to support growing AI applications, Chey also addressed the need to secure energy sources, pointing to SK's investments in gas turbines with carbon capture technology and small modular reactors as independent power sources.
In a separate session, SK hynix CEO Kwak Noh-jung confirmed the chipmaker is preparing to mass produce the industry’s most up-to-date HBM chip, a 16-stack HBM3E with a capacity of 48 gigabytes.
In a simulation, SK hynix found that its envisioned 16-stack HBM3E chip would improve AI training performance by 18 percent and AI inference performance by 32 percent compared with the existing 12-stack model, Kwak said.
SK hynix has been outpacing its crosstown rival Samsung Electronics in the AI chip market with its HBM chips.
When asked about SK hynix possibly beating Samsung in yearly earnings for the first time this year, the SK Group chairman cautiously noted that companies take different approaches to AI chips, and said Samsung will "do better" once it rides the AI wave.
"Samsung has much more in terms of technologies and resources. I am sure the company will be able to make much better results (than now) as it rides on the wave of AI," Chey told reporters after his opening speech.
By Jo He-rim (herim@heraldcorp.com)
Copyright © The Korea Herald. Unauthorized reproduction and redistribution prohibited.