Samsung develops fastest DRAM chip optimized for on-device AI
Samsung Electronics, the world's top memory chip maker by revenue, has developed the industry's fastest LPDDR5X memory chip optimized for on-device artificial intelligence applications, the company said Wednesday.
The new Low Power Double Data Rate 5X chip delivers the industry's highest performance of up to 10.7 gigabits per second and is the smallest among existing products, leveraging 12-nanometer-class process technology, the company said. The speed is equivalent to transferring about twenty 4-gigabyte full-HD movie files in one second.
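The movie-transfer figure can be sanity-checked with back-of-the-envelope arithmetic. The numbers below assume, as the article does not spell out, that 10.7 Gbps is a per-pin data rate and that a typical mobile LPDDR package uses a 64-bit bus:

```python
# Sketch: sanity-check the "about 20 full-HD movies per second" claim.
# Assumptions not stated in the article: 10.7 Gbps is the per-pin rate,
# and the package has a 64-bit (x64) bus, common for mobile LPDDR.

GBPS_PER_PIN = 10.7     # gigabits per second, per pin (from the article)
BUS_WIDTH_BITS = 64     # assumed package bus width
MOVIE_SIZE_GB = 4       # gigabytes per full-HD movie (from the article)

# Aggregate bandwidth in gigabytes per second: bits -> bytes via /8.
bandwidth_gb_per_s = GBPS_PER_PIN * BUS_WIDTH_BITS / 8
movies_per_second = bandwidth_gb_per_s / MOVIE_SIZE_GB

print(f"{bandwidth_gb_per_s:.1f} GB/s, about {movies_per_second:.0f} movies/s")
```

Under these assumptions the package moves roughly 85.6 GB/s, or around 21 such movies per second, which matches the article's "about 20" figure.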
"As demand for low-power, high-performance memory increases, LPDDR DRAM is expected to expand its applications from mainly mobile to other areas that traditionally require higher performance and reliability, such as PCs, accelerators, servers and automobiles," said Bae Yong-cheol, executive vice president of product planning at Samsung's memory chip business.
"Samsung will continue to innovate and deliver optimized products for the upcoming on-device AI era through close collaboration with customers."
Samsung will start mass production of the 10.7Gbps LPDDR5X by the second half of the year, following verification with mobile application processors and mobile device providers, the company said.
The new product improves performance speed by more than 25 percent and capacity by more than 30 percent compared with the previous-generation model.
It also expands the single package capacity of mobile DRAM up to 32 gigabytes, making it the most efficient solution for applying on-device AI, the company said.
Low-power, high-performance LPDDR memory chips have become increasingly important with the rise of on-device AI, which processes AI workloads directly on devices and is growing in significance across the tech industry.
Samsung said the new LPDDR5X incorporates specialized power-saving technologies, such as optimized power variation, which adjusts power according to workload, and expanded low-power mode intervals, which extend the energy-saving periods.
These improvements enhance power efficiency by 25 percent over the previous generation, enabling mobile devices to provide longer battery life and allowing servers to minimize the total cost of ownership by lowering energy usage when processing data, the tech giant added.
The application of on-device AI is expected to expand into a wide range of tech sectors, including smartphones, wearable devices, robots and autonomous vehicles, leading to higher demand for high-performance, high-capacity memory chips.
According to Omdia, a market tracker, the global demand for mobile DRAM chip capacity is expected to grow from 67.6 billion gigabytes in 2023 to 125.9 billion gigabytes by 2028, showing a compound annual growth rate of 11 percent during the period.
The market tracker also forecasts global mobile DRAM revenue to more than double, from $12.3 billion in 2023 to $26.3 billion in 2028.
By Jo He-rim (herim@heraldcorp.com)
Copyright © The Korea Herald. Unauthorized reproduction and redistribution prohibited.