Samsung's HBM-PIM memory chip, a high-bandwidth memory chip integrated with AI processing power / Courtesy of Samsung Electronics
By Baek Byung-yeul
Samsung Electronics will begin mass-producing high-bandwidth memory (HBM) chips for the booming artificial intelligence (AI) market in the second half of this year, as it seeks to catch up with SK hynix, the leader in the nascent AI memory chip market, industry analysts said Monday.
As the market for generative AI services continues to grow, HBM chips used for AI servers are gaining traction in the memory chip industry, which has been struggling with falling demand.
By vertically stacking multiple DRAM dies, HBM delivers far higher data processing speeds than conventional DRAM. It is, however, more expensive, costing around two to three times as much as standard DRAM.
Industry analysts expect the products to see growing adoption, because AI services require high-performance, high-capacity DRAM to run properly.
SK hynix currently leads the HBM market. According to market tracker TrendForce, SK hynix held about a 50 percent market share in 2022, with Samsung at 40 percent and Micron at 10 percent. The HBM market is still in its infancy, accounting for about 1 percent of the entire DRAM market.
The HBM market is expected to grow at an annual rate of up to 45 percent from this year through 2025, TrendForce said. With the AI era in full swing, demand for HBM products could increase dramatically.
To catch up with the market leader, Samsung is set to mass-produce HBM3 memory chips in 16-gigabyte and 24-gigabyte capacities. Those products are known to deliver data transfer speeds of 6.4 gigabits per second (Gbps) per pin, the fastest on the market, which accelerates AI training calculations on servers.
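To put the 6.4 Gbps figure in context: that speed is specified per pin, and an HBM3 stack exposes a 1,024-bit interface (a figure from the JEDEC HBM3 standard, not from this article), so per-stack bandwidth can be estimated with simple arithmetic:

```python
# Back-of-the-envelope HBM3 per-stack bandwidth estimate.
# Assumptions (from the JEDEC HBM3 spec, not stated in the article):
#   - 6.4 Gbps data rate per pin (the speed cited in the article)
#   - 1,024-bit (1,024-pin) interface per HBM3 stack

GBPS_PER_PIN = 6.4       # gigabits per second, per pin
PINS_PER_STACK = 1024    # HBM3 interface width in bits

total_gbps = GBPS_PER_PIN * PINS_PER_STACK  # aggregate bit rate
bandwidth_gbytes = total_gbps / 8           # convert gigabits to gigabytes

print(f"Per-stack bandwidth: {bandwidth_gbytes:.1f} GB/s")  # → 819.2 GB/s
```

For comparison, a single conventional DDR5-6400 module peaks at roughly 51.2 GB/s, which illustrates why stacked HBM is preferred for bandwidth-hungry AI servers despite its higher cost.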
"We plan to launch the next-generation HBM3P products in the second half of the year with the higher performance and capacity the market demands," Kim Jae-joon, executive vice president of Samsung, said during the company's April conference call.
In addition to HBM, Samsung continues to introduce new memory solutions such as HBM-PIM, a high-bandwidth memory chip integrated with AI processing power, and CXL DRAM, which can overcome the limitations of DRAM capacity.
Kim Dong-won, an analyst at KB Securities, said that as Samsung introduces new products to the HBM market, it can expect to improve its profitability despite the sluggish memory chip market.
"Starting in the fourth quarter of this year, Samsung is expected to begin supplying HBM3 to GPU makers in North America. As a result, the share of HBM3 sales in Samsung's total DRAM sales is expected to expand from 6 percent in 2023 to 18 percent in 2024," the analyst said.