SK Group Chairman Chey Tae-won said Wednesday that SK hynix's pace of high bandwidth memory (HBM) development has surpassed the supply pace requested by Nvidia, signaling increased negotiating leverage with the artificial intelligence computing giant.
"Until recently, SK hynix's HBM development pace lagged behind Nvidia's requirements, so they urged us for faster progress," Chey told reporters during a press conference at the Las Vegas Convention Center on the sidelines of the ongoing CES 2025, discussing a meeting with Nvidia CEO Jensen Huang earlier in the day.
"Currently, our development pace slightly surpasses Nvidia's (requested pace)," he said. "While this could change, we are now developing at a comparable pace."
HBM, an advanced, high-performance memory chip, is a crucial component of Nvidia's graphics processing units (GPUs) that drive generative AI systems.
SK hynix is a major HBM supplier to Nvidia with its industry-leading fifth-generation HBM3E chips.
The Korean chip giant earlier announced that its planned 2025 production of HBM has been sold out, with Nvidia as the biggest customer.
"(Huang and I) discussed and confirmed the HBM schedule established at the working level," he added. "The amount of supply for this year has been determined, though I don't remember the exact numbers."
Looking beyond HBM, Chey emphasized SK Group's focus on AI data centers as a key future growth driver. This is part of a broader strategy to transform the energy-to-communications conglomerate into a high-tech AI technology firm.
"I think energy solutions for AI data centers are critical," he said, citing key energy issues, such as power supply, energy efficiency and cooling. "In this regard, the AI data center business aligns closely with SK's business portfolio."
In line with this strategy, SK Group's CES 2025 exhibition highlights its AI data center solutions and cutting-edge AI technologies. (Yonhap)