SK hynix to showcase 16-layer HBM3e chip at CES 2025

An artist's impression of SK hynix's exhibition at CES 2025 / Courtesy of SK hynix

By Nam Hyun-woo

SK hynix will showcase samples of its most advanced high-bandwidth memory (HBM) chips for artificial intelligence (AI) processors at the upcoming CES 2025, with the chipmaker's top executives set to promote the company's capability as "a full-stack AI memory provider" to visitors.

According to SK hynix on Friday, CEO Kwak Noh-jung, Chief Marketing Officer Kim Ju-seon and a number of other C-level executives will attend CES 2025 slated for Jan. 7 to Jan. 10 in Las Vegas.

During the event, the company will exhibit samples of its 16-layer HBM3e product, which boasts the highest capacity at 48 gigabytes and the tallest stack configuration at 16 layers.

SK hynix unveiled the chip, a world first, in November and expressed confidence that its yield would match that of the 12-layer HBM3e, the most advanced model currently in mass production.

The 16-layer HBM3e is manufactured through an advanced mass reflow molded underfill (MR-MUF) process, which enables the 16-layer stack while effectively controlling chip warpage and maximizing thermal performance. The company said the process will allow it to reach a 20-layer stack without hybrid bonding, a next-generation technology for bonding stacked chips.

Along with the HBM3e, attention is also focused on whether the company will offer a glimpse of its next-generation HBM4.

In a press release, Kwak said "AI-driven transformation is expected to accelerate further this year" and the company plans to "begin mass production of sixth-generation HBM (HBM4) in the second half of this year, leading the customized HBM market by meeting diverse customer demands."

HBM4 has twice as many data transfer channels as HBM3e, enabling faster data transfer speeds and higher memory capacity. It will also be customized for specific workloads according to clients' demands.

SK hynix has been developing HBM4 with the goal of completing its design last year. As Kwak noted, development is progressing in line with the company's plans, and CES 2025 could serve as a stage for the company to drop hints aimed at securing customers early on.

SK hynix's 12-layer HBM3e / Courtesy of SK hynix

Along with HBM chips, the company will also showcase high-capacity, high-performance enterprise SSDs (eSSDs), which are in high demand due to AI data centers. Among the products on display will be the 122-terabyte D5-P5336, developed by SK hynix's subsidiary Solidigm in November last year. The SSD offers the highest storage capacity currently available, along with superior power and space efficiency.

For on-device AI services on smartphones and laptops, SK hynix will display the LPCAMM2 and ZUFS 4.0, which offer better power and space efficiency than existing models. Also on display will be a memory module based on Compute Express Link (CXL), a unified interface technology that connects multiple devices, such as central processing units and graphics processing units, and offers greater scalability.

"During the event, we plan to showcase a wide range of next-generation AI memory solutions, including flagship AI memory products such as HBM and eSSD, as well as solutions optimized for on-device AI," Kim said. "Through this, we aim to widely promote our technological competitiveness as a full-stack AI memory provider."

Nam Hyun-woo namhw@koreatimes.co.kr

