SK hynix Unveils 16-Layer HBM3E Chip Development, Aiming for AI Market Leadership

SEOUL: SK hynix Inc. CEO Kwak Noh-jung announced the company’s plan to produce the industry’s first 48-gigabyte, 16-layer high-bandwidth memory (HBM) chips early next year. This development is part of SK hynix’s strategy to strengthen its position in the artificial intelligence (AI) chip market.

According to Yonhap News Agency, this marks SK hynix's first official announcement of 16-layer HBM3E development. HBM is a high-performance DRAM in strong demand, particularly from U.S. AI chip giant Nvidia Corp. for its graphics processing units, which are essential for AI computing. The new 16-layer products are expected to improve AI training performance by 18 percent and inference performance by 32 percent compared with the 12-layer versions.

SK hynix started supplying eight-layer HBM3E to Nvidia in March and began mass-producing the latest 12-layer HBM3E products in September. The company anticipates shipping 16-layer HBM3E products in the first half of next year and plans to introduce next-generation HBM4 chips in the second half. Kwak highlighted that SK hynix intends to apply the mass reflow-molded underfill (MR-MUF) process, previously used for its 12-layer products, to the new 16-layer HBM3E chips. The South Korean chipmaker first adopted this packaging technology with HBM2E in 2019.

Kwak also said that, starting with the HBM4 generation, SK hynix plans to work with a top global logic foundry to adopt a logic process for the base die, allowing it to offer customers the best products. The foundry in question is Taiwan Semiconductor Manufacturing Co. (TSMC). Recent market reports indicate that SK hynix led the global HBM market last year with a 53 percent share, ahead of Samsung Electronics Co. at 38 percent and Micron Technology Inc. at 9 percent.