Samsung Cuts HBM Development to One Year Amid AI Chip Race



Samsung Electronics has shortened its high-bandwidth memory (HBM) development cycle from roughly two years to one, aiming to align its releases with the annual launch cadence of AI accelerators from customers such as Nvidia. According to a report citing industry sources, the shift reflects mounting pressure to stay competitive in a rapidly evolving market.

“The company has established and is executing a plan to release the next-generation HBM every year to match the release rhythm of new AI accelerators from major customers like Nvidia,” a source familiar with Samsung’s internal plans told the Busan Ilbo.

The previous two-year cycle left Samsung at a disadvantage once AI chipmakers moved to yearly product updates, and memory suppliers faced mounting urgency to keep pace. In the meantime, competitors such as SK Hynix gained ground through closer collaboration with Nvidia.

Vertical Integration Speeds Production

Samsung’s vertically integrated structure plays a key role in this transition. Because the company manages the full HBM production chain, it can coordinate each stage more efficiently. For instance, it produces the base logic die through its own foundry, stacks DRAM layers internally, and completes advanced packaging without outside partners.

By contrast, SK Hynix relies on TSMC to fabricate its logic dies. Samsung can therefore avoid the delays that typically arise when multiple companies must coordinate, enabling faster development and tighter control over timelines.

New HBM4E Push and Competitive Stakes

Samsung has already begun testing this accelerated approach. According to reports, the company plans to produce its first HBM4E sample as early as May. After fabricating a new logic die, engineers will integrate it with DRAM and complete validation before sending samples to Nvidia. The company previously showcased an HBM4E unit at Nvidia’s GTC 2026 event, highlighting its speed and bandwidth capabilities.


At the same time, Samsung has started shipping HBM4, becoming the first supplier to deliver the sixth-generation AI memory chip. The product supports Nvidia’s Vera Rubin platform and reflects ongoing efforts to regain market share. Samsung held about 16 percent of the HBM market in 2025, and analysts at KB Securities expect that share to grow to around 35 percent in 2026.

Nevertheless, competition remains intense. SK Hynix continues advancing its own HBM4E technology using TSMC’s 3-nanometer process, while Micron Technology is pursuing a similar timeline. As a result, the race to dominate next-generation AI memory is becoming increasingly aggressive.

A semiconductor industry official told ChosunBiz that Samsung is “putting great effort into the next-generation market to avoid repeating the mistake of yielding the market to competitors in the previous generation.”


© 2024 The Technology Express. All Rights Reserved.