Samsung, SK Hynix, and Micron Battle for Dominance in CXL Memory Market

With the high-bandwidth memory (HBM) market becoming more competitive, major memory manufacturers are broadening their AI semiconductor strategies to incorporate Compute Express Link (CXL), an emerging memory connectivity standard. The move responds to growing demand from large tech companies building AI-driven data centers, where efficiently handling enormous volumes of data has become ever more essential.

The servers in these data centers typically combine several types of semiconductor components, including CPUs, GPUs, and DRAM modules. CXL is an interconnect technology designed to speed data exchange among these components, allowing greater efficiency from less hardware and thereby lowering overall costs. As one industry insider put it: "Not only does this expand memory capacity, but it also dramatically increases the speed of data movement between different semiconductor devices." Alongside HBM, CXL has consequently emerged as one of the key technologies drawing attention in the current artificial intelligence boom.

South Korea's Samsung Electronics and SK hynix, along with US-based Micron Technology, the newest entrant among the major memory makers, are stepping up efforts to secure an early lead in the CXL memory market. Market research firm Yole Intelligence forecasts that the worldwide CXL market will expand sharply, from $14 million in 2023 to approximately $16 billion by 2028.

At CXL DevCon 2025, held on April 29 in California, Samsung Electronics and SK hynix presented their latest advances in CXL technology. The conference, now in its second year, is organized by the CXL Consortium, an international group of semiconductor firms.

At the conference, Samsung demonstrated its memory pooling technology, which uses CXL to link several memory modules into one unified resource pool so that memory can be dynamically allocated and reassigned according to demand. A pioneer in this area, Samsung developed the world's first CXL-based DRAM in May 2021. In 2023 it launched a 128GB DRAM module compliant with the CXL 2.0 specification, completing customer verification before the end of that year, and it is now finalizing testing of a 256GB variant. "Samsung aims to dominate the CXL marketplace to avoid repeating its loss of ground in the HBM sector," an industry source familiar with the matter said.
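The pooling concept described above can be illustrated with a short sketch: several fixed-size modules are treated as a single logical capacity pool, and hosts borrow and return capacity on demand. This is a toy model of the idea only; the class, names, and sizes are illustrative assumptions, not any CXL or vendor API.

```python
class MemoryPool:
    """Toy model of memory pooling: modules combine into one pool."""

    def __init__(self, module_sizes_gb):
        # Several physical modules are exposed as one logical capacity.
        self.capacity_gb = sum(module_sizes_gb)
        self.allocations = {}  # host name -> GB currently borrowed

    def allocate(self, host, gb):
        # Grant capacity from the shared pool if enough remains.
        used = sum(self.allocations.values())
        if used + gb > self.capacity_gb:
            raise MemoryError("pool exhausted")
        self.allocations[host] = self.allocations.get(host, 0) + gb

    def release(self, host):
        # Return a host's entire share to the pool for reuse.
        return self.allocations.pop(host, 0)


# Four 128 GB modules behave as a single 512 GB pool, so one host
# can borrow more than any single module holds, then hand it back.
pool = MemoryPool([128, 128, 128, 128])
pool.allocate("server-a", 300)
pool.release("server-a")
pool.allocate("server-b", 500)
```

The point of the sketch is the dynamic reassignment: capacity freed by one server is immediately available to another, which is the efficiency gain the article attributes to CXL pooling.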

SK hynix, which holds a competitive edge in HBM, is seeking to apply that momentum to the CXL space, focusing particularly on its high-performance DRAM capabilities. On Apr. 23, the company completed customer validation for a 96GB DDR5 DRAM module based on the CXL 2.0 standard. “When applied to servers, this module delivers 50% more capacity and 30% higher bandwidth compared to standard DDR5 modules,” a company representative said. “It’s a technology that can dramatically reduce infrastructure costs for data center operators.” SK hynix is also pursuing validation for a 128GB variant.

Micron Technology, the world’s third-largest memory chipmaker, began rolling out CXL 2.0-based memory expansion modules last year, intensifying its push to close the technological gap with Samsung and SK hynix.

The growth of CXL coincides with a larger shift in AI development—from models focused heavily on training to ones driven more by inference. Up until now, AI performance was primarily determined by how much data a model could process during the training period. During this phase, hardware configurations featuring GPUs coupled with HBM, similar to what you’d find in NVIDIA’s AI accelerators, were preferred.

Today, the emphasis has shifted toward inference-driven AI models. Rather than simply drawing on pre-existing datasets, these systems generate novel outputs through reasoning, even when the relevant information is incomplete in the training data. Such workloads demand both large memory capacity and fast, efficient data movement, which is precisely what the CXL architecture aims to deliver. This growing need for streamlined data management is fueling interest across the AI industry in adopting the new memory interface technology.
