AI chips are emerging as game changers, and SOCAMM could create a new market leader

As artificial intelligence creates new markets across industries, it is also fueling growth for Small Outline Compression Attached Memory Modules (SOCAMM), an emerging technology often described as the “next High Bandwidth Memory.”
While not a direct replacement for HBM, SOCAMM, with its scalability and lower power requirements, could prove just as critical and lucrative in reshaping the global memory chip landscape.
Nvidia -- the world's top graphics processing unit provider -- has proposed a new modular memory standard for servers that maximizes performance while reducing power consumption, tailored specifically for its upcoming AI processors.
While SOCAMM is expected to cost just 25-33 percent as much as HBM -- the expensive memory chips powering GPUs for AI processing -- leading memory chip makers Samsung Electronics, SK hynix and Micron Technology have jumped into the race to gain an early lead in the emerging server module market.
"The SOCAMM market will be as competitive and strategically important for chipmakers as HBMs," an industry official said. "The real competition will be over who passes Nvidia's qualification tests first and becomes the main supplier of the new server module, once servers incorporating SOCAMM hit the market."
Next innovation in DRAM
SOCAMM is a compact, high-performance memory module optimized for AI servers. Unlike traditional server modules built with DDR5, SOCAMM uses LPDDR, the low-power chips typically found in mobile devices. It vertically stacks multiple LPDDR5X chips -- the most advanced version -- to boost data processing performance while significantly improving energy efficiency.
"While DDR5 is the standard in servers, power consumption has become a major issue with large-scale deployments," an industry official explained. "SOCAMM is designed with low power in mind, and with mobile DRAM performance improving significantly, it is now viable for server use."
SOCAMM features 694 input and output pins -- tiny contact points that allow the chip to send and receive data -- significantly more than the 262 pins on traditional DRAM modules and even the 644 on LPCAMM, the memory module typically used in laptops. The higher pin count helps reduce data bottlenecks.
What also makes SOCAMM attractive is its modular design, which lets users easily remove and replace it for upgrades. The module is expected to be used alongside HBM chips to enhance the performance of Nvidia's upcoming AI processors.
Nvidia plans to use both HBM and SOCAMM chips in its next-generation AI accelerator, the GB300, slated for launch later this year. SOCAMM modules are expected to be installed in servers equipped with Nvidia's AI accelerators.
The GPU giant is also planning to use SOCAMM in its upcoming personal AI supercomputer, developed under the name Project DIGITS, according to industry sources. The device, small enough to fit on a desk, is designed to make high-performance AI computing more accessible to the public.

Market condition
At Nvidia's developer conference this year, GTC 2025, Micron Technology was the first to announce that it had begun mass production of SOCAMM for Nvidia's GB300.
According to the American chipmaker, its SOCAMM module delivers more than 2.5 times the bandwidth of RDIMMs while using one-third the power. The compact design is also expected to enable more efficient server layout and better thermal management, the company added.
Samsung and SK hynix also showcased their SOCAMM prototypes at GTC 2025. Samsung reported that its module uses 9.2 watts, delivering over 45 percent greater power efficiency than DDR5-based DIMMs.
The two Korean chipmakers have yet to announce mass production plans, but said they are preparing to enter the market "in line with market demand."
"Micron appears to be taking the lead in early volume production, but it would not be fore immediate server installation, but for early-stage optimization," an industry official said.
At the same time, SOCAMM is being seen as a comeback opportunity for Samsung.
After falling behind SK hynix in the AI memory race with HBM chips, Samsung lost its top spot in the DRAM market during the January-March period this year. The chip giant was overtaken by its smaller rival for the first time, after failing to gain an early lead in the AI memory sector.
According to industry sources, Samsung recently delivered SOCAMM module samples to Nvidia in significantly larger volumes than its competitors, signaling a push to reclaim leadership in the next wave of the AI memory market.
Samsung is expected to leverage its dominance in the LPDDR segment as it pushes into the SOCAMM market. According to market tracker Omdia, Samsung held a 57.9 percent revenue share of the mobile DRAM market in the first quarter of last year.
herim@heraldcorp.com