New Micron SOCAMM2 module targets greener AI data centres

The 192GB SOCAMM2 is based on Micron’s latest 1-gamma DRAM process technology, which delivers over a 20% improvement in power efficiency and helps optimise power design in large-scale data centres.

Voice&Data Bureau

As artificial intelligence continues to drive rapid innovation and expansion, the global data centre ecosystem is shifting towards more energy-efficient infrastructure to sustain long-term growth. With memory becoming increasingly critical to AI performance, low-power memory solutions are now central to these efforts.

Micron Technology has announced that it is providing customers with samples of its new 192GB SOCAMM2 (small outline compression attached memory module), designed to enable broader use of low-power memory in AI data centres. The SOCAMM2 builds on Micron’s earlier LPDRAM SOCAMM, offering 50% greater capacity within the same compact form factor. This additional capacity can reduce time to first token (TTFT) by more than 80% in real-time inference workloads.

The 192GB SOCAMM2 is based on Micron’s latest 1-gamma DRAM process technology, which delivers over a 20% improvement in power efficiency and helps optimise power design in large-scale data centres. These efficiency gains can be particularly significant across full-rack AI installations, which may include more than 40 terabytes of CPU-attached low-power DRAM. The module’s design also enhances serviceability and provides a path for future capacity increases.

Expanding low-power memory for AI workloads

Building on a five-year collaboration with NVIDIA, Micron has been at the forefront of integrating low-power server memory into data centre environments. The SOCAMM2 brings the inherent advantages of LPDDR5X technology (high bandwidth and low energy consumption) to the main memory of AI systems.

Designed for the large-context demands of modern AI models, SOCAMM2 delivers the high data throughput required for both AI training and inference while maintaining strong energy efficiency. This combination is intended to support the evolving requirements of next-generation AI infrastructure.

Technical improvements and data centre applications

Through specialised design and enhanced testing, Micron has adapted low-power DRAM, originally developed for mobile devices, into a solution capable of meeting the stringent performance, quality, and reliability standards of data centre operations. The company’s experience in high-quality DDR memory has informed this transition.

According to Micron, SOCAMM2 modules improve power efficiency by more than two-thirds compared with equivalent RDIMMs, while delivering comparable performance in a unit one-third of the size. This helps optimise the physical footprint of data centres and enables higher memory capacity and bandwidth within existing infrastructure. Its modular architecture and stacking technology also improve maintenance efficiency and support the design of liquid-cooled server systems.

Micron has participated actively in defining the JEDEC SOCAMM2 standard and is collaborating with industry partners to promote wider adoption of low-power memory solutions across AI data centres. The company stated that customer samples of SOCAMM2 are now available in capacities of up to 192GB per module and speeds reaching 9.6Gbps, with volume production scheduled to align with customer deployment timelines.