Fastest: SOCAMMs provide over 2.5 times higher bandwidth at the same capacity when compared to RDIMMs, allowing faster access to larger training datasets and more complex models, as well as increasing throughput for inference workloads.2

Smallest: At 14x90mm, the innovative SOCAMM form factor occupies one-third the size of the industry-standard RDIMM form factor, enabling compact, efficient server designs.3

Lowest power: Leveraging LPDDR5X memory, SOCAMM products consume one-third the power of standard DDR5 RDIMMs, inflecting the power-performance curve in AI architectures.4

Highest capacity: SOCAMM solutions use four placements of 16-die stacks of LPDDR5X memory to enable a 128GB memory module, the highest-capacity LPDDR5X memory solution, which is essential for faster AI model training and more concurrent users in inference workloads.

Optimized scalability and serviceability: SOCAMM’s modular design and innovative stacking technology improve serviceability and aid the design of liquid-cooled servers.
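The "over 2.5 times" bandwidth figure can be sanity-checked against the methodology stated in footnote 2: a SOCAMM drives a 128-bit bus at 8533 MT/s, while a single RDIMM drives a 64-bit bus at 6400 MT/s. A minimal sketch of that arithmetic (the function and variable names are illustrative, not from Micron):

```python
# Peak per-module bandwidth: transfers/s x bytes per transfer.
def module_bandwidth_gbs(transfer_rate_mts, bus_width_bits):
    """Peak bandwidth in GB/s for a memory module."""
    return transfer_rate_mts * (bus_width_bits / 8) / 1000

socamm = module_bandwidth_gbs(8533, 128)  # 64GB SOCAMM, 128-bit bus
rdimm = module_bandwidth_gbs(6400, 64)    # 64GB RDIMM, 64-bit bus
ratio = socamm / rdimm                    # ~2.67, i.e. "over 2.5 times"
```

The factor of roughly 2.67 comes from the doubled bus width (128-bit vs 64-bit) multiplied by the higher transfer rate (8533 vs 6400 MT/s).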
Industry-leading HBM solutions

Micron continues its competitive lead in the AI industry with the HBM3E 12H 36GB, offering 50% increased capacity over the HBM3E 8H 24GB within the same cube form factor.5 Additionally, the HBM3E 12H 36GB provides up to 20% lower power consumption than the competition's HBM3E 8H 24GB offering, while providing 50% higher memory capacity.6 By continuing to deliver exceptional power and performance metrics, Micron aims to maintain its technology momentum as a leading AI memory solutions provider through the launch of HBM4.
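The 50% capacity figure follows directly from the footnote-5 comparison of the 36GB 12-high stack against the 24GB 8-high stack at the same 12x10mm package size; a quick check of that arithmetic (values taken from the text):

```python
# Capacities from the release: HBM3E 12H vs HBM3E 8H, same 12x10mm package.
hbm3e_12h_gb = 36
hbm3e_8h_gb = 24

# Relative capacity gain: (36 - 24) / 24 = 0.50, i.e. 50% more capacity.
capacity_gain = (hbm3e_12h_gb - hbm3e_8h_gb) / hbm3e_8h_gb
```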
Micron Media Relations Contact
Kelly Sasso
Micron Technology, Inc.
+1 (208) 340-2410
ksasso@micron.com

___________________
1 Calculations based on comparing one 64GB 128-bit bus SOCAMM to two 32GB 64-bit bus RDIMMs.
2 Calculated using transfer speeds comparing 64GB 2R 8533MT/s SOCAMM and 64GB 2Rx4 6400MT/s RDIMMs.
3 Calculated area between one SOCAMM and one RDIMM.
4 Calculated based on power used in watts by one 128GB, 128-bit bus width SOCAMM compared to two 128GB, 128-bit bus width DDR5 RDIMMs.
5 Comparison based on HBM3E 36GB capacity versus HBM3E 24GB capacity when both are at the 12x10mm package size.
6 Based on internal calculations, and customer testing and feedback for Micron HBM3E versus the competition’s HBM3E offerings.
7 Calculated bandwidth by comparing HBM4 and HBM3E specifications.
8 Assumes 20x 61.44TB E3.
Source: GlobeNewswire