The company’s Q4 revenue increased by 12%, and its operating profit rose by 15%, making for its best quarter ever, SK Hynix ...
While Micron is a relative newcomer to the HBM space, which is currently dominated by South Korean memory giant SK Hynix and its neighbor and chief rival Samsung, the company remains optimistic ...
A New Class of Memory for the AI Era” was published by researchers at Microsoft. Abstract: “AI clusters today are one of the ...
Rambus recently announced the availability of its new High Bandwidth Memory (HBM) Gen2 PHY. Designed for systems that require low latency and high bandwidth memory, the Rambus HBM PHY, built on the ...
as well as representative AI memory products such as HBM and eSSD at this CES. Through this, we will showcase our technological competitiveness as we prepare for the future as a Full Stack AI Memory ...
This blog explores three leading memory solutions—HBM, LPDDR, and GDDR—and their suitability for AI accelerators. High Bandwidth Memory (HBM): the ultimate choice for AI training. Generative AI and ...
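The suitability argument largely comes down to interface width and per-pin data rate. A minimal back-of-the-envelope sketch follows; the widths and pin speeds used are representative public-spec figures chosen for illustration, not numbers taken from the excerpt above.

```python
# Rough, illustrative comparison of peak bandwidth per stack/device.
# All interface widths and per-pin data rates are assumed representative
# values for each memory class, used only to show the order-of-magnitude gap.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits * Gb/s per pin) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

configs = {
    "HBM3 (one 1024-bit stack @ 6.4 Gb/s/pin)": (1024, 6.4),
    "GDDR6 (one x32 device @ 16 Gb/s/pin)": (32, 16.0),
    "LPDDR5X (one x64 package @ 8.533 Gb/s/pin)": (64, 8.533),
}

for name, (width, rate) in configs.items():
    print(f"{name}: ~{peak_bandwidth_gbs(width, rate):.0f} GB/s")
```

Under these assumptions a single HBM3 stack lands around 800 GB/s, versus roughly 60-70 GB/s for a GDDR6 device or an LPDDR5X package, which is why training accelerators favor HBM while LPDDR and GDDR trade bandwidth for lower cost or power.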
SanDisk, Kioxia, SK hynix, Micron, Samsung and Macronix were showing their storage and memory solutions at the 2025 CES.
Low Power Compression Attached Memory Module 2 (LPCAMM2): LPDDR5X-based module ... and SK Hynix will produce sixth-generation HBM (HBM4) in the second half of this year to lead the customized ...
It will present products like HBM, eSSD, and innovations in data processing ... along with modularized versions, CMM (CXL Memory Module)-Ax and AiMX, designed to serve as core infrastructure for ...