Rambus recently announced the availability of its new High Bandwidth Memory (HBM) Gen2 PHY. Designed for systems that require low-latency, high-bandwidth memory, the Rambus HBM PHY, built on the ...
“A New Class of Memory for the AI Era” was published by researchers at Microsoft. Abstract: “AI clusters today are one of the ...
US semiconductor giant discloses HBM4 product launch in 2026, followed by HBM4E. These are likely to be used by Nvidia's Rubin R100 GPU and AMD's successor to the Instinct MI400x. Micron is a ...
The different flavors of DRAM each fill a particular AI niche.
SanDisk, Kioxia, SK hynix, Micron, Samsung, and Macronix were showing their storage and memory solutions at CES 2025.
TL;DR: SK hynix will showcase its new 16-Hi HBM3E memory chips at CES 2025, featuring up to 48GB capacity. These chips use an advanced manufacturing process to enhance performance and control warpage.
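For context on the 48GB figure: a 16-Hi stack only reaches that capacity if each stacked DRAM die holds 3GB (24Gb). The snippet doesn't state the per-die density, so the minimal sketch below simply works the arithmetic backwards from the two numbers given, assuming capacity is split evenly across the dies.

```python
# Back-of-the-envelope check of the 16-Hi HBM3E capacity figure.
# Assumption (not stated in the announcement): capacity is split evenly
# across the stacked DRAM dies.

STACK_CAPACITY_GB = 48   # announced stack capacity
STACK_HEIGHT = 16        # "16-Hi" = 16 stacked DRAM dies

per_die_gb = STACK_CAPACITY_GB / STACK_HEIGHT   # 3.0 GB per die
per_die_gbit = per_die_gb * 8                   # 24 Gbit per die

print(f"{per_die_gb:.0f} GB ({per_die_gbit:.0f} Gb) per die")
# -> 3 GB (24 Gb) per die, i.e. 16 x 24Gb dies = 48 GB
```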
Today, Micron Technology started constructing its multi-billion-dollar packaging facility for high-bandwidth memory (HBM) in Singapore. The company will invest $7 billion in the plant, as it ...
TL;DR: SK hynix will showcase its AI memory technologies at CES 2025, featuring solutions for on-device AI and next-generation AI memories. The company aims to highlight its technological ...
SK hynix (or "the company") announced today that it will showcase its innovative AI memory technologies at CES 2025, to be held in Las Vegas from January 7 to 10 (local time). A large number of C ...