Kioxia announced its ultra-fast GP SSD series for AI workloads at GTC 2026. Micron, Samsung, and Phison also had their ...
The South Korea-based memory maker's chairman, Chey Tae-won, told Bloomberg that the company is expanding memory-making ...
Micron confirms its AI-optimized memory and storage technologies are in production: HBM4 memory, SOCAMM2, and PCIe Gen6 SSDs ...
In an era where data-intensive applications, from AI and machine learning to high-performance computing (HPC) and gaming, are pushing the limits of traditional memory architectures, High Bandwidth ...
The demand for high bandwidth memory (HBM) is accelerating across the semiconductor industry, driven by boundary-pushing artificial intelligence, high-performance computing, and advanced graphics.
Previous forecasts had the RAM crisis continuing through 2028, but pushing that horizon out to 2030 worryingly ups the ante.
Samsung Electronics has signed a Memorandum of Understanding (MoU) with AMD that will see the partners collaborate on AI ...
High Bandwidth Memory (HBM) is the commonly used type of DRAM for data center GPUs like NVIDIA's H200 and AMD's MI325X. High Bandwidth Flash (HBF) is a stack of flash chips with an HBM interface. What ...
(CNN) — The US government has imposed fresh export controls on the sale of high tech memory chips used in artificial intelligence (AI) applications to China. The rules apply to US-made high bandwidth ...
Autonomous cars will need 300 gigabytes of DRAM or more, and robots will need similar quantities, leading memory-maker Micron ...
NAND and RAM costs are likely to rise, but cheap Chinese exports could save the day.