With doubled I/O interfaces and a refined low-voltage TSV design, HBM4 reshapes how memory stacks sustain throughput under data ...
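As a rough illustration of what doubling the interface width means for per-stack throughput, the sketch below computes peak bandwidth from bus width and per-pin data rate. The bus widths reflect the published 1024-bit (HBM3) and 2048-bit (HBM4) interfaces; the per-pin rates used are illustrative assumptions, not vendor specifications.

```python
# Rough per-stack bandwidth arithmetic: GB/s = bus width (bits) x pin rate (Gb/s) / 8.
# Bus widths match the 1024-bit HBM3 and 2048-bit HBM4 interfaces; the per-pin
# data rates below are illustrative assumptions, not vendor specs.

def peak_bandwidth_gb_s(bus_width_bits: int, pin_rate_gbit_s: float) -> float:
    """Peak per-stack bandwidth in GB/s."""
    return bus_width_bits * pin_rate_gbit_s / 8

hbm3_stack = peak_bandwidth_gb_s(bus_width_bits=1024, pin_rate_gbit_s=6.4)  # ~819 GB/s
hbm4_stack = peak_bandwidth_gb_s(bus_width_bits=2048, pin_rate_gbit_s=8.0)  # ~2048 GB/s

print(f"HBM3 stack: {hbm3_stack:.0f} GB/s, HBM4 stack: {hbm4_stack:.0f} GB/s")
```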
SK Hynix and Taiwan’s TSMC have established an ‘AI Semiconductor Alliance’. SK Hynix has emerged as a strong player in the high-bandwidth memory (HBM) market due to the generative artificial ...
Designed to take on high-bandwidth memory in data centers, Z-Angle memory (ZAM) leverages diagonal interconnects for improved ...
TL;DR: SK hynix CEO Kwak Noh-Jung unveiled the "Full Stack AI Memory Creator" vision at the SK AI Summit 2025, emphasizing collaboration to overcome AI memory challenges. SK hynix aims to lead AI ...
An 8-layer stack of bulk RRAM runs a continually learning neural network ...
High Bandwidth Memory (HBM) is the commonly used type of DRAM for data center GPUs like NVIDIA's H200 and AMD's MI325X. High Bandwidth Flash (HBF) is a stack of flash chips with an HBM interface. What ...
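To make the capacity-versus-bandwidth trade between DRAM and flash stacks concrete, here is a minimal sketch that checks whether a model's weight footprint fits in a given number of stacks. The per-stack capacities are hypothetical placeholders (the HBM-like 36GB figure comes from the stack description further down; the HBF-like 512GB figure is an assumption), since the excerpt above is truncated.

```python
# Minimal sketch: does a model's weight footprint fit in a set of memory stacks?
# Per-stack capacities are placeholders for comparing an HBM-style DRAM stack
# against a larger (but slower) HBF-style flash stack; they are not quoted specs.

def fits_in_stacks(params_billion: float, bytes_per_param: int,
                   stack_capacity_gb: float, num_stacks: int) -> bool:
    """True if the weights fit in the combined capacity of the stacks."""
    weights_gb = params_billion * bytes_per_param  # 1e9 params * bytes ~= GB
    return weights_gb <= stack_capacity_gb * num_stacks

# A 70B-parameter model in 16-bit weights needs ~140 GB of memory.
print(fits_in_stacks(70, 2, stack_capacity_gb=36, num_stacks=6))    # 216 GB total -> True
print(fits_in_stacks(405, 2, stack_capacity_gb=36, num_stacks=6))   # 810 GB needed -> False
print(fits_in_stacks(405, 2, stack_capacity_gb=512, num_stacks=6))  # HBF-like placeholder -> True
```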
The mighty GPU is shaping up to be one of the most significant technological innovations of the last few decades. While games such as Alan Wake 2 demonstrate visual chops that can often ...
This article discusses talks on memory, chip, and system design at the 2025 AI Infra Summit in Santa Clara, CA, by Kove, Pliops and ...
Walk into any modern AI lab, data center, or autonomous vehicle development environment, and you’ll hear engineers talk endlessly about FLOPS, TOPS, sparsity, quantization, and model scaling laws.
Samsung Electronics is currently focused on R&D for 11nm-class DRAM and ninth-generation 3D NAND flash. The goal is to raise integration to the highest level in the industry and realize its existing ...
Samsung's new HBM3e memory, codenamed Shinebolt, features 12-Hi 36GB stacks with 12 x 24Gb memory dies placed on a logic die with a 1024-bit memory interface. Samsung's new 36GB HBM3e ...
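The 36GB figure follows directly from the die count and density given in that description; a quick arithmetic check:

```python
# Sanity-check the stack capacity implied above: 12-Hi means 12 DRAM dies,
# each 24 Gb in density, stacked on one base logic die.
dies_per_stack = 12
die_density_gbit = 24
capacity_gb = dies_per_stack * die_density_gbit / 8  # 288 Gb = 36 GB
print(f"{capacity_gb:.0f} GB per stack")  # 36 GB, matching the quoted 36GB HBM3e capacity
```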