SK Hynix Speeds Up Release of New AI Chips

In an industry first, SK hynix has unveiled its 16-Hi HBM3E memory, offering up to 48GB per stack for AI GPUs, with even larger AI memory capacities planned for the future. The company intends to provide samples of the 48GB, 16-layer product, which it describes as the industry's largest capacity and highest layer count for HBM.
At the SK AI Summit 2024, SK hynix CEO Kwak Noh-Jung presented the world's first 16-high 48GB HBM3E memory solution, pushing AI memory capabilities to unprecedented levels.
Samsung Electronics and SK hynix, the world's two largest memory chipmakers, brought their latest products to this year's China International Import Expo (CIIE) in Shanghai, as demand for AI memory continues to grow.
NVIDIA currently uses SK hynix's HBM3E memory for its AI chips and plans to use HBM4 in its upcoming Rubin R100 AI GPU.