SK hynix Delivers First 12-Layer HBM4 Samples to Market


March 19, 2025 by our News Team

SK hynix introduces 12-layer HBM4, setting a new standard for ultra-high performance AI memory with industry-leading capacity and speed.

  • Industry-leading capacity and speed
  • Innovative technology for better heat dissipation and reliability
  • Commitment to leading the AI memory market


SK hynix Takes a Bold Step in AI Memory

In an exciting development for the tech world, SK hynix Inc. has just announced that it’s shipped samples of its 12-layer HBM4, marking a significant leap in ultra-high performance DRAM specifically designed for AI applications. This isn’t just any ordinary memory chip; it’s the first of its kind to hit the market. Talk about being ahead of the curve!

A Glimpse into the Future of Memory Technology

So, what’s the big deal about this 12-layer HBM4? For starters, these samples boast the industry’s best capacity and speed, which are crucial for AI memory products. Imagine processing over 2 terabytes of data per second—yes, you read that right! That’s equivalent to streaming more than 400 full-HD movies in just one second. If that doesn’t blow your mind, consider that it’s over 60% faster than its predecessor, the HBM3E. Now that’s a game-changer!
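The movie comparison and the speedup figure can be sanity-checked with quick arithmetic. A minimal sketch, assuming (this assumption is ours, not the article's) that a full-HD movie is roughly 5 GB:

```python
# Back-of-envelope check of the bandwidth claims above.
# Assumption (not from the article): one full-HD movie ~= 5 GB.
bandwidth_tb_s = 2.0  # HBM4 bandwidth cited in the article, in TB/s

movie_gb = 5.0  # assumed size of a full-HD movie
movies_per_second = bandwidth_tb_s * 1000 / movie_gb
print(movies_per_second)  # 400.0 -> "more than 400 full-HD movies in one second"

# "Over 60% faster than HBM3E" implies HBM3E sits below roughly:
hbm3e_tb_s = bandwidth_tb_s / 1.6
print(round(hbm3e_tb_s, 2))  # 1.25 TB/s, in the ballpark of published HBM3E figures
```

Both numbers line up with the article's claims under that assumption, so the comparison is at least internally consistent.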

Innovative Technology at Work

What’s behind this impressive performance? SK hynix has employed the Advanced MR-MUF process, which not only maximizes product stability but also prevents chip warpage. This means better heat dissipation and overall reliability—two key factors when it comes to high-performance memory. With a capacity of 36 GB, this new chip is setting the bar high for 12-layer HBM products.
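The 36 GB figure also implies a per-die density, which is easy to work out. A quick sketch (the per-die breakdown is our inference, not stated in the article):

```python
# Per-die capacity implied by the 12-layer, 36 GB stack described above.
total_gb = 36   # stack capacity cited in the article
layers = 12     # DRAM dies stacked per package

per_die_gb = total_gb / layers
print(per_die_gb)            # 3.0 GB per DRAM die
print(per_die_gb * 8)        # 24.0 -> i.e. a 24-gigabit DRAM core die
```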

Leading the Charge in AI Memory

SK hynix is no stranger to innovation. After becoming the first company to mass-produce HBM3 in 2022 and furthering its lineup with 8- and 12-high HBM3E models in 2024, it’s clear the company is committed to leading the AI memory market. Justin Kim, President & Head of AI Infra at SK hynix, expressed confidence in their journey: “We have enhanced our position as a front-runner in the AI ecosystem following years of consistent efforts to overcome technological challenges in accordance with customer demands.”

What’s Next for SK hynix?

As SK hynix gears up for the certification process with its customers, the company aims to kick off mass production of the 12-layer HBM4 in the latter half of the year. With its rich experience as the industry's largest HBM provider, the company is poised to make a significant impact on the next-gen AI memory landscape.

In a world where speed and capacity are king, SK hynix is not just keeping pace; they’re setting the tempo. So, what do you think? Are we ready to embrace this new era of memory technology?


About Our Team

Our team comprises industry insiders with extensive experience in computers, semiconductors, games, and consumer electronics. With decades of collective experience, we’re committed to delivering timely, accurate, and engaging news content to our readers.

Background Information


About SK hynix:

SK hynix is an important South Korean semiconductor company known for its innovative contributions to the global technology landscape. Specializing in the production of memory solutions, SK hynix has played a vital role in shaping the semiconductor industry. With a commitment to research and development, the company has continuously pushed the boundaries of memory technology, resulting in products that power a wide range of devices and applications.


Technology Explained


HBM3E: HBM3E is a recent generation of high-bandwidth memory (HBM), a type of stacked DRAM designed for artificial intelligence (AI) applications. HBM3E offers faster data transfer rates, higher density, and lower power consumption than previous HBM versions. SK hynix, a South Korean chipmaker, began mass production of HBM3E in 2024. HBM3E can achieve speeds of roughly 1.15 TB/s and capacities of up to 36 GB per stack, making it suitable for AI systems that process large amounts of data, such as deep learning, machine learning, and computer vision.




