Samsung pioneers 36 GB HBM3E 12H, the industry's first 12-stack HBM3E DRAM and its highest-capacity HBM to date.


March 2, 2024 by our News Team

Samsung's 12-stack HBM3E DRAM, the HBM3E 12H, delivers record bandwidth and the highest capacity of any HBM product to date. It targets the growing demand from AI service providers for higher-capacity HBM and is meant to strengthen Samsung's position in the high-capacity HBM market.

  • The industry's first 12-stack HBM3E DRAM
  • Record-breaking bandwidth of up to 1,280 GB/s
  • Capacity of 36 GB, the highest-capacity HBM product to date


Samsung Electronics has announced the development of the industry’s first 12-stack HBM3E DRAM, the HBM3E 12H. This new memory solution offers a record-breaking bandwidth of up to 1,280 gigabytes per second (GB/s) and a capacity of 36 gigabytes (GB), making it the highest-capacity HBM product to date.
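Those headline numbers can be sanity-checked with some back-of-envelope arithmetic. The short Python sketch below assumes the standard 1024-bit HBM interface and twelve 24 Gb (3 GB) DRAM dies per stack; neither assumption is stated in this article, and the per-pin data rate is simply inferred from the quoted bandwidth.

# Back-of-envelope check of the HBM3E 12H headline figures.
# Assumptions (not stated in the article): twelve stacked DRAM dies of
# 24 Gb (3 GB) each, and the standard 1024-bit HBM interface.

dies = 12
die_capacity_gbit = 24                           # 24 Gb per die (assumed)
stack_capacity_gb = dies * die_capacity_gbit / 8
print(f"Capacity per stack: {stack_capacity_gb:.0f} GB")        # -> 36 GB

bus_width_bits = 1024                            # standard HBM interface width
peak_bandwidth_gbs = 1280                        # quoted peak bandwidth, GB/s
pin_rate_gbps = peak_bandwidth_gbs * 8 / bus_width_bits
print(f"Implied per-pin data rate: {pin_rate_gbps:.1f} Gb/s")   # -> 10.0 Gb/s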

YongCheol Bae, Executive Vice President of Memory Product Planning at Samsung Electronics, highlighted the increasing demand for higher-capacity HBM from AI service providers. He stated that the HBM3E 12H was designed to meet this need and solidify Samsung’s position as a technological leader in the high-capacity HBM market in the AI era.

The HBM3E 12H incorporates advanced thermal compression non-conductive film (TC NCF) technology, enabling the 12-layer products to have the same height specification as 8-layer ones. This is crucial for meeting current HBM package requirements and mitigating chip die warping. Samsung has also made significant advancements in reducing the thickness of its NCF material and eliminating voids between layers, resulting in enhanced vertical density.

Moreover, Samsung’s TC NCF technology improves the thermal properties of the HBM by allowing the use of bumps in various sizes between the chips. This enables better heat dissipation and higher product yield during the chip bonding process.

As AI applications continue to grow, the HBM3E 12H is positioned as a fit for systems that need more memory. Its higher performance and capacity let customers manage their resources more flexibly and reduce total cost of ownership (TCO) for data centers. Compared with HBM3 8H, Samsung says the HBM3E 12H can increase average AI training speed by 34% and expand the number of simultaneous users of an inference service by more than 11.5 times.
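To make the capacity argument concrete, here is a purely illustrative comparison of memory per accelerator package. The six-stack layout and the 16 GB figure for an 8-high HBM3 stack are assumptions chosen for the example, not figures from Samsung's announcement.

# Illustrative only: how per-stack capacity translates into memory per
# accelerator package. Six stacks per package and 16 GB per HBM3 8H stack
# are assumptions for this example.

stacks_per_package = 6                 # assumed accelerator configuration
per_stack_gb = {"HBM3 8H": 16, "HBM3E 12H": 36}

for name, gb in per_stack_gb.items():
    print(f"{name}: {stacks_per_package * gb} GB per package")
# HBM3 8H:   96 GB per package
# HBM3E 12H: 216 GB per package -- over twice the room for model weights
# or KV cache, which is what lets operators serve more users per device
# or deploy fewer devices for a given workload.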

Samsung has already begun sampling the HBM3E 12H to customers, with mass production scheduled for the first half of 2024.

About Our Team

Our team comprises industry insiders with extensive experience in computers, semiconductors, games, and consumer electronics. With decades of collective experience, we’re committed to delivering timely, accurate, and engaging news content to our readers.

Background Information


About Samsung:

Samsung, a South Korean multinational conglomerate, has established itself as a global leader in various industries, including electronics, technology, and more. Founded in 1938, Samsung's influence spans from smartphones and consumer electronics to semiconductors and home appliances. With a commitment to innovation, Samsung has contributed products like the Galaxy series of smartphones, QLED TVs, and SSDs that have revolutionized the way we live and work.


Technology Explained


HBM3E: HBM3E is the latest generation of high-bandwidth memory (HBM), a type of stacked DRAM designed for data-intensive workloads such as artificial intelligence (AI). It offers higher data transfer rates, greater density, and better power efficiency than earlier HBM generations. HBM3E is being developed by the major memory makers, including Samsung, SK Hynix, and Micron, with mass production ramping in 2024. Current parts deliver per-stack bandwidth above 1.15 TB/s, and 12-high stacks reach 36 GB of capacity. HBM3E is suited to AI systems that process large amounts of data, such as deep learning training and inference and computer vision.
