Intel Granite Rapids-SP Boosts Cache Capacity, Outshining AMD’s Epyc Genoa CPUs: 480MB vs 384MB

January 21, 2024

Intel's upcoming 6th Gen Xeon Granite Rapids-SP processors will feature a significant boost in cache capacity and advanced architecture, positioning them as a strong competitor in the data center market and a key test of the company's roadmap execution under CEO Pat Gelsinger.

  • Significant boost in cache capacity: 1.5 times the L3 cache of its predecessor
  • Utilizes complex chiplet packaging, allowing for more efficient performance
  • Focus on meeting the demands of AI applications and data centers, broadening its appeal to AI-focused companies and startups

Intel is gearing up for the launch of its 6th Gen Xeon Granite Rapids-SP processors, set to hit the market in late 2024. The chipmaker has been busy releasing multiple generations of its data center processors throughout 2023, but the upcoming release of Granite Rapids-SP will be a true test of Intel’s roadmap execution under the leadership of CEO Pat Gelsinger.

One of the key improvements in the Granite Rapids-SP lineup is the significant boost in cache capacity. According to the latest version of Intel's Software Development Emulator (SDE), the L3 cache on these processors will be increased to 480MB, 1.5 times that of its predecessor, Emerald Rapids. This move aligns with Intel’s strategy to make its Xeon processors more suitable for generative AI applications that require large cache reserves as training models continue to grow in size.

In addition to the increased cache capacity, the Granite Rapids-SP processors will feature the Lion Cove P-core architecture and will be built on the Intel 3 process node. The chip will also utilize complex chiplet packaging. Leaked information suggests that the processors will have a core count of up to 56 and a stock memory speed of 6,400 MT/s. Early samples indicate a base clock speed between 1.2 and 1.5GHz and a boost clock speed of 2.6GHz.

The launch of Granite Rapids-SP is expected to take place in the second half of 2024, following the release of Sierra Forest. Sierra Forest will feature up to 144 Crestmont “E” cores and 108MB of L3 cache. Both Granite Rapids-SP and Sierra Forest will leverage the LGA4710 socket and have a TDP of 350W.

Intel’s focus on increasing cache capacity and improving core architectures demonstrates its commitment to meeting the demands of AI applications and data centers. By offering processors with larger cache reserves, Intel aims to make its Xeon lineup more appealing to AI-focused companies and startups.

As the launch of Granite Rapids-SP approaches, industry experts and enthusiasts eagerly await the performance benchmarks and real-world applications of these processors. It will be interesting to see how Intel’s latest offering compares to AMD’s EPYC Genoa CPUs, as cache capacity is one important factor in overall performance.

Overall, Intel’s Granite Rapids-SP processors are poised to make a significant impact in the data center market, with their increased cache capacity and advanced architecture. The late 2024 launch will be a crucial moment for Intel as it seeks to solidify its position in the competitive landscape of data center processors.


Background Information

About AMD: AMD, a major player in the semiconductor industry, is known for its powerful processors and graphics solutions. The company has consistently pushed the boundaries of performance, efficiency, and user experience. With a customer-centric approach, it has cultivated a reputation for delivering high-performance solutions that cater to the needs of gamers, professionals, and general users. AMD's Ryzen series of processors have redefined the landscape of desktop and laptop computing, offering impressive multi-core performance and competitive pricing that has challenged the dominance of its competitors. Complementing its processor expertise, AMD's Radeon graphics cards have also earned accolades for their efficiency and exceptional graphical capabilities, making them a favored choice among gamers and content creators. The company's commitment to innovation and technology continues to shape the client computing landscape, providing users with powerful tools to fuel their digital endeavors.

AMD website  AMD LinkedIn

About Intel: Intel Corporation, a global technology leader, is known for its semiconductor innovations that power computing and communication devices worldwide. As a pioneer in microprocessor technology, Intel has left an indelible mark on the evolution of computing with its processors that drive everything from PCs to data centers and beyond. With a history of advancements, Intel's relentless pursuit of innovation continues to shape the digital landscape, offering solutions that empower businesses and individuals to achieve new levels of productivity and connectivity.

Intel website  Intel LinkedIn

Technology Explained

chiplet: Chiplets are small, modular silicon dies that are combined inside a single package to build a complete processor. Rather than fabricating one large monolithic die, a manufacturer can mix and match chiplets, such as compute cores, memory controllers, and I/O, potentially built on different process nodes. This improves manufacturing yields and lowers cost while enabling more powerful and versatile designs. Chiplet-based processors are used in gaming PCs, high-end workstations, servers, and even supercomputers, and the approach is increasingly common in chips aimed at artificial intelligence and machine learning workloads.

EPYC: EPYC is AMD's processor family for the server and data center market, introduced in June 2017. The first generation was built on a 14nm process and offered up to 32 high-performance cores in a single socket; the current 4th Gen "Genoa" parts referenced in this article use a 5nm process and scale to 96 Zen 4 cores with 384MB of L3 cache per socket. EPYC is now widely used in the data center and cloud computing industry, providing benefits such as greater scalability, increased resource efficiency, high memory bandwidth, and advanced virtualization capabilities. Even in large multi-processor deployments, the platform is designed to balance power consumption and performance for maximum efficiency.

L3 cache: L3 cache is the last level of on-chip memory, used to store frequently accessed data and instructions. It is shared by a processor's cores and sits between the smaller per-core L1/L2 caches and main memory, reducing the time it takes for the processor to fetch data and thereby improving overall performance. L3 cache benefits many workloads, such as gaming, video editing, and web browsing, and is especially important for servers and other high-performance computing tasks, where a larger cache means fewer slow trips to main memory.

Xeon: The Intel Xeon processor is a powerful and reliable multi-core processor designed to handle many tasks simultaneously. Xeon chips are used in servers, workstations, and high-end desktop computers, as well as in embedded systems such as routers and switches. Known for high performance and scalability, the Xeon line is a popular choice for cloud computing, where it handles large amounts of data, and for scientific and engineering applications that involve complex calculations and simulations.
