NVIDIA GeForce RTX 5090: Unveiling 2.9 GHz Boost Clock, 1.5 TB/s Bandwidth, and 128MB L2 Cache


September 19, 2023 by our News Team

NVIDIA's upcoming GeForce RTX 5090 gaming GPU is rumored to offer a 50% increase in scale, a 52% increase in memory bandwidth, a 78% increase in cache, a 15% increase in frequency, and a 1.7x improvement in performance compared to the RTX 4090.

  • The RTX 5090 is rumored to offer a 50% increase in scale, a 52% increase in memory bandwidth, a 78% increase in cache, a 15% increase in frequency, and a 1.7x improvement in performance.
  • The RTX 50 series might feature GDDR7 technology.
  • A 15% increase in frequency would translate to a 2.9 GHz boost clock.


GeForce RTX 5090 rumors are gaining traction as details about NVIDIA's next-gen consumer GPU lineup continue to emerge. The initial leaks came from a reliable source known as Panzerlied on Chiphell, who revealed that NVIDIA would be skipping the XXX04-class GPU in its upcoming gaming product series. This information was later corroborated by Kopite7kimi, a respected NVIDIA insider.

Now, Kopite7kimi has shared more insights into the Blackwell series, which is the codename for NVIDIA’s next-generation GPU lineup. It appears that Blackwell will cover both data-center and gaming series, with separate naming schemes for each. The high-performance computing (HPC) GPUs will be designated as GB1XX, while the gaming GPUs will fall under GB2XX.

Panzerlied has also provided some details about the improvements we can expect from the next-generation NVIDIA lineup. Instead of specific numerical values, Panzerlied has shared percentage increases across various aspects of the Blackwell family.

For instance, the rumored NVIDIA RTX 5090 is said to offer a 50% increase in scale (presumably cores), a 52% increase in memory bandwidth, a 78% increase in cache (presumably L2 cache), a 15% increase in frequency (presumably GPU boost), and a 1.7x improvement in performance.

It’s important to note that these claims are in reference to the RTX 4090’s specs, not the full AD102 die. If we consider that the RTX 4090’s 21 Gbps memory could see an upgrade to 32 Gbps (a 52.4% increase), it suggests that the RTX 50 series might feature GDDR7 technology. However, it’s unlikely that NVIDIA would adopt the fastest GDDR7 memory right from the start, so other configurations like 512-bit/24 Gbps or 448-bit/28 Gbps could also be possibilities.
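As a quick sanity check, these candidate memory configurations can be compared with a few lines of Python. The bus widths and per-pin speeds are the rumored values from the leak, not confirmed specs; the RTX 4090 baseline of a 384-bit bus at 21 Gbps is the card's public specification.

```python
def bandwidth_gb_s(bus_width_bits: int, speed_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin speed (Gbps) / 8."""
    return bus_width_bits * speed_gbps / 8

# RTX 4090 baseline: 384-bit bus at 21 Gbps -> 1008 GB/s
baseline = bandwidth_gb_s(384, 21)

# Candidate RTX 50-series configurations mentioned above (rumored, not confirmed)
for bits, gbps in [(384, 32), (512, 24), (448, 28)]:
    bw = bandwidth_gb_s(bits, gbps)
    print(f"{bits}-bit @ {gbps} Gbps -> {bw:.0f} GB/s (+{bw / baseline - 1:.1%})")
```

All three configurations land at roughly 1.5 TB/s (a ~52–56% uplift over the RTX 4090's 1008 GB/s), which is where the headline bandwidth figure comes from.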

Assuming the other claims are also based on the RTX 4090 as a reference point, a 15% increase in frequency would translate to a 2.9 GHz boost clock, with actual workloads potentially achieving even higher clocks. Additionally, a 78% increase in cache suggests that the GB202 GPU would feature 128MB of L2 cache.
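The same assumption can be applied across the board in a small sketch. The baseline figures below are NVIDIA's published RTX 4090 specs; the uplift factors are the rumored percentages from the leak, so the results are estimates, not confirmed numbers.

```python
# RTX 4090 baseline specs (public figures)
rtx_4090 = {
    "cuda_cores": 16384,      # "scale"
    "boost_clock_mhz": 2520,
    "l2_cache_mb": 72,
    "bandwidth_gb_s": 1008,
}

# Rumored uplift factors from the leak (unconfirmed)
uplift = {
    "cuda_cores": 1.50,
    "boost_clock_mhz": 1.15,
    "l2_cache_mb": 1.78,
    "bandwidth_gb_s": 1.52,
}

# Apply each factor to its baseline and round to whole units
rtx_5090_estimate = {k: round(rtx_4090[k] * uplift[k]) for k in rtx_4090}
print(rtx_5090_estimate)
# boost clock: 2520 * 1.15 = 2898 MHz (~2.9 GHz)
# L2 cache:    72 * 1.78   = 128.16  (~128 MB)
```

The arithmetic is how the headline 2.9 GHz and 128MB figures are derived: neither appears in the leak directly, both fall out of applying the rumored percentages to the RTX 4090.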

While we are still more than a year away from the potential launch of the next-gen GeForce series, it’s interesting to speculate on what a potential RTX 5090 GPU might look like based on these rumored specifications. However, it’s important to treat these details as rumors until official announcements are made.

Source: Chiphell

About Our Team

Our team comprises industry insiders with extensive experience in computers, semiconductors, games, and consumer electronics. With decades of collective experience, we’re committed to delivering timely, accurate, and engaging news content to our readers.

Background Information


About NVIDIA: NVIDIA has firmly established itself as a leader in the realm of client computing, continuously pushing the boundaries of innovation in graphics and AI technologies. With a deep commitment to enhancing user experiences, NVIDIA's client computing business focuses on delivering solutions that power everything from gaming and creative workloads to enterprise applications. Renowned for its GeForce graphics cards, the company has redefined high-performance gaming, setting industry standards for realistic visuals, fluid frame rates, and immersive experiences. Complementing its gaming expertise, NVIDIA's Quadro and NVIDIA RTX graphics cards cater to professionals in design, content creation, and scientific fields, enabling real-time ray tracing and AI-driven workflows that elevate productivity and creativity. By integrating graphics, AI, and software, NVIDIA continues to shape the landscape of client computing, fostering innovation and immersive interactions in a rapidly evolving digital world.

NVIDIA website  NVIDIA LinkedIn

Technology Explained


GDDR7: GDDR7 (Graphics Double Data Rate 7) is the seventh generation of graphics double data rate (GDDR) memory, a type of dynamic random-access memory (DRAM) designed specifically for graphics cards. Compared with previous generations, GDDR7 offers faster per-pin speeds of up to 32 gigabits per second (Gbps), lower power consumption, and improved error correction, making it well suited to high-performance graphics cards and other applications that require high bandwidth and low latency.


GeForce: GeForce is a line of graphics processing units (GPUs) developed by NVIDIA and one of the most widely used GPU brands in the computer industry today. GeForce GPUs power gaming PCs, workstations, and high-end laptops, and are also used in virtual reality systems, artificial intelligence, and deep learning applications. Designed to deliver high performance and power efficiency, they are well suited to gaming and other demanding workloads, rendering high-resolution graphics with smooth, realistic visuals.


GPU: GPU stands for Graphics Processing Unit, a specialized processor designed to handle graphics-intensive tasks such as rendering images, videos, and 3D graphics. GPUs are used in gaming consoles, PCs, and mobile devices to provide smooth, immersive visuals; in the medical field to create 3D models of organs and tissues; and in the automotive industry to build virtual prototypes of cars. They are also increasingly important in artificial intelligence, where their ability to process large amounts of data quickly and efficiently makes them well suited to training and running complex models.




