This is our first look at Nvidia's GeForce 40 series, starting with the flagship GPU, the GeForce RTX 4090. Today we'll find out all you need to know about this next-generation graphics card – most importantly, its gaming performance.

The new GeForce RTX 4090 is built on the AD102 silicon which measures 608.4 mm2, about 3% smaller than the GA102 used by the RTX 3080 and 3090. Nvidia has moved from the Samsung 8N process with Ampere to TSMC's 4N process for Ada Lovelace. Quite incredibly, this has seen the transistor count increase by 170%, from 28.3 billion to an insane 76.3 billion.

When compared to the RTX 3090 Ti, there are 52% more streaming multiprocessors, CUDA cores, Tensor cores, RT cores and texture units. The ROP count has increased by 57% and the boost clock has been wound up by 35%. The same 21 Gbps GDDR6X memory is in use and there's still 24GB of it on a 384-bit wide memory bus, resulting in the same 1008 GB/s of memory bandwidth. The GPU also keeps the same PCI Express 4.0 x16 interface.
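For reference, here's a minimal sketch of where those generational percentages come from, using the publicly listed specifications for both cards (treat the figures below as reference values rather than our own measurements):

```python
# Reference specs: (RTX 3090 Ti, RTX 4090), as publicly listed
specs = {
    "SMs":               (84, 128),
    "CUDA cores":        (10752, 16384),
    "ROPs":              (112, 176),
    "Boost clock (MHz)": (1860, 2520),
}

for name, (old, new) in specs.items():
    print(f"{name}: +{(new / old - 1) * 100:.0f}%")

# Memory bandwidth: 21 Gbps per pin across a 384-bit bus, divided by 8 bits per byte
print(f"Bandwidth: {21 * 384 / 8:.0f} GB/s")  # -> 1008 GB/s
```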

Nvidia claims a total graphics power rating of 450 watts for the RTX 4090, the same rating given to the 3090 Ti, though the maximum GPU temperature has been slightly lowered from 93C to 90C. The minimum power supply requirement is 850 watts, which is what we'll be using for all of our testing.

In terms of design, the Founders Edition 4090 looks similar to the 3090 with some fairly significant changes. The most noticeable is the graphics card's width. On paper, Nvidia claims a triple-slot form factor for both models which is accurate, but where the 3090 measures 52mm wide, the new RTX 4090 is 17% wider at 61mm. Despite the size increase, both models weigh about the same at 2190 grams.

That's surprising, but what's not all that surprising is the absence of the NVLink connector, which Nvidia has now killed off in favor of the PCIe 4.0 bus. The only other major change is the 16-pin power input, which has been upgraded to the PCIe 5.0 spec, otherwise known as the 12VHPWR power connector.

A single PCIe 5.0 power connector can deliver up to 600 watts, whereas previously that would require four 8-pin power connectors. You won't require a new PCIe 5.0 compliant PSU though as the RTX 4090 comes with a 4x 8-pin to single 16-pin adapter, similar to the 3x 8-pin to 16-pin adapter supplied with 3090 Tis.
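As a quick back-of-the-envelope check (assuming the standard 150-watt rating for an 8-pin PCIe connector), the math works out neatly:

```python
# Each PCIe 8-pin connector is rated for 150 W, so 600 W previously
# required four of them; the single 12VHPWR plug covers that on its own.
EIGHT_PIN_WATTS = 150
TWELVE_VHPWR_WATTS = 600

print(TWELVE_VHPWR_WATTS / EIGHT_PIN_WATTS)  # -> 4.0 connectors replaced
```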

We should note that it was falsely reported that the 12VHPWR power connector could only survive 30 cycles (30 connect and disconnects), but that's not the case and the connector longevity will be similar to that of the 8-pin connectors, which is to say you're never going to wear one out.

Besides the increase in cores, inclusion of 4th-gen tensor cores and 3rd-gen RT cores, GeForce 40 also introduces DLSS 3, a feature that for now is exclusive to the GeForce 40 series.

This new upscaling/frame rate multiplying technology is exciting and while we'll briefly show some results in this review, our full analysis of DLSS 3 is coming up soon. DLSS 3 requires a significant amount of testing and analysis which is far beyond the scope of a day-one review.

For testing, all GPUs were set to run at the official clock specifications (no factory overclocking). The CPU powering the test system is the Ryzen 7 5800X3D with 32GB of dual-rank, dual-channel DDR4-3200 CL14 memory on the MSI MPG X570S Carbon Max WiFi motherboard.

Benchmarks

Starting with Watch Dogs: Legion at 1440p, the GeForce RTX 4090 doesn't look that impressive here (...wait for it). Sure, it's the fastest GPU we've ever seen but a 9% boost over the 6950 XT is hardly awe-inspiring.

We saw a more substantial 22% uplift over the RTX 3090 Ti, but given today's GPU prices that's not amazing. Having said that, the issue appears to be less with the RTX 4090 and more with the 5800X3D which has become the primary system bottleneck...

Therefore when increasing the resolution to 4K we get to see how brutally fast the RTX 4090 really is. We're now looking at a monstrous 60% boost over the 6950 XT and a 64% uplift from the RTX 3090 Ti, that's unbelievable and a truly high refresh rate experience at 4K.

It appears as though we have a lot more CPU headroom in Rainbow Six Extraction at 1440p and so the RTX 4090 is 59% faster than the RTX 3090 Ti and 71% faster than the RTX 3090 and 6950 XT. This is a serious performance uplift.

Jumping up to 4K didn't change the results radically as we weren't CPU limited at 1440p. The RTX 4090 was once again ~60% faster than the 3090 Ti, though the margin against the Radeon 6950 XT kept growing and the RTX 4090 is now 102% faster. That's brutal to say the least.

Far Cry 6 tends to like Radeon GPUs and at 1440p we're heavily CPU limited, so the RTX 4090 was only able to pull ahead of the 6950 XT by a 9% margin, rendering 187 fps on average, though the 1% lows were boosted by 13%.

Jumping up to 4K unleashes the RTX 4090, resulting in massive gains and averaging an impressive 164 fps. That's a 34% uplift from the 6950 XT, very much a commanding lead.

The jump up from the RTX 3090 Ti was more sizable as we're looking at a 50% boost and almost 60% over the standard 3090.

Assassin's Creed Valhalla is another title that works well with Radeon and the excellent support for Resizable BAR certainly helps the red team. As a result, at 1440p the RTX 4090 was 27% faster than the 6950 XT which is a decent gain but far less stunning than other margins we've seen so far.

It was also just a 37% improvement from the 3090 Ti and not the typical 60% margin we see at 4K.

Speaking of which, at 4K the RTX 4090 rendered 116 fps on average making it 33% faster than the 6950 XT and 38% faster than the 3090 Ti.

Hunt: Showdown fans going after ultimate performance will love what the RTX 4090 has to offer, serving up over 300 fps at 1440p with 1% lows of over 260 fps.

That's a massive 62% increase from the 3090 Ti, 67% from the standard 3090 and a 71% increase over AMD's 6950 XT. The RTX 4090 is super fast again, even at 1440p where we've often run into CPU limitations with the 5800X3D.

Unexpectedly, the margins shrink at 4K and the RTX 4090 is 50% faster than the 3090 Ti and 62% faster than the 3090. The margin to the 6950 XT remained the same at 70%. Those are big margins and it means that those wanting to play Hunt Showdown at 4K will receive a high refresh rate experience without reservation with the RTX 4090.

The Outer Worlds saw a 55% performance increase for the RTX 4090 over the 3090 Ti at 1440p and a mega 71% increase over the 6950 XT. Not sure you need 268 fps in The Outer Worlds, but it's now possible using the highest quality preset at 1440p.

The 4K results are just as impressive, even if the gap to the 3090 Ti has shrunk slightly to 51%. That's because we're now able to receive a truly high refresh rate experience at 4K when using the RTX 4090, whereas most other GPUs struggle to hit 100 fps.

Hitman 3 results at 1440p are clearly CPU limited; we often use this title for CPU testing as it is very demanding on the processor. When compared to the RTX 3090 Ti we're looking at a small 7% uplift, so this will be a good configuration for testing CPUs such as Zen 4 and Intel's Raptor Lake.

Moving to 4K shows the expected boost as the RTX 4090 pulls ahead of the 3090 Ti by a massive 60% margin, while pulling a similar margin on the 6950 XT.

Moving on to Horizon Zero Dawn, we find another game where the CPU becomes a bottleneck at 1440p and we're using the highest visual quality preset. About 212 fps is the limit of the 5800X3D when running the new RTX 4090, which is a 20% increase from the 6950 XT and 31% from the 3090 Ti.

4K is required to better demonstrate the power of the RTX 4090 and here the new GeForce GPU was good for 157 fps on average, or 54% faster than the 3090 Ti and 67% over the 6950 XT.

We haven't upgraded our GPU data to F1 22, so for now let's stick with F1 2021 numbers which are based on the maximum quality preset with the default level of ray tracing enabled. Here the GeForce RTX 4090 pumped out 245 fps making it almost 60% faster than the 3090 Ti and 76% faster than the 6950 XT. A mega result at 1440p.

Jumping up to 4K extended the margin as the GeForce RTX 4090 delivered 71% more frames than the 3090 Ti, and an insane 104% more than the 6950 XT. AMD's weak RT performance will be letting down the Radeon GPU no doubt.

We know that Cyberpunk 2077 is both CPU and GPU demanding, but with the RTX 4090 installed we're almost certainly CPU bound at 1440p. Here the RTX 4090 was good for 145 fps on average, making it 33% faster than the 6950 XT and 36% faster than the 3090 Ti.

The margins seen at 4K are not as extreme as we were expecting, but a 51% improvement over the 3090 Ti is nothing to scoff at, especially given the RTX 4090 pushed well past 60 fps while all other GPUs fell short of that target.

In Dying Light 2, the RTX 4090 was 51% faster than the 6950 XT at 1440p and a massive 68% faster than the 3090 Ti. Another incredible generational jump there. And it's much the same story at 4K, 49% faster than the 6950 XT and 58% faster than the 3090 Ti.

Halo Infinite shows fairly typical gains at 1440p: 52% faster than the RTX 3090 Ti and 60% faster than the 6950 XT.

At 4K those margins are extended to 72% faster than the 3090 Ti, and 86% faster than the 6950 XT, which is an extraordinary performance leap over previous generation flagships.

In Shadow of the Tomb Raider, our Zen 4 CPU testing found the game was GPU limited at 1080p with the RTX 3090 Ti. Moving to the RTX 4090, that's no longer an issue as it cranked out an insane 237 fps at 1440p using the highest quality preset.

That's a 55% boost over the 6950 XT and a 59% increase over the 3090 Ti.

Speaking of jaw dropping results, at 4K the RTX 4090 is an incredible 70% faster than the RTX 3090 Ti and 84% faster than the 6950 XT. Pumping out 160 fps at 4K is pretty amazing, more so when you consider previous-gen GPUs couldn't even reach 100 fps.

Performance Summary: 13 Game Average

Here's a look at the 13 game average which is calculated using the geomean. The GeForce RTX 4090 pumped out 219 fps making it on average 44% faster than the 6950 XT and 45% faster than the 3090 Ti.
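For those curious how that average is calculated, here's a minimal sketch using Python's built-in geometric mean. The per-game fps values and the competitor average below are illustrative placeholders, not our full 13-game dataset:

```python
from statistics import geometric_mean

# Illustrative per-game averages for the RTX 4090 at 1440p
# (a handful of placeholders, not the full 13-game dataset)
fps = {
    "Watch Dogs: Legion": 187,
    "Cyberpunk 2077": 145,
    "Horizon Zero Dawn": 212,
    "The Outer Worlds": 268,
    "Shadow of the Tomb Raider": 237,
}

average = geometric_mean(fps.values())
print(f"Geomean average: {average:.0f} fps")

# Margin over another GPU's geomean (hypothetical 152 fps average)
print(f"Uplift: {(average / 152 - 1) * 100:.0f}%")
```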

Those are huge margins but still under-represent the RTX 4090 as the 5800X3D (a pretty fast gaming CPU) was limiting performance on multiple occasions.

It's crazy to think that one of the fastest gaming CPUs out there can be a serious bottleneck for the RTX 4090 at 1440p, and in many cases we're using the highest visual settings for testing.

We may finally be reaching the point where ray tracing becomes truly viable, and we'll look at that in a moment. Before we do, here's the 4K data...

The GeForce RTX 4090 is on average 59% faster than the RTX 3090 Ti and 71% faster than the 6950 XT at 4K gaming. Those are massive margins. Just as impressive is the fact that the RTX 4090 averaged 145 fps at 4K. This is truly the first 4K GPU capable of delivering a high refresh rate gaming experience. Very impressive stuff by Nvidia.

We haven't touched on 1080p results for the 13 games tested, though we did gather all that data even if the results were often heavily CPU limited. Overall, the RTX 4090 was 28% faster than the 3090 Ti at 1080p using the Ryzen 7 5800X3D and 24% faster than the 6950 XT, so it's still faster and if anything, a great tool for testing next-gen CPU performance.

Ray Tracing Performance + DLSS

For ray tracing and upscaling results we'll start with F1 22. At 1440p, the RTX 4090 was just 11% faster than the 3090 Ti and 24% faster than the 6950 XT, when comparing standard rasterization performance (ultra high quality preset but with RT effects disabled).

With RT + DLSS enabled the RTX 4090 was 17% faster than the 3090 Ti and 29% faster than the 6950 XT which was using FSR for upscaling.

Then with just RT enabled with no upscaling involved, the RTX 4090 was 78% faster than the 3090 Ti and 167% faster than the 6950 XT. In short, while the 6950 XT was able to deliver playable performance with more than 60 fps, the RTX 4090 provided a true high refresh rate experience.

The RTX 4090 also affords you the ability to enjoy max visual quality at 4K with RT enabled. Comparing the standard rasterized results, the 4090 was 61% faster than the 3090 Ti, and then 86% faster with RT and DLSS enabled, and a breathtaking 91% faster with just RT and no DLSS.

Going from 54 fps with the 3090 Ti to 103 fps for the RTX 4090 is amazing. That's also over 3x the performance you'll get with the Radeon 6950 XT.

In Watch Dogs: Legion we see a fairly small margin between the 6950 XT and RTX 4090 at 1440p due to a CPU bottleneck. This game doesn't support FSR and therefore the 6950 XT lacks upscaling support, so we've just tested two RT modes.

Comparing the RTX 3090 Ti and RTX 4090 sees no difference in performance using DLSS with RT enabled; with DLSS, performance appears capped at 109 fps in this title. Interestingly, the RTX 4090 doesn't even require DLSS as it produced the same performance with ultra quality ray traced effects enabled, making it 33% faster than the 3090 Ti and 91% faster than the 6950 XT.

As expected, we get a better picture of how powerful the RTX 4090 is at 4K. This time using DLSS with ray tracing saw a 43% boost for the 4090 over the 3090 Ti, and then with upscaling disabled the 4090 was 68% faster, and 155% faster than the 6950 XT.

Playing Marvel's Guardians of the Galaxy at 1440p we find another game where DLSS does very little for the already mighty impressive RTX 4090. As a result, when comparing the ray tracing ultra quality + DLSS results between the 3090 Ti and 4090, the new Ada Lovelace GPU was just 13% faster. However, if we ditch the upscaling we see that the 4090 is 47% faster.

At 4K we see bigger margins and this time DLSS is of benefit for the RTX 4090, pushing it 54% ahead of the 3090 Ti and 131% ahead of the 6950 XT using FSR. With no upscaling but RT ultra enabled, the RTX 4090 is 58% faster than the 3090 Ti and 88% faster than the Radeon 6950 XT.

Ray tracing performance in Dying Light 2 is quite brutal, especially without the aid of DLSS. For example, at 1440p the 3090 Ti is only good for 52 fps on average, though that can be bumped up to 86 fps using the DLSS quality mode. Still that means when using DLSS with RT enabled the 4090 was 98% faster than the 3090 Ti and a huge 117% faster without the aid of upscaling.

The 4K data is just as impressive. We're not sure why, but the 3090 Ti does poorly in Dying Light 2 when using the 'Ray Tracing Quality' preset with the RT settings turned off, so in what should be standard rasterization performance under DX12 the RTX 4090 delivers 122% greater performance – again, we're not sure why that is. Even with DLSS enabled, the RTX 4090 was 110% faster, and then 115% faster with upscaling disabled.

In Cyberpunk 2077 at 1440p, the RTX 4090 with RT effects disabled was 36% faster than the 3090 Ti. That margin remains with RT Ultra + DLSS quality enabled, but blows out to a massive 69% margin when using RT without DLSS.

Then at 4K the RTX 4090 crushes the RTX 3090 Ti delivering 72% greater performance with RT and DLSS enabled and 80% with just ray tracing enabled, though we're only looking at 45 fps on average.

DLSS 3.0 on Cyberpunk

Cyberpunk is one of the first games to support DLSS 3, and here's a look at that performance. Please note we are currently preparing a detailed analysis of DLSS 3, which we'll feature on TechSpot very soon.

At 1440p using the high quality preset, the RTX 4090 was good for 145 fps on average in Cyberpunk 2077, making it ~35% faster than both the 3090 Ti and 6950 XT. However, with DLSS 3 enabled, performance was boosted by 90% to 276 fps. With limited testing involved, the game felt smooth, but of course we'll see what we can dig up in our later testing.

The real advantage of DLSS 3.0 can be seen with ultra quality ray tracing enabled as here the RTX 4090 was good for 191 fps on average, making it almost 70% faster than DLSS 2. Then at 4K, the GeForce RTX 4090 pumped out 113 fps using DLSS 3 with ultra RT effects, a 43% boost over DLSS 2.

Power Consumption

We measured total system power consumption when playing Halo at 1440p and as you can see the RTX 4090 actually isn't that bad. In fact, it's very good, delivering 50% more performance than the RTX 3090 Ti while using less power. Both the RTX 4090 and 3090 Ti are 450W graphics cards, so technically they should consume about the same amount of power.

The likely reason the RTX 3090 Ti pushes total system usage about 50 watts higher is that there's no Founders Edition of the 3090 Ti; instead we're using an MSI card which runs an above-spec voltage. Although we've clocked it down to the official Nvidia spec, which shaves a few percent off performance, the higher voltage means power consumption is still higher than it would be for a base model.

The key takeaway here is that the RTX 4090 is a 450W product, so power consumption isn't anything we haven't seen before from flagship graphics cards.

Here's a look at just how power efficient the RTX 4090 is: by locking the frame rate at 90 fps we can see how much power each GPU uses.

At this frame rate, the RTX 4090 consumed just 215 watts, and that means for the same level of performance the 3090 Ti required 93% more power and the 6950 XT 40% more power. So despite all the talk of the RTX 4090 being an out of control beast, it's actually very impressive when it comes to efficiency.
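Here's a rough sketch of that frame-capped comparison. The 215-watt figure for the RTX 4090 is from our chart, while the other two wattages are illustrative values consistent with the relative margins we measured:

```python
# Power draw at a locked 90 fps. The RTX 4090 reading is from our chart;
# the other two are illustrative values consistent with the margins above.
power_at_90fps = {
    "GeForce RTX 4090": 215,
    "GeForce RTX 3090 Ti": 415,  # illustrative
    "Radeon RX 6950 XT": 300,    # illustrative
}

baseline = power_at_90fps["GeForce RTX 4090"]
for gpu, watts in power_at_90fps.items():
    print(f"{gpu}: {watts} W ({(watts / baseline - 1) * 100:+.0f}% vs RTX 4090)")
```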

Cooling

Now when it comes to cooling, the Founders Edition RTX 4090 peaked at 83C for the hot spot after an hour of load in a 21C room installed inside an ATX case with the doors closed. The average GPU temperature peaked at 72C and the memory temperature peaked at 84C, so all those temperatures are acceptable, especially given the low operating volume.

With a fan speed of just 1600 RPM, the operating volume was just 42 dBA, which is quieter than most high-end and even mid-range graphics cards we've tested in the past. During this test, the core clock speed held pretty steady at 2730 MHz, of course, the memory ran at 21 Gbps, and finally the GPU power draw averaged 415 watts.

Overclocking

We did try some overclocking and were able to push the 4090 Founders Edition cores up to 2895 MHz, for a fairly typical 6% boost, and the memory hit 24.5 Gbps. This increased the GPU power to 470 watts on average which is a 13% increase, though the fan speed didn't change much, but the hotspot temperature did climb to 88C (a 5C increase) and the memory also jumped up by 4C.

Performance when overclocking resulted in a fairly mild 7% boost in Hitman, taking the RTX 4090 from 182 fps up to 194 fps. It's a similar story in Watch Dogs Legion, where the overclocked 4090 was 6% faster going from 141 fps to 150 fps and we also saw a 6% boost in Cyberpunk 2077.

Nothing crazy to report here, and as you'd expect from a product like the RTX 4090, it's already pushed pretty hard out of the box.

Cost per Frame (MSRP)

For our cost per frame analysis, we'll start with the 1440p data using the suggested pricing for each GPU. Do note some of this performance is skewed due to CPU bottlenecking at this mainstream gaming resolution. Using the $1,600 MSRP, the RTX 4090 actually looks quite good for such an expensive piece of hardware...

In terms of cost per frame, it's roughly on par with the Radeon 6950 XT and 6900 XT, but over 30% more costly than what we believed to be the sensible options for previous-gen high-end GPUs in the RTX 3080 and 6800 XT – actually it's ~50% more expensive than the 6800 XT.
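As a simple sketch, the cost per frame metric is just MSRP divided by average fps. The prices below are launch MSRPs, and apart from the RTX 4090's 219 fps (our 1440p 13-game average), the fps figures are placeholders for the sake of illustration:

```python
# Cost per frame = MSRP / average fps. MSRPs are launch prices; the fps
# figures for the older cards are placeholders, not our exact averages.
gpus = {
    "GeForce RTX 4090": (1600, 219),   # 219 fps = our 1440p 13-game average
    "Radeon RX 6950 XT": (1099, 152),  # illustrative fps
    "GeForce RTX 3080": (699, 130),    # illustrative fps
}

for name, (msrp, fps) in gpus.items():
    print(f"{name}: ${msrp / fps:.2f} per frame")
```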

The 4K cost per frame is more favorable and here the RTX 4090 appears far better value than the Radeon 6950 XT and 6900 XT, offering ~15% greater value. It remains 16% more costly per frame when compared to the RTX 3080 and 21% more costly than the 6800 XT.

Considering the card does have 24 GB of VRAM, when compared to the RTX 3090 we're looking at significantly better value with a saving of 38% per frame.

Cost per Frame (Retail)

Now looking beyond MSRPs and focusing on current retail pricing (Newegg listings), the RTX 4090 is less attractive. Using the 1440p performance data we see that it's slightly worse than the 3090 Ti in terms of value, making it the most expensive GPU on the market in terms of cost per frame and overall price...

The 4K data better represents the GeForce RTX 4090, but even here we see that at $1,600 it's slightly more costly than the 6950 XT, though slightly better than the heavily discounted RTX 3090, so that's pretty good for a premium class product.

You're looking at a 21% premium in terms of price to performance over the RTX 3080, which isn't great, but we're comparing a brand new product to a two-year-old GPU that's now selling below MSRP. So while it's a valid comparison for those of you looking to buy right now, older products are almost always going to represent better value.

What We Learned

The GeForce RTX 4090 is very fast, brutally fast. The key concerns people seemed to have with the RTX 4090 were pricing and power consumption. It's clear now though that the rumors of insane power consumption were wrong; the 4090 isn't absurd in that sense, using less power than the 3090 Ti in our testing.

It's still a very hungry GPU, but it's nothing we haven't seen before from flagship products, and given the exceptionally high performance, the performance per watt you're getting with the RTX 4090 is incredibly good. We saw that when we capped frame rates: the RTX 4090 consumed almost 50% less power than the 3090 Ti and almost 30% less than the 6950 XT when gaming.


That leaves the issue of pricing, and that's not an easy one to get around. Based on the 4K data using MSRPs, the RTX 4090 is 16% more costly per frame than the RTX 3080, which isn't far off the margin seen in a previous generation when comparing the RTX 3090 and 2080. That means the new RTX 4090 is similar in terms of value to the 3090, which wasn't a great value, but when looking strictly at ultra high-end GPU pricing, the RTX 4090 isn't terrible.

For people with deep pockets who buy premium-priced GPUs, that's a win. Of course, what made Ampere great in our eyes was the GeForce RTX 3080, but the crypto boom spoiled all that fun. So now the hope is the GeForce RTX 4080 represents better value at a more accessible price point for most gamers, and that's something we'll explore at a later date.

Until more RTX 40 Ada Lovelace GPUs arrive – and of course, competing AMD RDNA3 GPUs – it's hard to say just how good, bad or dumb the value of the RTX 4090 is for this new generation.

Moving past cost per frame or any kind of value analysis to look at the RTX 4090 for what it is – a brutally fast GPU – how excited should extreme gamers be?

We're honestly very excited about the RTX 4090. It's the first time we've enjoyed true high refresh rate gaming at 4K without having to compromise on visuals. Ray tracing is finally a carefree option, and while DLSS is still important, enjoying games like Dying Light 2, Cyberpunk 2077 and Watch Dogs Legion at 4K with ultra quality ray tracing, while frame rates stay well above 60 fps, is a special experience.

DLSS 3 is an exciting new feature but we'll need to delve more into it before we can comment on that further and how much of a selling point it is.

Bottom line, the gaming experience with the GeForce RTX 4090 Founders Edition is breathtaking, and surprisingly practical as well. We were able to run it just fine with our old Corsair RM850x 850W PSU using the included adapter. The FE model running under full load for extended periods of time is very quiet, no louder than a quality mid-range graphics card, and as we've said, power consumption wasn't outrageous.

That's not to say the RTX 4090 is for everyone. It cannot be at $1,600, but we won't discourage those who plan on buying an RTX 4090 from doing so. There's clearly a market for these extreme high-end GPUs, and if enough people are willing to part with that kind of money for the RTX 4090, then this segment will continue to exist. By the way, based on what we're hearing you shouldn't have too much trouble buying a 4090 should you decide to go that way; stock is apparently plentiful.

For everybody else (and me personally), gaming can be just as enjoyable with a GPU that costs 2x or 4x less, and if you're in that camp, you should wait to see what kind of performance is on offer from the GeForce RTX 4080 and, of course, AMD's upcoming RDNA 3 series.

Further Testing

Since we published this day-one review of the GeForce RTX 4090, we have run additional benchmarks and comparisons you may be interested in: