The war between the two titans of the industry has just been re-kindled. The AMD vs Nvidia debate might’ve been a one-sided affair for a while, but now one of them is catching up and is ready to make a claim for the throne.
This GPU war goes back to the 1990s. Although AMD has a much longer history in tech, Nvidia has recently dominated the field and is financially in a much better spot, being worth around twice as much. However, AMD devotes a great part of their resources to their CPUs and this shouldn’t be overlooked either.
However, history doesn’t weigh that much in the world of technology: no one really cares that AMD has been around since 1969. Today, it’s simply a question of Nvidia versus AMD. Scroll down to find the best choice for you.
AMD vs Nvidia: Performance
If you’re thinking about getting a new GPU, you’re probably wondering about the potential performance of each card. Hitting 60 FPS seems like the bare minimum in today’s gaming world, and a good GPU is the key to getting there.
Nonetheless, building a new PC and getting the best possible in-game performance isn’t just about getting the best GPU. The CPU and RAM also need to be on par with the GPU to avoid bottlenecking.
There are three general GPU classifications and each represents a valid part of the market. These are low-end or budget, mid-tier or mid-range, and high-end. Seeing how each of these categories is beneficial in different ways, it’s only fair to compare AMD and Nvidia for each.
Budget

For this category, we can look at the RX 5500 XT and GTX 1660, as they’re probably the best budget cards AMD and Nvidia have to offer in the $200 price range. They’re both good representations of their respective manufacturers’ current architectures (AMD’s RDNA and Nvidia’s Turing) and actually stack up quite well.
While the RX 5500 XT offers a higher base clock at 1685MHz compared to the GTX 1660’s 1530MHz, Nvidia counters with a higher boost clock of 1785MHz, above the RX 5500 XT’s game clock of 1737MHz. Although this may look irrelevant, it’s interesting to see how this competition has reached even the smallest details.
AMD further showcases its capabilities with 8GB of GDDR6 VRAM, which is categorically better than Nvidia’s 6GB of GDDR5. The RX 5500 XT also holds firm with higher memory bandwidth and more L2 cache, but as you might’ve assumed already, the GTX 1660 draws less power.
However, hardware is nothing without software, and in this regard Nvidia reigns supreme. Despite the spec sheet favoring AMD, the GTX 1660 ultimately performs better in games.
As gamers are not as interested in power consumption as in performance, Nvidia is the winner in this category.
Mid-Range

For this category, we’ll look at AMD’s RX 5600 XT and Nvidia’s RTX 2060, as they are two great choices and fair representations of each company’s foray into the mid-range.
This section will soon be updated as we await the releases of RTX 3070 and AMD’s equivalent.
RX 5600 XT gave people a good reason to be excited about AMD’s return to the scene. Meanwhile, RTX 2060 gives the impression that Nvidia might’ve gotten a little careless about the quality of their production.
Their specifications are nearly identical. AMD offers a slightly higher boost clock, but as they’re so equally matched, it wouldn’t be fair to use that as a definitive marker of better performance.
The reason AMD seems to win here is its business tactic of undercutting competitors’ prices. While these two cards perform similarly, AMD has positioned itself as the cheaper option. This disparity between quality and price left Nvidia scratching their heads over how to respond.
One thing AMD has struggled with is driver issues; there have been reports of black screens and of games outright crashing. However, as those issues can be fixed through software updates rather than a product recall, they don’t undo AMD’s win here.
High-End

Here’s where things get a little tricky. As AMD doesn’t really offer high-end products, it’s a clear win for Nvidia. However, this is exactly the point where the GPU wars are expected to re-ignite. With AMD preparing to launch their RDNA 2 graphics cards before the end of 2020, Nvidia appears to finally have a challenger for their RTX 2080 Ti.
For the sake of perspective, let’s compare AMD’s current best to the RTX 3090. It’s not a fair fight: Nvidia’s representative easily outperforms AMD’s RX 5700 XT or even the Radeon VII, whichever you consider the better opponent.
As the world continuously evolves, so does technology. AMD has already announced their competition for Nvidia’s best products. And although Nvidia has also promised even better GPUs, this particular area of the competition will be a curious one to keep an eye out for.
Total score: AMD 1 – Nvidia 2
AMD vs Nvidia: Features
While features may seem less significant than actual specs, they are an important part of what makes a good GPU. Both AMD and Nvidia offer similar GPUs in terms of hardware and price, but the devil is in the details; in this case, in the features.
Ray Tracing

If one feature has to be talked about, it’s this. Although ray tracing isn’t a requirement for GPU performance, it makes a clear difference, offering better and more realistic visuals.
So, what is ray tracing anyway?
Technical definitions aside, ray tracing is a rendering technique that allows for lighting to be tracked more accurately by accounting for things such as object materials and how lighting reflects off of them.
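The heart of that idea can be sketched in a few lines of code. The function below is a hypothetical, heavily simplified illustration of one step a ray tracer performs on every bounce: reflecting a ray off a surface using the standard reflection formula (real renderers trace many rays per pixel and model materials in far more detail).

```python
def dot(a, b):
    """Dot product of two 3D vectors stored as tuples."""
    return sum(x * y for x, y in zip(a, b))

def reflect(direction, normal):
    """Reflect an incoming ray direction off a surface with unit normal n:
    r = d - 2(d . n)n."""
    d_dot_n = dot(direction, normal)
    return tuple(d - 2 * d_dot_n * n for d, n in zip(direction, normal))

# A ray travelling straight down bounces straight back up off an upward-facing floor.
print(reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0)))  # (0.0, 1.0, 0.0)
```

Tracing thousands of these bounces per frame is exactly why ray tracing demands the dedicated hardware both companies are now racing to provide.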
Ray tracing immediately awards a point to Nvidia as they were the first to implement it in their GPUs. With the arrival of Big Navi, AMD will have a chance to prove themselves in this regard. However, it wouldn’t be fair to discuss this as we haven’t seen reliable benchmarks from AMD with ray tracing turned on.
Nvidia has pursued ray tracing technology since the 2000s and introduced it to the world in 2018. This move highlighted their dominance of the GPU market, and AMD is yet to recover from it. If you’re rooting for AMD, the good news is that they’re producing the graphics chips for the PlayStation 5 and Xbox Series X.
Since the next generation of gaming consoles can’t afford to have their names linked with anything less than top-of-the-line hardware, AMD has been very upfront about their intention to introduce ray tracing in their next GPUs.
While things do look promising for AMD with the introduction of ray tracing, the point here goes to the clear market innovator Nvidia.
Variable Rate Shading
VRS is a technology first brought to market by Nvidia, and it has found its best use in VR. It calculates which areas of the frame need to be fully shaded and which can be shaded at a lower rate, or not rendered at all. This significantly lowers the load on the GPU, freeing up resources for more useful work.
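As a rough illustration of the idea (the thresholds and rates below are made up for this sketch, not any vendor’s actual values), a renderer might pick a coarser shading rate the further a screen region sits from the centre of the viewer’s gaze:

```python
def shading_rate(distance_from_gaze):
    """Toy VRS policy: return how many pixels share one shading sample
    along each axis, based on normalized distance (0.0-1.0) from the
    centre of the viewer's gaze. Thresholds are illustrative only."""
    if distance_from_gaze < 0.3:
        return 1  # full rate: one sample per pixel
    elif distance_from_gaze < 0.7:
        return 2  # half rate: one sample per 2x2 pixel block
    return 4      # quarter rate: one sample per 4x4 pixel block

print(shading_rate(0.1))  # 1
print(shading_rate(0.5))  # 2
print(shading_rate(0.9))  # 4
```

A 4×4 block shaded with a single sample costs roughly one sixteenth of the shading work, which is where the power savings come from.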
AMD still hasn’t incorporated this tech into their GPUs, but it’s heavily rumored to arrive in their RDNA 2 line, as AMD filed a patent for VRS all the way back in early 2019.
There have also been talks of perfecting the eye-tracking technology and using that to further improve upon VRS, which totally sounds like sci-fi stuff.
Since we were able to see this cool piece of technology in action from Nvidia’s side, it deserves this point too.
Deep Learning Super Sampling
Designed as another way to increase the efficiency of the GPU, DLSS is a groundbreaking piece of technology. It can even be considered a little ahead of its time because of the process required to fully enjoy its benefits.
The biggest issue is that game developers have to build in DLSS support when making the game, and the game then needs to be sent to Nvidia, who train a neural network on high-resolution images of it so that the GPU can upscale lower-resolution frames in real time.
Initially, the fact that Nvidia does the heavy lifting was one of the biggest drawbacks of DLSS. The whole process wasn’t as optimized as it needed to be and this largely caused DLSS to come off as an idea rather than an efficient concept.
With the arrival of Nvidia’s Ampere architecture, this process has been streamlined, and we certainly expect further improvements.
G-Sync vs FreeSync
These are Nvidia’s and AMD’s adaptive synchronization technologies designed to eliminate screen tearing during gameplay. Screen tearing occurs when the GPU’s output is mismatched with the display’s refresh rate.
The communication between the GPU and the monitor basically works this way: if the monitor refreshes at 60Hz, it needs 60 frames per second from the GPU (and likewise for 120Hz, 144Hz, and so on). The issue usually happens when the GPU can’t produce frames in step with the refresh cycle, causing screen tearing.
Adaptive sync technology allows for the GPU to effectively change the monitor refresh rate, depending on the number of frames it produces. So, if the game dips to 40 FPS, the GPU will limit the monitor to refresh only at 40Hz. However, this doesn’t make the games run any smoother, it just prevents screen tearing.
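The core behavior described above can be sketched as a toy model (this is an illustration of the concept, not any vendor’s actual implementation; the panel range is a made-up example):

```python
def adaptive_refresh(fps, min_hz=30, max_hz=144):
    """Toy adaptive sync: the monitor's refresh rate follows the GPU's
    frame rate, clamped to the panel's supported range, so each refresh
    shows exactly one complete frame."""
    return max(min_hz, min(fps, max_hz))

print(adaptive_refresh(40))   # 40  - refresh drops to match the frame rate
print(adaptive_refresh(60))   # 60  - steady match at 60 FPS
print(adaptive_refresh(200))  # 144 - capped at the panel's maximum
```

The clamping at both ends is why the technology prevents tearing without adding frames: it only changes *when* the monitor refreshes, never how many frames the GPU actually renders.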
In the not so distant past, the solution for this was software-based, most notably with VSync, but this is being phased out in favor of newer technologies.
G-Sync is Nvidia’s solution to screen tearing, and it has drawn some criticism. Having been first to market with adaptive sync tech, Nvidia used that to their advantage with hardware requirements: monitors must be G-Sync certified, and although never stated outright, this typically added anywhere from $100 to $300 to the price.
To run G-Sync, monitors require a proprietary Nvidia G-Sync scaler module, which means they’ll all have similar on-screen menus and options. This is FreeSync’s biggest advantage: by using the Adaptive-Sync standard built into the DisplayPort 1.2a specification, it allows manufacturers to use any cheaper scaler.
Nevertheless, G-Sync has a better way of handling the GPU outpacing the monitor. It will lock the GPU’s frame rate to the monitor’s upper limit, while FreeSync, provided in-game VSync is turned off, will allow the GPU to produce extra frames. This can lead to screen tearing but lowers input lag.
The biggest issue which strongly divides the community is the fact that not all Nvidia cards work with FreeSync monitors, just like not all AMD cards will work with G-Sync monitors. This is being ironed out, but the fact still remains that you will have to check if your monitor will work properly with your GPU.
While both sides have their pros and cons in the adaptive sync technology sector, the fact that FreeSync is more readily available is what ultimately earns AMD the point here.
Total score: AMD 2 – Nvidia 4
AMD vs Nvidia: Drivers And Software
The fact of the matter is: good hardware requires good software. Drivers are programs that control how a device (like a GPU) communicates with the rest of the system, letting software get the best out of the hardware without having to manage every detail of how it operates.
As previously mentioned, AMD’s RX 5000 series stumbled at launch due to driver issues causing black screens and crashes. Unfortunately, the problems have persisted despite newer drivers’ constant attempts to fix them. Nvidia has driver issues of its own, but they tend to be subtler and therefore harder to pin down.
AMD has significantly improved their drivers with their yearly Radeon Adrenalin updates; the 2020 version alone allegedly offers an impressive 12% performance improvement over the 2019 version. AMD also makes a conscious effort to keep things simple, using a single piece of software to update its drivers, and has kept to a schedule of at least one major release per month.
The biggest setback for AMD is their products’ persistent issues, which take a long time to fix.
In turn, Nvidia splits control of its hardware across two separate applications. The Nvidia Control Panel handles things like 3D settings and display resolution, while GeForce Experience handles game optimizations, driver updates, and extra features, but requires you to log in before you can alter any settings.
In the end, while AMD has its downsides, the efficient simplicity of their software earns them the point in this category.
Total score: AMD 3 – Nvidia 4
AMD vs Nvidia: Power Consumption and Efficiency
When AMD introduced Navi and announced their gamble on TSMC’s 7nm FinFET process, they probably thought the promised 50% improvement in performance per watt would bridge the efficiency gap. That was not the case: Navi didn’t even outperform older Nvidia GPUs built on TSMC’s last-gen 12nm node.
The future seems brighter with the announcement of the much-hyped Big Navi. Coming in late 2020, it will supposedly improve performance per watt by another 50%. The issue is that, even by those predictions, Big Navi would only compete with Nvidia’s Turing architecture when it should be challenging Ampere.
But this isn’t all black and white. At the extreme performance end, Nvidia’s RTX 2080 Ti certainly uses a lot of power, and none of AMD’s cards come close to it. We could say the RX 5700 XT is less power-hungry than the RTX 2080 Ti, but that would ignore the core of the argument: the latter is a far better GPU.
In the mid-range, AMD has reason for optimism: their RX 5700 and RX 5600 XT perform better than the RTX 2060 while using less power. Performance-wise, both are on par with the RTX 2060 Super yet retain the efficiency advantage.
The difference is smaller in the budget range. Nvidia’s GTX 1660 Ti and GTX 1660 Super not only outperform AMD’s RX 5500 XT by around 20%, they also consume less power. Even the plain GTX 1660 beats the RX 5500 XT in both efficiency and performance, if by a slim margin.
Nvidia edges out AMD in both the budget and high-end categories, while AMD is only slightly better in the mid-range. What should really be concerning AMD is Nvidia’s efficiency in using previous-generation lithography. Overall, this is one of the areas where Nvidia clearly dominated prior to the release of Navi.
Total score: AMD 3 – Nvidia 5
AMD vs Nvidia: Dollar Value
While top-level performance is what most gamers look for in their GPUs, price also needs to be considered. As previously discussed, there are three basic categories for both price and performance.
Nvidia has a clear advantage in the extreme price range, as AMD simply has no GPUs contending there. Now that this is out of the way, let’s proceed to the more nuanced comparisons.
AMD’s most expensive GPU is the RX 5700 XT at $400, and its price match from Nvidia is the RTX 2060 Super – not a fight Nvidia can win on performance. Nvidia’s RTX 2070 Super only barely outperforms the RX 5700 XT and costs a lot more (roughly 5% better performance for 25% more money). In terms of pure dollar value, this is a point for AMD.
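That back-of-the-envelope claim is easy to check with a simple performance-per-dollar calculation. The prices and the 5% figure come from the paragraph above; the relative performance scores are illustrative, not benchmark results:

```python
def perf_per_dollar(relative_perf, price_usd):
    """Performance per dollar: higher is better value."""
    return relative_perf / price_usd

# RX 5700 XT as the baseline (1.00) at $400; RTX 2070 Super assumed
# ~5% faster (1.05) at ~25% more money ($500), per the text above.
rx_5700_xt = perf_per_dollar(1.00, 400)
rtx_2070_super = perf_per_dollar(1.05, 500)

print(rx_5700_xt > rtx_2070_super)  # True - the cheaper card wins on value
```

Small performance gains rarely justify proportionally larger price jumps, which is exactly the value argument AMD leans on in this bracket.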
The mid-range is another interesting category. Although RTX 2060 supports ray tracing, RX 5600 XT is still cheaper and performs better overall. Not to mention the comparison between RX 5600 XT and GTX 1660 Ti, where AMD clearly wins.
If you’re on a tighter budget, GTX 1650 Super is probably the best choice here, narrowly edging out the RX 5500 XT 4GB. If you’re willing to spend a little bit more, RX 5500 XT 8GB is probably a better option; while it doesn’t outperform GTX 1660 GDDR5, it costs a whole 10% less. We’ll call it a tie.
And The Winner is…
With a total score of 6 to 5, the better manufacturer is currently Nvidia. Of course, the AMD vs Nvidia debate is subjective and you shouldn’t blindly get an Nvidia card. The general advice for making any sort of investment in a PC part is to know your needs and budget.
If you don’t feel the need to always own the latest and greatest, AMD can be the safer bet. However, Nvidia’s superior performance and features may well be worth the extra price.