AMD vs NVIDIA – Which GPU Manufacturer Should You Choose?

The war between two titans of the industry is about to heat up again. The AMD vs Nvidia rivalry may have been a one-sided affair for a while, but one side is now catching up and is ready to challenge for the throne.

This GPU war has raged since the 1990s, and although AMD has a much longer history in tech, Nvidia has been the dominant player and is in a far better spot financially, being worth roughly twice as much. To be fair, AMD also devotes a sizable chunk of its resources to its CPU business, which is nothing to sneeze at either.

History doesn't matter as much in the world of technology, though. Recent history, maybe, but no one cares that AMD's roots go back to the 1960s.

This is champion versus challenger. Nvidia versus AMD. Now, let’s see which one is the best choice for you.


AMD vs Nvidia: Performance


If you're thinking about getting a new GPU, you probably have the potential performance of each card in mind when deciding. Hitting 60 FPS seems like the bare minimum in today's gaming world, and a good GPU is the key to getting that performance.

Building a new PC and getting the best possible in-game performance isn't all about buying the best GPU, though. Your CPU and RAM need to be on par with the GPU to avoid bottlenecking.

There are three general GPU classifications, and each represents a valid part of the market: low-end or budget, mid-tier or mid-range, and high-end. Since each of these categories serves different people, it's only fair to compare AMD and Nvidia in each of them.

Budget Cards


For this category, we can look at the RX 5500 XT and the GTX 1660, as they're probably the best that AMD and Nvidia, respectively, have to offer in the $200 price range. They're both good representations of their manufacturer's flagship architecture (AMD's RDNA and Nvidia's Turing) and actually stack up quite well.

The RX 5500 XT offers a higher base clock at 1685MHz compared to the GTX 1660's 1530MHz, but Nvidia counters with a higher boost clock of 1785MHz, which tops AMD's game clock of 1737MHz. This may look insignificant, and not many people would actually notice the difference, but it shows how the competition has reached even the minute details.

AMD continues to flex its hardware with 8GB of GDDR6 memory, which is categorically better than Nvidia's 6GB of GDDR5. It also holds firm with higher memory bandwidth and more L2 cache, though, as you might have guessed, the GTX 1660 draws less power.

However, hardware is nothing without software, and this is where Nvidia reigns supreme. Despite the on-paper specifications favoring AMD, it's the GTX 1660 that performs better in actual games, and it does so while drawing less power.

Gamers buy GPUs for performance above all else, and that's why Nvidia scores the first point here.

Mid-Range

For this category, we'll look at the AMD RX 5600 XT and the Nvidia RTX 2060, as they're two solid options and fair representations of each brand's foray into the mid-range.


The RX 5600 XT gave plenty of people a good reason to be excited about AMD's return to the scene. Meanwhile, the RTX 2060 feels like Nvidia got a little complacent.

Their specifications are virtually identical, with AMD boasting a slightly higher boost clock, but since they're so evenly matched, it wouldn't be fair to use that as a definitive marker of better performance.

The reason AMD comes out looking like the winner here is a shrewd business tactic: undercutting the competitor's price. The two cards perform similarly, with some games running better on one and some on the other, but AMD has positioned its card as the cheaper option. That price-to-performance advantage left Nvidia scratching their heads over how to respond.

Where AMD has struggled is driver issues. Games have reportedly crashed outright to the desktop in some instances or produced black screens. However, since driver problems can be fixed with software updates rather than a catastrophic product recall, the mid-range point still goes to AMD.

High-End

Here's where things get a little tricky, because it's hard to call anything AMD currently offers 'high-end', so this is a clear win for Nvidia. But this is exactly where the GPU wars are expected to re-ignite: with AMD preparing to launch its RDNA 2 graphics cards before the end of 2020, Nvidia's RTX 2080 Ti finally appears to have a challenger on the way.

For the sake of completeness, let's compare the best AMD has to offer right now with the RTX 2080 Ti. It's not a fair fight, and it's safe to say that Nvidia's representative easily outperforms AMD's RX 5700 XT or even the Radeon VII, whichever you consider the better opponent.

Technology never stands still, and as AMD announces a challenger for the best Nvidia has to offer, Nvidia promises even better GPUs of its own. Although Nvidia easily takes the point here, this particular area of the competition will be an intriguing one to keep an eye on.

Total score: AMD 1 – Nvidia 2

AMD vs Nvidia: Features

While features may seem less significant than raw specs, they're a very important part of what makes a good GPU. For the most part, both sides offer GPUs with similar hardware and prices, but the devil's in the details, or rather, in this case, in the features.

Ray Tracing

This is the obvious sticking point that has to be addressed. Ray tracing isn't a requirement for good GPU performance, but it clearly offers better and more realistic visuals.

So, what is ray tracing anyway?

Without getting too technical, ray tracing is a rendering technique that simulates the paths light rays take through a scene, accounting for things like object materials and how light reflects off them, which produces far more accurate lighting.
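To give a rough sense of the idea, here's a toy Python sketch, not how Nvidia's RT cores actually work, that shoots one ray per "pixel" at a single sphere and shades each hit based on the surface normal and a light direction. Every value in the scene is made up purely for illustration.

```python
import math

def trace(ray_origin, ray_dir, sphere_center, sphere_radius, light_dir):
    """Trace one ray against one sphere and return a simple brightness value."""
    # Ray-sphere intersection: solve |origin + t*dir - center|^2 = radius^2 for t.
    oc = [o - c for o, c in zip(ray_origin, sphere_center)]
    b = 2 * sum(d * o for d, o in zip(ray_dir, oc))
    c_coef = sum(o * o for o in oc) - sphere_radius ** 2
    disc = b * b - 4 * c_coef  # ray_dir is normalized, so the quadratic's a = 1
    if disc < 0:
        return 0.0  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    if t < 0:
        return 0.0  # the sphere is behind the camera
    # Find the hit point and the surface normal at that point.
    hit = [o + t * d for o, d in zip(ray_origin, ray_dir)]
    normal = [(h - c) / sphere_radius for h, c in zip(hit, sphere_center)]
    # Simple diffuse shading: brightness depends on the angle to the light.
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

# Shoot one ray per "pixel" of a tiny 12x12 image and print an ASCII render.
for y in range(12):
    row = ""
    for x in range(12):
        direction = [(x - 5.5) / 12, (y - 5.5) / 12, 1.0]
        length = math.sqrt(sum(d * d for d in direction))
        direction = [d / length for d in direction]
        shade = trace([0, 0, 0], direction, [0, 0, 3], 1.0, [0.577, 0.577, -0.577])
        row += "#" if shade > 0.5 else ("+" if shade > 0 else ".")
    print(row)
```

Real-time ray tracing does this for millions of rays per frame, with bounces, shadows, and reflections on top, which is exactly why dedicated hardware matters.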


The main reason we're even talking about ray tracing is that AMD's current architecture still doesn't support it, and that is a big issue.

Nvidia has pursued ray tracing technology since the 2000s and introduced it to consumers in 2018. That move highlighted their dominance in the GPU market, and AMD is yet to recover. The good news for the red team is that they'll be producing the graphics chips for the PlayStation 5 and Xbox Series X.

Since the next generation of gaming consoles won't want their names associated with anything that isn't top-of-the-line, AMD has been very upfront about its intention to introduce ray tracing in its next-generation GPUs.

While ray tracing coming to the AMD side of things is all well and good, the point here goes to the clear market innovator Nvidia.

Variable Rate Shading

VRS is a technology first brought to the market by Nvidia, and it has found its best use in VR. What it does is vary the shading rate across the frame, calculating which areas in your field of vision will be fully shaded, shaded at a reduced rate, or barely rendered at all. This significantly lowers the load on the GPU and lets it spend the freed-up resources on other, more useful work.

Variable rate shading: the most shading detail is focused on what the player looks at most.
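As a loose illustration of the concept (and not any particular vendor's implementation), the Python sketch below assigns a coarser shading rate to screen tiles the further they sit from an assumed gaze point. The grid size, distance thresholds, and rate values are all invented for the example.

```python
def shading_rate(tile_x, tile_y, gaze_x, gaze_y):
    """Pick a shading rate for a tile: 1 = shade every pixel,
    2 = one shade per 2x2 pixel block, 4 = one shade per 4x4 block."""
    distance = ((tile_x - gaze_x) ** 2 + (tile_y - gaze_y) ** 2) ** 0.5
    if distance < 4:
        return 1   # in focus: full-rate shading
    elif distance < 8:
        return 2   # near periphery: half-rate shading
    else:
        return 4   # far periphery: quarter-rate shading

# Print the rate map for a 16x16 grid of tiles with the gaze at the center.
for ty in range(16):
    print(" ".join(str(shading_rate(tx, ty, 8, 8)) for tx in range(16)))
```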

AMD still hasn't incorporated this tech into its GPUs, but it's heavily rumored to be coming to the RDNA 2 line, as AMD filed a patent for VRS as far back as early 2019.

There has also been talk of perfecting eye-tracking technology and using it to further improve VRS, which sounds seriously sci-fi.

Since this cool piece of tech is so far an Nvidia exclusive, they earn a point here.

Deep Learning Super Sampling

Designed as another way to increase GPU efficiency, DLSS is an impressive piece of technology, although a little ahead of its time. The reason for that assertion lies in the process required to fully enjoy its benefits.

The biggest issue is that game developers have to build DLSS support into the game, and for players to see the improvement, the game needs to be sent to Nvidia, who then let an AI run through it, analyze its images, and learn to upscale them to a higher resolution automatically.
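To illustrate just the render-low, display-high idea, and emphatically not the neural network DLSS actually uses, here's a tiny Python sketch that upscales a small "rendered" frame with plain bilinear interpolation. DLSS's whole selling point is that its trained model reconstructs detail far better than simple interpolation like this.

```python
def bilinear_upscale(image, factor):
    """Upscale a 2D list of brightness values by an integer factor."""
    src_h, src_w = len(image), len(image[0])
    result = []
    for y in range(src_h * factor):
        row = []
        for x in range(src_w * factor):
            # Map the output pixel back into source coordinates.
            sy = min(y / factor, src_h - 1)
            sx = min(x / factor, src_w - 1)
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, src_h - 1), min(x0 + 1, src_w - 1)
            fy, fx = sy - y0, sx - x0
            # Blend the four nearest source pixels.
            top = image[y0][x0] * (1 - fx) + image[y0][x1] * fx
            bottom = image[y1][x0] * (1 - fx) + image[y1][x1] * fx
            row.append(top * (1 - fy) + bottom * fy)
        result.append(row)
    return result

low_res = [[0, 64], [128, 255]]           # a tiny "rendered" 2x2 frame
high_res = bilinear_upscale(low_res, 4)   # upscaled to 8x8 for display
for row in high_res:
    print(" ".join(f"{value:5.1f}" for value in row))
```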

On paper, the idea sounds really cool, but so far the execution keeps it at more of a novelty level. AMD hasn't exactly responded to this, but it hasn't felt necessary for them to do so yet, and although Nvidia clearly has something here that AMD doesn't, it isn't impactful enough to award them a point.

G-Sync vs FreeSync


These are Nvidia's and AMD's adaptive synchronization technologies, designed to eliminate screen tearing during gameplay. Screen tearing occurs when the GPU's output is mismatched with the display's refresh rate.

With a fixed refresh rate, a 60Hz monitor expects 60 frames per second from the GPU (the same goes for 120Hz, 144Hz, and so on). The problem usually arises when the GPU can't produce the required number of frames, and that's what causes the tearing.

Adaptive sync allows the GPU to effectively change the rate at which the monitor refreshes depending on the number of frames it produces. So, if the game dips to 40 FPS, the GPU will have the monitor refresh at only 40Hz. This doesn't make games run any smoother, it just prevents screen tearing.
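As a simplified model of the behavior described above, this short Python sketch clamps the panel's refresh rate to whatever the GPU is producing, within an assumed supported range; the 40-144Hz window is just an example, as real monitors advertise their own adaptive sync ranges.

```python
def effective_refresh(fps, min_hz=40, max_hz=144):
    """With adaptive sync, the panel refreshes in step with the GPU's frame
    rate, clamped to the range the monitor actually supports."""
    return max(min_hz, min(fps, max_hz))

for fps in (40, 60, 100, 200):
    print(f"GPU outputs {fps} FPS -> panel refreshes at {effective_refresh(fps)}Hz")
```

What happens when the GPU exceeds the top of that range is where G-Sync and FreeSync differ, as discussed further down.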

In the not so distant past, the solution for this was software-based, most notably VSync, but this is being phased out in favor of newer technologies.

G-Sync is Nvidia's solution to screen tearing, and it has understandably drawn some criticism. Because Nvidia was first to market with adaptive sync tech, they sought to use that advantage by imposing hardware requirements. To use the technology, a monitor must be G-Sync certified, and although it's not spelled out anywhere, that certification adds anywhere from $100 to $300 to the price.


To run G-Sync, monitors require Nvidia's proprietary G-Sync scaler module, which means they all have similar on-screen menus and options. This is where AMD's FreeSync looks to shine: manufacturers can choose whichever scaler they want to use to run FreeSync.

FreeSync uses the Adaptive-Sync standard built into the DisplayPort 1.2a specification, which lets manufacturers find cheaper options for the scaler. Cheaper, mind you, not free as the name suggests.

One big advantage G-Sync has over FreeSync is how it handles the GPU outproducing the display. G-Sync caps the GPU's frame rate at the monitor's maximum refresh rate, while FreeSync, provided in-game VSync is turned off, lets the GPU produce the extra frames. This can lead to screen tearing but lowers input lag.

Where this FreeSync vs G-Sync issue strongly divides the community is compatibility: not all Nvidia cards work with FreeSync monitors and, likewise, not all AMD cards work with G-Sync monitors. This is being ironed out, but the fact remains that you have to check whether the monitor you're getting will work properly with your GPU and vice versa.

While both sides have their pros and cons in adaptive sync technology, the fact that FreeSync is more readily available and more affordable is what ultimately earns AMD the point here.

Total score: AMD 2 – Nvidia 4

AMD vs Nvidia: Drivers And Software


Good hardware needs good software; that's just how things are supposed to work. Drivers are programs that control how a device (like a GPU) communicates with the rest of the system. They let software get the most out of the hardware without having to know and manage every aspect of how that particular component operates.

Needless to say, this is a very important part of the hardware-software dynamic and it’s also a very interesting topic for both AMD and Nvidia.

As previously mentioned, AMD somewhat shot themselves in the foot when the RX 5000 series launched with driver issues causing black screens and crashes. Unfortunately, these problems have persisted despite newer drivers that supposedly fix them.

Nvidia hasn't exactly covered itself in glory either, though its issues tend to be subtler and therefore harder to pin down.

AMD definitely appears to have made significant strides in improving its drivers with the yearly Radeon Adrenalin updates. The 2020 version alone claims an impressive 12% performance improvement over the 2019 version. Another plus for AMD is the conscious effort to simplify things: a single piece of software handles driver updates, on a roughly once-per-month schedule, or more often when there's a major release.


The biggest knock against AMD remains the persistent issues that take far too long to properly fix.

Nvidia has largely followed suit with its driver update schedule, but the big difference is that it uses two separate applications to control its hardware. The Nvidia Control Panel lets you configure things like 3D settings or display resolution, while GeForce Experience handles game optimizations, driver updates, and extra features. The biggest downside of GeForce Experience is that you have to log in and solve a captcha prompt before you can adjust the settings to your liking.

In the end, while AMD has its downsides, the efficient simplicity of its software earns it the point in this category.

Total score: AMD 3 – Nvidia 4

AMD vs Nvidia: Power Consumption And Efficiency


When AMD introduced Navi and announced its gamble on TSMC's 7nm FinFET process, it likely thought that the touted 50% performance-per-watt improvement would bridge the efficiency gap. However, AMD was so far behind that even that didn't help. Interestingly, Navi still can't match the efficiency of older Nvidia GPUs built on TSMC's last-gen 12nm node.

The future looks brighter with the much-hyped Big Navi, coming in late 2020, which is supposed to improve performance per watt by another 50%. That sounds almost too good, and many are keen to see whether and how AMD pulls it off. The concern is that even by those projections, Big Navi would be competing with Nvidia's Turing architecture when it should be challenging Nvidia's Ampere.
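As a quick back-of-the-envelope check of what two consecutive 50% perf-per-watt claims compound to, here's a small Python calculation. The figures are AMD's marketing claims as cited above, not measurements.

```python
# Normalize pre-Navi (GCN) efficiency to 1.0 and apply the claimed gains.
gcn = 1.0
rdna = gcn * 1.5      # Navi: claimed +50% performance per watt over GCN
rdna2 = rdna * 1.5    # "Big Navi" / RDNA 2: another claimed +50% over Navi

print(f"RDNA vs GCN:   {rdna:.2f}x the performance per watt")
print(f"RDNA 2 vs GCN: {rdna2:.2f}x the performance per watt")
```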

The issue isn't all black and white, however. In the extreme performance range, Nvidia's RTX 2080 Ti certainly uses a lot of power, while AMD doesn't really have a competing card to compare it to. If we oversimplify, we could say that the RX 5700 XT is less power-hungry than the RTX 2080 Ti, but that would completely disregard the core of the argument: the latter is a far better GPU.


In the mid-range, AMD has reason to be optimistic because the RX 5700 and RX 5600 XT perform better than the RTX 2060 while using less power. Performance-wise, they're both roughly on par with the RTX 2060 Super while retaining the efficiency advantage.

A similar difference can be observed in the budget category. Nvidia's GTX 1660 Ti and GTX 1660 Super not only outperform AMD's RX 5500 XT by a whopping 20% but also consume less power. Even the plain GTX 1660 has the RX 5500 XT's number in both efficiency and performance, albeit by a slimmer margin.

Nvidia edges out AMD in both the budget and high-end classifications, while AMD is only slightly better in the mid-range. What should really concern AMD is that Nvidia managed to be more efficient despite using previous-generation lithography.

Overall, this is an area Nvidia clearly dominated in the years prior to Navi's release; since then, they're still ahead, but by a finer margin. While this is an easy point for Nvidia, it will certainly be interesting to see how AMD answers with Big Navi.

Total score: AMD 3 – Nvidia 5

AMD vs Nvidia: Dollar Value


While top-level performance is what most gamers want out of their GPUs, we still have to be mindful of price tags. As previously discussed, there are three basic categories for both price and performance. They're named and explained earlier, so we'll dive right in.

Nvidia has a clear advantage in the extreme price range, as there are simply no contending AMD GPUs. Now that that's out of the way, let's proceed to the more nuanced comparisons.

AMD's most expensive GPU is the RX 5700 XT at $400, and its price match from Nvidia is the RTX 2060 Super, which is not a fight Nvidia can win performance-wise. Meanwhile, the RTX 2070 Super only barely outperforms the RX 5700 XT while costing a lot more (roughly 5% better performance for 25% more money). In terms of pure dollar value, this is a point for AMD.
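Putting that in performance-per-dollar terms, here's a quick Python calculation using the article's ballpark figures; the $500 for the RTX 2070 Super is inferred from the "25% more money" claim rather than quoted pricing.

```python
# Relative performance is normalized so that the RX 5700 XT = 1.00.
cards = {
    "RX 5700 XT":     {"price": 400, "relative_perf": 1.00},
    "RTX 2070 Super": {"price": 500, "relative_perf": 1.05},  # ~5% faster, ~25% pricier
}

for name, card in cards.items():
    perf_per_dollar = card["relative_perf"] / card["price"]
    print(f"{name}: {perf_per_dollar * 1000:.2f} performance points per $1000")
```

By that yardstick, the RX 5700 XT delivers roughly 20% more performance per dollar, which is the "pure dollar value" argument behind giving AMD the point.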

The mid-range is another interesting category. Although the RTX 2060 supports ray tracing, the RX 5600 XT is still cheaper and performs better overall. Not to mention the comparison between the RX 5600 XT and the GTX 1660 Ti, where AMD completely blows Nvidia out of the water. Point AMD.

If you're on a tighter budget, things are less clear-cut. The GTX 1650 Super probably gets you the best bang for your buck here, narrowly edging out the RX 5500 XT 4GB. But if you're willing to spend a little more, AMD's RX 5500 XT 8GB is probably the better option: it doesn't actually outperform the GTX 1660 GDDR5, but it costs a good 10% less. It's a tie in the budget class.

And The Winner Is…


With a total score of 6 to 5, the better manufacturer right now is Nvidia. Of course, this AMD vs Nvidia issue isn't all black and white, and that doesn't mean you should blindly get an Nvidia card. The general advice for any PC part purchase is to correctly assess your needs and research what fits best within your budget.

If you don't need to always have the best of the best, or don't chase 120+ FPS at all times, then AMD is the safer bet. However, Nvidia offers better top-end performance and features, and for many, that is worth the extra price.


Aleksandar Cosic

Alex is a Computer Science student and a former game designer. That has enabled him to develop skills in critical thinking and fair analysis. As a CS student, Aleksandar has very in-depth technical knowledge about computers, and he also likes to stay current with new technologies.