FreeSync vs G-Sync – Which Is Best?

Ever since graphics cards started rapidly outpacing monitors, we’ve been plagued by an effect known as screen tearing. Both AMD and Nvidia have tackled this issue on their own, but whose solution is better?

But first, we need to understand the crux of the problem before we can judge which solution handles it best.


Screen Tearing

This issue occurs when the GPU is rendering more frames than the monitor can display.

For example, a GPU producing 100 FPS is a poor match for a 60Hz monitor because the two numbers don’t correspond to one another. The monitor will try to fit in those 100 frames produced in a second by the graphics card, which it technically can’t, as it can only refresh itself 60 times in that same second. The result is that parts of two different frames end up on screen at once, split by a visible horizontal line: the tear.

An example of a screen tearing artifact
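To make the mismatch concrete, here’s a minimal sketch in Python (using the hypothetical 100 FPS / 60Hz numbers from above) of why an unsynchronized buffer swap lands mid-scanout and produces a tear:

```python
GPU_FPS = 100          # frames rendered per second
REFRESH_HZ = 60        # monitor refreshes per second

frame_interval = 1 / GPU_FPS       # a new frame every 10 ms
refresh_interval = 1 / REFRESH_HZ  # a scanout every ~16.7 ms

for r in range(5):  # examine the first five refreshes
    start = r * refresh_interval
    end = start + refresh_interval
    # frames the GPU finished while this refresh was still being drawn
    swaps = [f for f in range(1, GPU_FPS) if start < f * frame_interval < end]
    if swaps:
        print(f"refresh {r}: buffer swapped mid-scanout (frames {swaps}) -> visible tear")
    else:
        print(f"refresh {r}: single frame shown, no tear")
```

Every refresh in this scenario catches at least one mid-scanout swap, which is why uncapped framerates on a slower panel tear so consistently.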

This most commonly occurs when the visuals are moving horizontally, as in side-scrolling games, or, more annoyingly, in first-person shooters. The problem with screen tearing is that it immediately knocks you out of the in-game experience and reminds you that you’re a person sitting in front of a computer playing a game.

This isn’t an abstract thought; it’s a legitimate observation that a lot of people echoed in the days when this issue was prevalent.

Being immersed in a game makes you forget your surroundings. Screen tearing looks like a glitch, making people instantly wonder whether something is wrong with their PC and whether it’s about to crash. That small distraction is just enough to get you killed in-game, and losing to something that’s out of your control has to be one of the worst feelings in the world.

Enter VSync.

Vertical Synchronization (VSync)

VSync solves screen tearing artifacts at the added cost of input lag. Not great for competitive games.

VSync is a software solution to screen tearing: it limits the GPU’s output to the monitor’s refresh rate, so 60 FPS on a 60Hz monitor.

Problem solved, right? Not really.

What VSync actually does is try to lock the number of images rendered by the GPU to 60 a second, but what happens if the framerate drops below 60 FPS?

Well, since VSync can’t interact with the display, there’s nothing that can be done in that situation, as bad as it sounds. Screen tearing won’t happen, but we’ll have a different problem: the monitor will wait for a new frame to finish while keeping the old one on the screen, which leads to some awkward stuttering.
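Here’s a minimal sketch of that pacing problem (illustrative timings only, not a real graphics API): with VSync, a frame may only be swapped in on a refresh boundary, so any frame that takes longer than one ~16.7 ms interval forces the previous frame to be shown again:

```python
REFRESH_HZ = 60
refresh_interval = 1 / REFRESH_HZ  # ~16.7 ms between vblanks

# hypothetical per-frame render times; the third and fourth frames
# take longer than one refresh interval
render_times = [0.012, 0.013, 0.022, 0.025, 0.012]

# when each frame finishes rendering
finish_times = []
t = 0.0
for rt in render_times:
    t += rt
    finish_times.append(t)

shown = None
for r in range(1, 7):
    vblank = r * refresh_interval
    # the newest frame that finished before this vblank
    done = [i for i, ft in enumerate(finish_times) if ft <= vblank]
    newest = done[-1] if done else None
    if newest == shown:
        print(f"vblank {r}: no new frame ready -> repeat frame {shown} (stutter)")
    else:
        shown = newest
        print(f"vblank {r}: show frame {shown}")
```

The repeated frame at the fourth vblank is exactly the stutter described above: the display refreshed, but had nothing new to show.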

Both AMD and Nvidia initially developed a stopgap called Adaptive VSync, which still approaches the problem from the software side. Simply put, it turns VSync off when the graphics card isn’t outputting enough frames to match the monitor’s refresh rate and enables it when the opposite is true.

Image credit: Nvidia
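The decision rule is simple enough to sketch in a few lines of Python (the function name here is our own illustration, not Nvidia’s or AMD’s actual driver API):

```python
REFRESH_HZ = 60  # the monitor's refresh rate

def adaptive_vsync(current_fps: float, refresh_hz: int = REFRESH_HZ) -> bool:
    """Enable VSync only while the GPU keeps up with the refresh rate."""
    return current_fps >= refresh_hz

for fps in (90, 75, 60, 45, 30):
    state = "ON (capped, no tearing)" if adaptive_vsync(fps) else "OFF (no stutter, tearing possible)"
    print(f"{fps:>3} FPS -> VSync {state}")
```

Above 60 FPS you get VSync’s tear-free capping; below it, tearing is traded back in to avoid the stutter.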

Adaptive VSync was a primitive brute-force solution, but it worked. However, both the green and the red team felt like they could do it better on their own, and truthfully, they did.

Let’s see how they did it.

G-Sync


First comes Nvidia’s solution, which made its debut in 2013 alongside their GeForce GTX 650 Ti Boost card. Due to the large gap between the releases of Nvidia’s and AMD’s technologies, many have speculated that Nvidia invented the concept, but that is simply not the case.

As mentioned before, the idea existed long before either G-Sync or AMD’s FreeSync; it’s just that both companies modified the approach.

G-Sync is very similar to VSync in that it’ll limit the extra frames being shown if the monitor can’t handle them. However, where Nvidia made the breakthrough is that it’s a hardware solution. Available only on their cards, naturally.

G-SYNC technology solves screen tearing. Image credit: Nvidia

Nvidia also made monitor manufacturers pay extra for access to the G-Sync module, which enables the tech to run on their monitors.

This exclusive model enabled G-Sync to have control over the monitor’s refresh rate as well as the number of frames being rendered by the GPU. So, if you have a 60Hz monitor and a GPU that’s capable of running the game at 80 or 90 FPS, G-Sync will limit the card to 60 FPS. Likewise, if the GPU is rendering 40 FPS, G-Sync will modify the monitor’s refresh rate to 40Hz in order to achieve that perfect synchronization.
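As a minimal sketch (a hypothetical helper, not Nvidia’s actual module logic), that matching rule boils down to this:

```python
MAX_REFRESH_HZ = 60  # the panel's native refresh rate

def g_sync_match(gpu_fps: float, max_refresh: int = MAX_REFRESH_HZ) -> tuple[float, float]:
    """Return (displayed_fps, refresh_hz) after G-Sync-style matching."""
    if gpu_fps >= max_refresh:
        return max_refresh, max_refresh  # cap an 80-90 FPS card at 60 FPS
    return gpu_fps, gpu_fps              # drop the panel to, e.g., 40Hz

for fps in (90, 80, 60, 40):
    shown, hz = g_sync_match(fps)
    print(f"GPU at {fps} FPS -> display shows {shown} FPS at {hz}Hz")
```

Either way, exactly one frame is rendered per refresh, which is the “perfect synchronization” in question.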

Before getting into additional incarnations of G-Sync, it’s best we take a look at what AMD has to offer in order to have a fairer comparison.

FreeSync


Much like G-Sync, AMD’s FreeSync is a step up from the earlier VSync solution, and it does pretty much the same thing as Nvidia’s tech in that it synchronizes the monitor’s refresh rate with the GPU’s output.

Despite sounding relatively the same and actually delivering pretty much the same result, there is an important reason why FreeSync is considered the better option.

We said earlier that Nvidia makes manufacturers pay for G-Sync certification, and although a similar certification is required for monitors to be FreeSync compatible, the difference is that FreeSync, as the name suggests, is free.

That’s right, the display makers won’t have to pay extra to get FreeSync certified and therefore won’t be passing down the cost to the consumer.

The reason Nvidia can get away with what people in the tech industry have likened to highway robbery and called a predatory practice is that they were first to market by roughly two years, which in the tech world is akin to a decade.

FreeSync technology solves screen tearing too. Image credit: AMD

What makes people really detest Nvidia here is that since FreeSync came out in 2015, they haven’t made G-Sync free as well. Of course, they are well within their rights to keep charging, and make no mistake, AMD purposely made their technology free to mess with Nvidia, but some concessions were expected when FreeSync hit the market.

A possible reason for keeping their proprietary module is their dominance in the GPU market, which allows them to maintain that price point.

But this is not an Nvidia-bashing party; there are legitimate reasons why G-Sync deserves your money. Of course, it would be dishonest not to mention that FreeSync is likely only free as a direct shot at G-Sync, but we’re not talking about that here.

We’re talking about the strict(er) certification rules that Nvidia has for G-Sync. FreeSync’s issue is that its label can be applied more liberally, even in situations where a monitor’s range is as narrow as 50-80Hz. This means that refresh rate synchronization will only be available between 50 and 80 frames per second. Below or above that, it’s as if there’s no synchronization at all.
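A minimal sketch using that hypothetical 50-80Hz window shows how little of the framerate spectrum such a monitor actually covers:

```python
RANGE_MIN, RANGE_MAX = 50, 80  # the monitor's advertised FreeSync range, in Hz

def is_synced(fps: float) -> bool:
    """Synchronization only applies while FPS sits inside the range."""
    return RANGE_MIN <= fps <= RANGE_MAX

for fps in (45, 50, 65, 80, 95):
    status = "synced" if is_synced(fps) else "outside range: tearing/stutter possible"
    print(f"{fps} FPS -> {status}")
```

Dip below 50 FPS in a demanding scene and you’re right back to square one.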

We’ll get to AMD’s solution for that in a bit, but it’s important to note that basic G-Sync requires monitor manufacturers to support virtually the full range with their G-Sync capabilities, meaning that if your game is running at 20 FPS, there won’t be screen tearing or stuttering. The same applies if you’re gaming at 200 FPS.

Not that AMD is clean in this competition either. The biggest problem is that they don’t require monitor manufacturers to declare what their range is. They will do it if it’s a flagship monitor and they can brag about it, but if it’s a budget or even mid-range monitor, there’s a possibility that you’ll have no idea about the actual range.

One thing that FreeSync has over G-Sync is that it can work over HDMI as well as DisplayPort, while Nvidia’s solution only works with DisplayPort.

Variations

Like any maturing tech, both solutions were bound to spawn different incarnations that further improve on the original design. Both AMD and Nvidia have released variations of their respective technologies, and they do add to the discussion, so we’ll go over them one by one.

G-Sync Compatible

The expanded list of G-Sync Compatible monitors. The current list is much bigger. Image credit: Nvidia

Let’s start with something that may change your perception of Nvidia. First announced at CES 2019, G-Sync Compatible allows any adaptive sync monitor (including FreeSync-certified ones) to run G-Sync, at no added cost to either the manufacturer or the consumer.

When it was originally revealed, Nvidia stated that they had tested “hundreds” of monitors and only deemed 12 certifiable. Since then, they have continued their testing and added another 5 monitors to that list. However, this doesn’t come without its drawbacks. Namely, only a certified G-Sync Compatible monitor guarantees that G-Sync will run properly, which makes sense, right?

What Nvidia conveniently doesn’t talk about is that while it’s technically possible to run G-Sync Compatible with any adaptive sync monitor, more often than not there will be huge issues. Problems like random blanking or stuttering are enough to make hardware unusable, and that can very much happen if you try G-Sync Compatible with a monitor that’s not among those certified.

FreeSync Premium


Don’t let the naming mislead you into thinking there’s an added cost. AMD has remained loyal to their “free” idea.

Premium does bring improvements over basic FreeSync, such as low framerate compensation (LFC), which fixes the range issue, but only to a certain degree.

What it does is adjust the monitor’s refresh rate if the GPU’s output dips below the monitor’s FreeSync range, and it does so in an appropriate manner: if the range is 48-144Hz and the FPS drops to, say, 30, FreeSync Premium will adjust the monitor’s refresh rate to 60Hz and show each frame twice. It’s sort of a duct-tape fix, but so far it works.

Another place where AMD seeks to improve upon the range issue is with Premium certification. They state that in order for a monitor to support LFC, and thus FreeSync Premium, it has to have a range ratio of 2.5 or more. This means that when the upper limit of the range is divided by the lower limit, the result must not come out below 2.5. A 48-144Hz range qualifies (144 / 48 = 3), while a 40-80Hz range doesn’t (80 / 40 = 2).
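Both rules are easy to express as a minimal sketch (the helper names are ours, not AMD’s): the ratio check gates LFC, and LFC itself picks an integer multiple of the framerate that lands back inside the range:

```python
def supports_lfc(range_min: int, range_max: int) -> bool:
    """AMD's Premium rule: the range ratio must be at least 2.5."""
    return range_max / range_min >= 2.5

def lfc_refresh(fps: float, range_min: int, range_max: int) -> float | None:
    """Smallest integer multiple of fps that lands inside the range."""
    multiple = 2
    while fps * multiple <= range_max:
        if fps * multiple >= range_min:
            return fps * multiple
        multiple += 1
    return None

print(supports_lfc(48, 144))     # True: 144 / 48 = 3.0
print(supports_lfc(40, 80))      # False: 80 / 40 = 2.0
print(lfc_refresh(30, 48, 144))  # 60: each frame is shown twice at 60Hz
```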

G-Sync Ultimate


Revealed at the same time as G-Sync Compatible, at CES 2019, Ultimate promised to add a ton of new features to an already solid G-Sync tech.

One of the main talking points is HDR with a ridiculous 1400 nits (well, up to 1400). This enables G-Sync to be much better at lighting and provide an extra crisp image.

Other capabilities of Ultimate include refresh rate overclocking, variable overdrive, ultra-low motion blur display modes, full matrix backlight, and DCI-P3 color.

The one that draws the most attention is the refresh rate overclocking, simply because of how novel the concept appears. We’re used to overclocking CPUs and GPUs, but monitors? It is possible and actually quite useful as it can save you some money.

FreeSync Premium Pro


AMD’s final variation doesn’t so much revisit the original FreeSync formula as further improve on FreeSync Premium.

It retains its predecessor’s LFC capabilities and promises to deliver the best HDR experience from the AMD side of things.

However, where it falls miserably short is its 400-nit promise, which is absolutely dominated by the previously mentioned 1400 nits of Nvidia’s G-Sync Ultimate.

Which One Is The Best?

Between their various versions, no single one can honestly be called superior outright, so we’ll compare these different versions with their appropriate competition.

The lowest tier here will be G-Sync Compatible and FreeSync. While Nvidia offered variations above and below the base G-Sync tech, AMD only improved upon their original design, so these two are fairly comparable. Due to its much larger availability, and with no added cost on either side, FreeSync is our choice here.

In the middle ground, we have FreeSync Premium and G-Sync, and this is probably the hardest place to give a concrete answer. While G-Sync is technically the better technology, FreeSync Premium comes without added cost. In this case, it’s probably best to let your budget decide.

And at the top, there are G-Sync Ultimate and FreeSync Premium Pro, and here there’s just barely enough of a contest to even have a discussion. Although FreeSync Premium Pro is a great solution and is free, G-Sync Ultimate is a far superior tech and is well worth the added price.

Overall, when getting a new monitor or a new GPU, you should be aware of each side’s refresh rate synchronization capabilities, but ultimately know that depending on your budget, there are better and worse options out there and you should choose carefully.

