FreeSync vs G-Sync – Which Is Best?

Both camps have put their best efforts to combat screen tearing, but which one did it better? Are there other options out there?

If you’re a PC gamer, you’ve almost certainly encountered or heard of screen tearing. This visual issue has tormented us ever since graphics card performance began increasing faster than monitors could keep up with. Both FreeSync and G-Sync address this issue, but which is better? Read on to find out.

Before we dive in, we first need to understand the problem to find the best solution for it.


Screen Tearing

So what leads to screen tearing? Essentially, this problem arises when the GPU is trying to generate more frames than the monitor can show.

For example, a GPU producing 100 FPS on a 60Hz monitor is a mismatch: the graphics card delivers 100 frames every second, but the monitor can only refresh itself 60 times in that same second.

Because the two aren’t synchronized, the GPU will often swap in a new frame while the monitor is halfway through drawing the previous one. That single refresh then shows parts of two different frames at once, split by a visible horizontal seam: the tear.
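
If you like numbers, here’s a minimal sketch in Python (a made-up timing model, no real graphics involved) of why 100 FPS on a 60Hz panel tears:

```python
# A made-up timing model: a GPU pumping out 100 FPS into a 60 Hz monitor.
# With no synchronization, the GPU swaps the frame buffer the instant a
# frame is ready, so most refreshes start drawing one frame and finish
# drawing another: that seam is the tear.
REFRESH_HZ = 60   # how many times the monitor redraws per second
GPU_FPS = 100     # how many frames the GPU renders per second

refresh_interval = 1 / REFRESH_HZ  # ~16.7 ms per screen redraw
frame_interval = 1 / GPU_FPS       # 10 ms per rendered frame

for r in range(6):  # look at the first six refreshes
    start, end = r * refresh_interval, (r + 1) * refresh_interval
    # frames whose buffer swap lands in the middle of this redraw
    swaps = [f for f in range(1, GPU_FPS) if start < f * frame_interval < end]
    print(f"refresh {r}: {'TEAR' if swaps else 'clean'} "
          f"(mid-redraw swaps from frames {swaps})")
```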

[Image: An example of screen tearing]

The tear is most noticeable when the image is moving horizontally, which happens constantly in side-scrolling games and first-person shooters, where it can be particularly annoying.

Screen tearing can instantly ruin the in-game experience by reminding you that you’re sitting in front of your computer playing a game, a complaint countless players voiced back when this issue was at its most prevalent.

Being immersed in a game should make you forget your surroundings. Screen tearing reminds gamers of glitches and makes them worry that there’s something wrong with their PC. This brief distraction is often enough to get you killed in-game.

No gamer likes it when they lose because of something beyond their control.

Enter VSync.

Vertical Synchronization (VSync)

[Image: VSync on vs. off. VSync solves screen tearing at the added cost of input lag; not great for competitive games.]

VSync is a software solution that avoids screen tearing by limiting the GPU’s output to match the monitor’s refresh rate.

Great! Problem solved, right? Not exactly.

VSync tries to lock the frames rendered by the GPU to the monitor’s refresh rate. But what happens if the frame rate drops below that limit, which is usually 60 FPS on a typical 60Hz display?

VSync can only hold frames back; it has no way to make the display itself refresh more slowly. So instead of screen tearing, there is a new problem: the monitor has to keep showing the old frame until a new one is ready, which leads to some noticeable stuttering.
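
Here’s that effect as a tiny, simplified model (real drivers buffer frames in more elaborate ways, but the snapping to refresh boundaries is the point):

```python
import math

# A simplified model of VSync: a finished frame is held until the next
# refresh boundary, so a frame that misses one boundary waits a full
# extra interval while the old frame stays on screen.
REFRESH_HZ = 60
refresh_interval = 1 / REFRESH_HZ  # ~16.7 ms

def effective_fps(render_fps: float) -> float:
    """Frame rate you actually see once frames snap to refresh boundaries."""
    frame_time = 1 / render_fps
    intervals = math.ceil(frame_time / refresh_interval)  # whole refreshes per frame
    return REFRESH_HZ / intervals

print(effective_fps(60))  # 60.0 -- perfect sync
print(effective_fps(55))  # 30.0 -- barely missing 60 halves the displayed rate
print(effective_fps(25))  # 20.0 -- each frame is held for three refreshes
```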

Both AMD and Nvidia initially developed a driver-level stopgap; Nvidia’s version is called Adaptive VSync. It turns VSync off when the graphics card isn’t outputting enough frames to match the monitor’s refresh rate and re-enables it when the opposite is true.
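
The whole idea boils down to a toggle like this (a hypothetical per-frame sketch; the real logic lives inside the graphics driver):

```python
# The gist of the adaptive approach as a per-frame decision.
REFRESH_HZ = 60

def vsync_enabled(current_fps: float) -> bool:
    # keep VSync on while the GPU can keep up (kills tearing), and drop
    # it the moment the GPU falls behind (avoids the stutter shown above)
    return current_fps >= REFRESH_HZ

for fps in (75, 60, 45):
    print(f"{fps} FPS -> VSync {'on' if vsync_enabled(fps) else 'off'}")
```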

[Image: Adaptive VSync. Image credit: Nvidia]

Adaptive VSync was a relatively primitive solution by modern standards, but it worked. However, both the green and red teams felt that they could devise something better.

Let’s look at their solutions.

G-Sync


In 2013, Nvidia was the first company to debut its solution, with the GeForce GTX 650 Ti Boost as the minimum card required to run it. With the lengthy gap before AMD’s technology was released, it might appear that Nvidia invented this technology, but that is not the case.

As mentioned previously, the idea existed long before Nvidia’s G-Sync and AMD’s FreeSync, but both companies used a different approach to it.

Like VSync, G-Sync stops extra frames from being shown when the monitor can’t keep up. The major breakthrough, however, is that Nvidia’s solution is built into hardware and available exclusively with their cards.

[Image: G-Sync on vs. off. G-Sync technology solves screen tearing. Image credit: Nvidia]

Nvidia also made monitor manufacturers pay an extra fee to access the G-Sync module, which enables them to run the tech on their monitors.

This exclusive module enables G-Sync to take control of the monitor’s refresh rate and match it to the number of frames rendered by the GPU. For example, if the GPU renders 40 FPS, G-Sync will drop the monitor’s refresh rate to 40Hz to maintain perfect synchronization.

However, if the GPU pushes the game’s frame rate beyond the monitor’s maximum refresh rate, G-Sync alone will not cap the FPS.
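
As a rough sketch of the behavior described above (the 30-240Hz window is our hypothetical example, not an official G-Sync spec):

```python
# A sketch of variable refresh: the panel's refresh rate simply follows
# the GPU's frame rate, clamped to whatever range the panel supports.
RANGE_MIN_HZ = 30
RANGE_MAX_HZ = 240

def panel_refresh(gpu_fps: float) -> float:
    """Refresh rate the panel runs at for a given GPU output."""
    return max(RANGE_MIN_HZ, min(gpu_fps, RANGE_MAX_HZ))

print(panel_refresh(40))   # 40 -- the example from the text: 40 FPS -> 40 Hz
print(panel_refresh(300))  # 240 -- past the panel's ceiling, VRR can't follow
```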

Before looking at the later incarnations of G-Sync, let’s take a look at AMD’s equivalent solution to compare the two.

FreeSync


Like G-Sync, AMD’s FreeSync is a step up from VSync and performs very similarly to Nvidia’s tech. In other words, it synchronizes the monitor’s refresh rate with the GPU’s output.

Despite sounding relatively similar and delivering essentially the same result, there is a reason why FreeSync is generally considered the preferred option.

As mentioned earlier, Nvidia makes manufacturers pay for the dedicated G-Sync module. Although a similar certification is required for monitors to be FreeSync compatible, the difference is that FreeSync, as its name suggests, is free.

Display makers don’t need to pay an additional fee to be FreeSync certified, meaning consumers are not charged extra either.

Some in the tech industry have equated this fee to highway robbery and called it a predatory use of Nvidia’s head start; after all, Nvidia beat AMD to market by roughly two years, which is a long time in the tech world.

[Image: FreeSync off vs. on. FreeSync technology solves screen tearing too. Image credit: AMD]

People’s reservations toward Nvidia stem from the fact that it hasn’t made G-Sync free, even after FreeSync was released in 2015. Nvidia is well within its rights to take this stance, and AMD giving its technology away for free was likely a marketing strategy to compete with Nvidia. Still, some concessions were expected after FreeSync hit the market.

A possible reason for keeping the proprietary module is Nvidia’s dominance of the GPU market, which allows it to maintain the price. It can be easy to throw shade at Nvidia, but there are legitimate reasons to pay for G-Sync.

We’re referring to Nvidia’s stricter certification rules for G-Sync. The issue with FreeSync is that its label can be applied far more liberally, even to monitors with a variable refresh range as narrow as 50-80Hz. On such a monitor, refresh rate synchronization only works between 50 and 80 frames per second; below or above that window, there’s no synchronization at all.

We’ll look at AMD’s solution for that shortly but, notably, basic G-Sync requires monitor manufacturers to support synchronization across virtually the entire refresh range. This means that if your game is running at 20 FPS, there won’t be any screen tearing or stuttering, and the same applies if you’re gaming at 200 FPS.
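
Putting the two side by side makes the difference obvious; here’s a toy comparison in Python using the numbers above:

```python
# The article's numbers, side by side: a loosely certified panel that only
# syncs between 50 and 80 Hz versus one whose range spans 20-200 Hz.
def sync_active(fps: float, lo_hz: float, hi_hz: float) -> bool:
    return lo_hz <= fps <= hi_hz

for fps in (20, 60, 200):
    print(f"{fps:>3} FPS: narrow (50-80 Hz) -> {sync_active(fps, 50, 80)}, "
          f"wide (20-200 Hz) -> {sync_active(fps, 20, 200)}")
```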

AMD is not without fault in this rivalry. The main issue is that it doesn’t require monitor manufacturers to declare where their range lies. Makers of flagship monitors will do it anyway so they can brag about a wide range, but with a budget or even mid-range monitor, it’s entirely possible that you’ll be kept in the dark about the specific range.

Another advantage of FreeSync over G-Sync is that it can work with HDMI as well as DisplayPort, while Nvidia’s solution only works with DisplayPort.

Variations

Any good tech that has existed for a while is bound to get new incarnations that further improve on the original design. Both AMD and Nvidia have released updates to their respective technologies, so we’ll now look at them individually.

G-Sync Compatible

[Image: Nvidia’s expanded list of G-Sync Compatible monitors; the current list is much bigger. Image credit: Nvidia]

Let’s start with something that might change your perception of Nvidia. First announced at CES 2019, G-Sync Compatible allows any adaptive sync monitor (including those certified for FreeSync) to run G-Sync, at no additional cost to either the manufacturer or the consumer.

When this was originally revealed, Nvidia stated that they had tested hundreds of monitors and only certified 12 of them. They have since added five more monitors to that list. However, this isn’t without its downsides.

Nvidia conveniently fails to mention that, while it’s technically possible to run G-Sync Compatible with any adaptive sync monitor, there are often issues involved. Random blanking or stuttering is enough to make hardware unusable, and it’s a likely occurrence if you try using it with a non-certified monitor.

FreeSync Premium


Don’t be discouraged by this name. AMD has stayed true to its free-of-charge policy.

Premium is undoubtedly an improvement over basic FreeSync thanks to features such as low framerate compensation (LFC), which fixes the range issue, though only to a certain extent.

LFC kicks in when the GPU’s output dips below the monitor’s FreeSync range. Instead of letting synchronization cut out, it displays each frame multiple times so the effective refresh rate stays inside the range: if the range is 40-80Hz and the FPS drops to 30, FreeSync Premium will run the monitor at 60Hz and show every frame twice. It’s not a perfect fix, but it has worked well thus far.
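
Here’s a simplified sketch of that multiplier logic in Python, using the 40-80Hz example above:

```python
# A sketch of the LFC trick: when FPS falls below the panel's minimum,
# repeat each frame enough times to land back inside the supported range.
RANGE_MIN_HZ = 40

def lfc_refresh(gpu_fps: float) -> float:
    """Refresh rate after low framerate compensation (simplified)."""
    multiplier = 1
    while gpu_fps * multiplier < RANGE_MIN_HZ:
        multiplier += 1  # show each frame one more time
    return gpu_fps * multiplier

print(lfc_refresh(30))  # 60 -- the example above: each frame shown twice
print(lfc_refresh(15))  # 45 -- each frame shown three times
```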

Another way AMD seeks to improve the range issue is with the Premium certification itself. For a monitor to support LFC and qualify for FreeSync Premium, AMD requires a range ratio of 2.5 or higher: the upper limit of the range divided by the lower limit must not be below 2.5.
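
That check is trivial to express (both example ranges below are hypothetical monitors, not specific products):

```python
# The 2.5 ratio rule from the paragraph above as a one-line check.
def lfc_capable(lo_hz: float, hi_hz: float) -> bool:
    return hi_hz / lo_hz >= 2.5

print(lfc_capable(48, 144))  # True  -- 144 / 48 = 3.0
print(lfc_capable(50, 80))   # False -- 80 / 50 = 1.6
```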

G-Sync Ultimate


Announced alongside G-Sync Compatible at CES 2019, G-Sync Ultimate promised to add numerous new features to the already solid G-Sync tech.

One of the main talking points is its 1400-nit HDR support, which lets G-Sync Ultimate monitors deliver far brighter highlights and an even crisper image.

Other capabilities of Ultimate include refresh rate overclocking, variable overdrive, ultra-low motion blur display modes, full matrix backlight, and DCI-P3 color.

The most surprising aspect of all is refresh rate overclocking. Overclocking CPUs and GPUs is old news, but overclocking a monitor is genuinely useful, as it could save you the money you’d otherwise spend on a faster display.

FreeSync Premium Pro


FreeSync Premium Pro further enhances the original FreeSync formula from AMD. It retains its predecessor’s LFC capabilities and promises to deliver the optimal HDR experience on the AMD side.

However, its 400-nit HDR baseline was a great disappointment compared to the previously mentioned 1400 nits of the competitor’s G-Sync Ultimate.

Which One Is The Best?

Considering the various versions, it can’t be said outright which solution is superior. To help you decide, we’ll compare the different versions with their equivalent competition.

The bottom tier pits G-Sync Compatible against basic FreeSync. Nvidia offers variations above and below its base G-Sync tech, while AMD only built upward from the original design, so these two are the fairest pairing. Neither adds cost, but thanks to its far wider availability, FreeSync is our choice here.

In the middle ground, we have FreeSync Premium and G-Sync. These are probably the most difficult options to choose between. While G-Sync is technically a better technology, FreeSync Premium has the advantage of being free. In this case, you should probably make a personal decision based on your budget.

At the top, we have G-Sync Ultimate and FreeSync Premium Pro, and there is a very clear winner. Although FreeSync Premium Pro is free and a good solution, G-Sync Ultimate is far superior tech and worth the additional cost.

Overall, if you’re getting a new monitor or GPU, pay attention to each side’s refresh rate synchronization capabilities, and remember that, whatever your budget, there is a solution out there for everyone.
