If you’re into gaming, you’ve almost surely heard of screen tearing. We’ve been tormented with this effect ever since the performance of graphics cards started increasing exponentially and monitors were unable to keep up. Both FreeSync and G-Sync have tackled this issue, but which one is better? Keep reading to find out.
Before we dive in, we first need to understand the essence of the problem to find the best solution for it.
So what causes screen tearing? Basically, this issue happens when the GPU is attempting to render more frames than those that the monitor can display.
For example, a GPU producing 100 FPS is largely wasted on a 60Hz monitor, as the two numbers don't correspond to one another. The monitor will try to fit in the 100 frames the graphics card produces each second, but it technically can't: it can only refresh itself 60 times in that same second.
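The mismatch can be put in back-of-the-envelope terms. A toy sketch of the arithmetic above (not how a real display pipeline counts frames):

```python
def excess_frames(fps: int, refresh_hz: int) -> int:
    """Frames per second that can't get a refresh cycle of their own;
    each one arrives mid-scanout and can show up as a visible tear."""
    return max(0, fps - refresh_hz)

# 100 FPS on a 60Hz panel leaves 40 frames per second with nowhere to go.
print(excess_frames(100, 60))
```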
This effect is most noticeable when the visuals move horizontally, as in side-scrolling games or when panning the camera in first-person shooters, where it can be particularly annoying.
The problem with screen tearing is that it instantly ruins the in-game experience by reminding you that you're sitting in front of a computer playing a virtual game. This might strike you as an abstract complaint, but it's a legitimate observation that many people voiced back when the issue was widespread.
Being immersed in a game makes you forget your surroundings. Screen tearing looks like a glitch, making gamers worry that something is wrong with their PC. It's a small distraction, but just enough to get you killed in-game. And if you're a gamer, you'll surely agree that losing to something outside your control ranks high on the list of worst gaming experiences.
Vertical Synchronization (VSync)
VSync is a software solution that prevents screen tearing by capping the GPU's output at the monitor's refresh rate.
Great! Problem solved, right? You’d think.
What VSync actually does is lock the GPU's output to the monitor's refresh rate. But what happens if the frame rate drops below that limit, which is usually 60Hz?
Well, bad as it may sound, since VSync is a purely software-side cap and can't adjust the display itself, there's nothing it can do in that situation. Instead of screen tearing, we get a new problem: the monitor keeps showing the old frame while it waits for the next one to finish rendering, which leads to noticeable stuttering.
Both AMD and Nvidia initially shipped solutions along the lines of Adaptive VSync, which still operates purely in software. Basically, Adaptive VSync turns VSync off when the graphics card isn't producing enough frames to match the monitor's refresh rate and turns it back on when it is.
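The toggling logic amounts to a simple comparison. A minimal sketch of the idea, not Nvidia's or AMD's actual driver code:

```python
def adaptive_vsync_enabled(fps: float, refresh_hz: float) -> bool:
    """Enable VSync only when the GPU can keep up with the monitor;
    otherwise disable it to avoid stutter (at the cost of some tearing)."""
    return fps >= refresh_hz

# GPU keeping pace with a 60Hz panel -> VSync on; dipping below -> VSync off.
print(adaptive_vsync_enabled(100, 60))  # True
print(adaptive_vsync_enabled(45, 60))   # False
```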
Adaptive VSync was a primitive and brute-force solution, but it worked. However, both the green and the red team felt like they could come up with better formulas.
Let’s explore their individual solutions.
Nvidia was the first to debut its solution, in 2013, with support starting from the GeForce GTX 650 Ti Boost card. Because Nvidia's technology arrived well before AMD's, many have suggested that Nvidia invented adaptive synchronization outright, but that is simply not the case.
As mentioned before, the idea existed long before either G-Sync or AMD’s FreeSync, but both companies proposed a different approach to it.
G-Sync is similar to VSync insofar as it limits extra frames when the monitor can't handle them. However, the major breakthrough here is that Nvidia's solution is implemented in hardware – and available exclusively with their cards.
Nvidia also made monitor manufacturers pay an extra fee to access the G-Sync module, which enables them to run the tech on their monitors.
This exclusive model enabled G-Sync to gain control over the monitor’s refresh rate and over the number of frames being rendered by the GPU. So, if the GPU is rendering 40 FPS, G-Sync will modify the monitor’s refresh rate to 40Hz in order to achieve that perfect synchronization.
However, if the GPU pushes the game’s frame rate over the monitor’s refresh rate, G-Sync will not affect the FPS whatsoever.
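In other words, base G-Sync makes the refresh rate follow the frame rate within the panel's supported range. A rough sketch of that behavior (the range limits here are illustrative, not from any particular monitor):

```python
def gsync_refresh(fps: float, panel_min_hz: float, panel_max_hz: float) -> float:
    """Variable refresh: follow the GPU's frame rate within the panel's
    supported range; above the range, the refresh simply stays at maximum."""
    return max(panel_min_hz, min(fps, panel_max_hz))

# 40 FPS on a hypothetical 30-144Hz panel -> the panel refreshes at 40Hz.
print(gsync_refresh(40, 30, 144))
```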
Before getting into the later incarnations of G-Sync, let’s take a look at AMD’s proposed solution to better compare the two.
Much like G-Sync, AMD’s FreeSync is a step up from the previous solution VSync, and it also pretty much does the same as Nvidia’s tech. In other words, it synchronizes the monitor’s refresh rate with the GPU’s output.
Despite sounding relatively similar and actually delivering basically the same result, there is a reason why FreeSync is generally considered a better option.
As mentioned earlier, Nvidia makes manufacturers pay for the dedicated G-Sync module; although a similar certification is required for monitors to be FreeSync compatible, the difference is that FreeSync, as the name suggests, is free.
You read it right: display makers don’t have to pay an additional fee to get FreeSync certified, which means consumers are not charged anything extra either.
The reason some people in the tech industry have likened this fee to highway robbery and called it a predatory practice is that Nvidia was first to market with this solution by roughly two years – which is a decade in the tech world.
People's reservations toward Nvidia stem from the fact that they haven't made G-Sync free, even after FreeSync came out in 2015. Of course, they're well within their rights, and AMD's decision to release their technology for free was surely in part a marketing strategy to compete with Nvidia, but some concessions were expected when FreeSync hit the market.
A possible reason for keeping the module proprietary and paid is Nvidia's dominance of the GPU market, which allows them to maintain the premium. But enough throwing shade at Nvidia – there are legitimate reasons to pay for G-Sync.
We're talking about the stricter certification rules that Nvidia enforces for G-Sync. The issue with FreeSync is that its label can be applied more liberally, even to monitors with a range as narrow as 50-80Hz. In that case, refresh rate synchronization is only active between 50 and 80 frames per second; below or above that, there's no synchronization at all.
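Using that hypothetical 50-80Hz range, the limitation boils down to a simple bounds check (a sketch of the concept, not AMD's actual driver logic):

```python
def freesync_active(fps: float, range_low: float = 50, range_high: float = 80) -> bool:
    """True when the frame rate sits inside the monitor's FreeSync range;
    outside it, the panel falls back to fixed refresh and can tear or stutter."""
    return range_low <= fps <= range_high

# Inside the range -> synced; a dip to 40 FPS or a spike to 90 FPS -> no sync.
print(freesync_active(60))
print(freesync_active(40))
```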
We'll shortly get to AMD's solution for that, but it's important to note that even basic G-Sync requires monitor manufacturers to support variable refresh across virtually the entire range. This means that if your game is running at 20 FPS, there won't be any screen tearing or stuttering. The same applies if you're gaming at 200 FPS.
Not that AMD is innocent in this competition. The main issue is that they don't require monitor manufacturers to declare what their range is. Manufacturers will advertise it on a flagship monitor they can brag about, but on a budget or even mid-range monitor, there's a real possibility you'll be kept in the dark about the actual range.
Any good tech is bound to spawn new incarnations that further improve on the original design. Both AMD and Nvidia have released variations of their respective technologies, and we'll now go over them individually.
Let’s start with something that may change your perception of Nvidia. First announced at CES 2019, G-Sync Compatible offered to make any adaptive sync monitor (including the ones certified by FreeSync) able to run G-Sync, for no added cost to either the manufacturer or the consumer.
When the program was originally revealed, Nvidia stated that they had tested hundreds of monitors and certified only 12 of them. They have since added 5 more to that list. However, this isn't without its downsides.
Conveniently, what Nvidia doesn't mention is that while it's technically possible to run G-Sync Compatible on any adaptive sync monitor, issues crop up more often than not. Random blanking or stuttering is enough to make a setup unusable, and that's likely to happen if you try it with a non-certified monitor.
Don’t be discouraged by this name. AMD has remained loyal to its free of charge policy.
FreeSync Premium is undeniably an improvement over basic FreeSync, most notably through low framerate compensation (LFC), which fixes the earlier range issue – though only to a certain degree.
What LFC does is adjust the monitor's refresh rate when the GPU's output dips below the monitor's FreeSync range. It does so proportionally: if the range is 40-80Hz and the FPS drops to 30, FreeSync Premium will set the monitor's refresh rate to 60Hz and display each frame twice. It's sort of a duct tape fix, but it has worked so far.
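Conceptually, LFC repeats each frame until the effective refresh rate lands back inside the range. A simplified sketch, assuming the 40-80Hz example range from above (real LFC lives in the driver and is more nuanced):

```python
def lfc_refresh(fps: float, range_low: float = 40, range_high: float = 80) -> float:
    """If FPS is inside the range, refresh follows it directly; below the
    range, show each frame N times so N * fps re-enters the range."""
    if fps >= range_low:
        return min(fps, range_high)
    multiplier = 2
    while fps * multiplier < range_low:
        multiplier += 1
    return fps * multiplier

# 30 FPS falls below the 40Hz floor, so each frame is shown twice at 60Hz.
print(lfc_refresh(30))
```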
Another way in which AMD seeks to improve upon the range issue is with the Premium certification itself. They state that in order for a monitor to support LFC and, consequently, FreeSync Premium, it needs a range ratio of 2.5 or higher. That is, the upper limit of the range divided by the lower limit must be at least 2.5.
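A sketch of that certification check, using the 2.5 ratio as stated above (take the exact figure as this article's number rather than a spec quote):

```python
def supports_lfc(range_low: float, range_high: float, min_ratio: float = 2.5) -> bool:
    """LFC eligibility per the stated requirement: the range's upper limit
    divided by its lower limit must be at least min_ratio."""
    return range_high / range_low >= min_ratio

# A 48-144Hz panel (ratio 3.0) qualifies; a 50-80Hz panel (ratio 1.6) doesn't.
print(supports_lfc(48, 144))
print(supports_lfc(50, 80))
```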
Released at the same time as G-Sync Compatible, at the 2019 CES, G-Sync Ultimate promised to add a ton of new features to an already solid G-Sync tech.
One of its main selling points is 1400 nits HDR, which lets G-Sync Ultimate score much higher in terms of lighting and deliver an extra crisp image.
Other capabilities of Ultimate include refresh rate overclocking, variable overdrive, ultra-low motion blur display modes, full matrix backlight, and DCI-P3 color.
Yet the most surprising aspect is refresh rate overclocking. Overclocking CPUs and GPUs is old news, but overclocking a monitor is highly useful, as it can save you a lot of money.
FreeSync Premium Pro
FreeSync Premium Pro further improves on the original FreeSync formula from AMD. It retains its predecessor's LFC capabilities and promises to deliver the best HDR experience AMD has to offer.
However, its promised 400 nits are a letdown compared to the 1400 nits of its competitor, G-Sync Ultimate.
Which One Is The Best?
Considering their various versions, it can’t be outright said which solution is superior. To reach a decision, we’ll compare these different versions with their appropriate competition.
The lowest tier pits G-Sync Compatible against basic FreeSync. While Nvidia offers variations above and below the base G-Sync tech, AMD improved upon their original design, so these two are fairly comparable. Since neither adds cost, availability becomes the deciding factor, and FreeSync's far wider monitor support makes it our choice here.
In the middle ground, we have FreeSync Premium and G-Sync. These are probably the hardest solutions to choose between. While G-Sync is technically a better technology, FreeSync Premium has the advantage of being free. In this case, you should probably make a personal decision based on your budget.
At the top, we have G-Sync Ultimate and FreeSync Premium Pro, and there is a very clear winner. Although FreeSync Premium Pro is free and a great solution, G-Sync Ultimate is a far superior tech and is well worth the added price.
Overall, if you're shopping for a new monitor or a new GPU, be aware of each side's refresh rate synchronization capabilities and keep in mind that, whatever your budget, there's something out there for everyone.