If you’ve ever tinkered with the video settings of a game, you have likely stumbled across the option to turn VSync on or off.
In the majority of cases, this is not accompanied by any explanation as to what VSync is, and the name itself does little to clear up the confusion.
VSync is short for Vertical Synchronization, and it is used to fight the problem of screen tearing.
In essence, this issue occurs when your graphics card produces more frames than your monitor is capable of displaying on your screen.
For example, if your GPU can output 80 frames per second but your monitor has a 60Hz refresh rate, the monitor will attempt to squeeze those 80 frames into its 60 refresh cycles. As a result, parts of two different frames can end up on screen at the same time, which we perceive as screen tearing.
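To see why those numbers don't line up, here is a small illustrative Python sketch. The 80 FPS and 60Hz figures are just the example values from above, and the timing model is deliberately simplified:

```python
# Illustrative sketch: why an 80 FPS GPU tears on a 60 Hz monitor.
GPU_FPS = 80
REFRESH_HZ = 60

frame_time = 1000 / GPU_FPS       # ~12.5 ms between rendered frames
refresh_time = 1000 / REFRESH_HZ  # ~16.7 ms between screen refreshes

# Check the first few refresh cycles: whenever a new frame finishes
# partway through a refresh, the monitor ends up scanning out parts of
# two different frames at once -- that boundary is the visible "tear".
for cycle in range(4):
    start = cycle * refresh_time
    end = start + refresh_time
    frames_ready = [f for f in range(GPU_FPS)
                    if start < f * frame_time < end]
    if frames_ready:
        swaps = [round(f * frame_time, 1) for f in frames_ready]
        print(f"Refresh {cycle}: buffer swapped mid-scan at {swaps} ms -> tearing")
```

Because 12.5 ms and 16.7 ms never line up, a swap lands inside every single refresh cycle, so tearing is constant rather than occasional.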
What Is VSync?
Although PC gaming has now existed for a few decades, it wasn’t until the rapid technological advancement of graphics cards in the 2000s that we encountered the issue of screen tearing. This is because GPU technology quickly moved forward while monitor technology progressed far more slowly.
Screen tearing was accepted by some gamers as simply playing a waiting game until monitors with faster refresh rates were developed. Fortunately, some innovative people had other ideas.
VSync is the original solution developed to tackle the screen tearing problem. It works by capping the GPU's frame output in software to match the monitor's maximum refresh rate. Theoretically, this sounds like a perfect solution, but in practice there are several issues, and they are the reason why VSync is largely considered obsolete in 2023.
One of these issues occurs when the GPU is unable to perform up to the monitor’s refresh rate. As a result, the monitor will leave the previous image on display until the next one is ready, which causes visual stuttering.
Fortunately, there are now technologies much better equipped to deal with this problem, but more on those later.
Likely the biggest issue with VSync is its input lag. This is particularly frustrating in games where quick reactions are necessary, such as shooters. It's an even greater problem in multiplayer, where an opponent could end up beating you simply because they use a different syncing solution.
Should You Use VSync?
From what has been said so far, the simplest answer would be yes, you should use VSync. The advantages are clear, while the disadvantages are less likely to hinder you.
However, the more nuanced answer is that you should only use VSync if it’s absolutely necessary. The reason for this is simple: there are better alternatives.
As mentioned previously, VSync technology is quite rudimentary, and it didn't take long for the GPU giants to provide their own solutions.
First and foremost, we have to give credit where it’s due; VSync was the original solution. It was a good solution at the time and became a staple of graphics settings throughout the ensuing decade.
While we can commend VSync for its success, we have to admit that its time has largely passed and that there are now better alternatives available. Let’s take a look at which one is currently the best.
AdaptiveSync

First up, we have AdaptiveSync, which is different from NVIDIA's Adaptive VSync (note the extra 'V'). AdaptiveSync is the only technology mentioned here that wasn't developed by either AMD or NVIDIA. It was developed by VESA, the organization responsible for the widely used DisplayPort standard.
To avoid confusion, both AMD and NVIDIA are members of VESA, but they weren’t part of the development process. AdaptiveSync is also a free standard, which means any member of VESA can use it. Both AMD and NVIDIA used it to develop their own brand-specific screen-tearing solutions.
Probably the best thing about this standard is the way it smooths out stuttering when the FPS drops below the monitor’s refresh rate. It’s so good that it can feel like there are more frames than there actually are.
AdaptiveSync manages to pull this off by allowing an image to be displayed as soon as it's completely rendered while keeping the previous image up in the meantime. How is that different from VSync?

It's quite simple, really. AdaptiveSync changes the monitor's refresh rate on the fly and has the monitor wait until the next frame is ready before displaying it. You can take a look at the image below to better visualize this.
FastSync

FastSync is NVIDIA’s version of AdaptiveSync and an upgrade of its own Adaptive VSync, which was considered a bit of a mess. As the industry leader that it is, NVIDIA quickly responded to its own failure with FastSync.
FastSync attempts to achieve the same thing as the AdaptiveSync standard but runs into some issues where stuttering and chopping are more noticeable. It does its job, but we can only recommend using it if you’re gaming online.
It’s worth pointing out that, much like the tech it’s trying to emulate, it’s still a better choice than VSync.
Enhanced Sync

While the previous entry was NVIDIA’s attempt at fixing the vertical synchronization issues, this is AMD’s crack at it.
There isn’t much more to say about the way AMD addressed this. Enhanced Sync can largely be considered the same as FastSync, but with AMD’s branding all over it.
The same issue that affects FastSync, occasional stuttering, is also noticeable here. Although both technologies were developed to eliminate stuttering, neither succeeded entirely. However, they were both great attempts, considering their respective release dates. All in all, Enhanced Sync does manage to decrease the stuttering level of VSync.
G-Sync

Now we’re cooking with gas. G-Sync is NVIDIA’s variable refresh rate solution for screen tearing, and we’re happy to report that it works perfectly.
For NVIDIA, being the first company to release a groundbreaking product has its benefits. NVIDIA was determined to take full advantage of this and sought to leave no money on the table.
That’s right, as much as G-Sync is the perfect solution for screen tearing, it also comes at a price.
For NVIDIA to be able to solve the problem of adjusting the monitor’s refresh rate, they had to invent a module that is implemented directly into the monitors themselves. This module, in conjunction with the software solution, is able to dynamically change the monitor’s refresh rate to match the frames produced by the GPU.
This excellent and innovative solution requires the installation of the module by monitor manufacturers. As such, NVIDIA chose to charge money for it, hence the additional cost for consumers.
The key reason why G-Sync is a better technology than VSync is that it eliminates input lag. VSync’s major drawback is finally a thing of the past.
G-Sync Ultimate

This is the latest addition to the G-Sync brand and can finally somewhat justify the additional cost. NVIDIA made sure the upgrade was noticeable: monitor manufacturers must pass a thorough and demanding certification process before they can implement the G-Sync Ultimate module.
The reason for this is simple: G-Sync Ultimate brings a lot to the table. NVIDIA added 152 backlight zones, which means the IPS panel can produce HDR images with additional precision.
FreeSync

As you might have guessed, this is AMD’s screen-tearing solution. However, AMD decided to make its technology completely free to use, hence the name FreeSync.
AMD’s reasoning was probably twofold. NVIDIA was first to market, and AMD lagged behind; NVIDIA also chose to charge money for its product, so making FreeSync free gave AMD a competitive edge.
To avoid any uncertainty: FreeSync is every bit as good as G-Sync, and the fact that it’s free makes it a more attractive option. However, things aren’t always that simple, and that’s also the case here.
One of the reasons why NVIDIA charges monitor manufacturers for the use of its G-Sync module is to ensure the exclusivity of its cards. G-Sync monitors can only work with NVIDIA cards.
One might think that AMD would do the same, but the reality is that FreeSync monitors can be used with both AMD and NVIDIA cards. There is a caveat, however: not every FreeSync monitor is compatible with NVIDIA cards.
In 2019, NVIDIA surprised the industry when it announced that its graphics cards would support FreeSync monitors under the “G-Sync Compatible” banner. Unsurprisingly, there are rigorous standards to be met, but the fact remains that a FreeSync monitor is currently the best option.
Much like G-Sync technology, FreeSync reduces the frustrating input lag. Something else that AMD can boast about is FreeSync’s reduced flicker, which really comes in handy in those long gaming sessions.
FreeSync Premium

AMD attempted to improve upon their FreeSync model with FreeSync Premium, and we can say that they did a really good job.
Premium brings a host of improvements, with its most notable being the Low Framerate Compensation. This innovation responds to the frame rate dropping below the monitor’s range by doubling the FPS number and using it as the monitor’s refresh rate. This eliminates screen tearing and, although the frame rate will be lower, it will at least be consistent.
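The doubling behavior described above can be sketched in a few lines of Python. The 48–144Hz variable-refresh range below is an assumed example; real ranges vary from monitor to monitor:

```python
# Illustrative sketch of Low Framerate Compensation (LFC): when FPS drops
# below the monitor's variable-refresh range, the refresh rate is set to
# a multiple of the frame rate and each frame is simply shown more than once.
MONITOR_MIN_HZ = 48    # assumed lower bound of the VRR range
MONITOR_MAX_HZ = 144   # assumed upper bound

def lfc_refresh_rate(fps: float) -> float:
    """Pick a refresh rate inside the monitor's VRR range for a given FPS."""
    rate = fps
    # Double the rate until it falls inside the supported range; each
    # frame is then repeated on consecutive refreshes, so motion stays
    # consistent even though no extra frames are actually rendered.
    while rate < MONITOR_MIN_HZ:
        rate *= 2
    return min(rate, MONITOR_MAX_HZ)

print(lfc_refresh_rate(25))   # 25 FPS -> monitor runs at 50 Hz, each frame shown twice
print(lfc_refresh_rate(60))   # already inside the range -> 60 Hz, no doubling
```

The frame rate the player experiences is still 25 FPS in the first case, but because every refresh lands on schedule, the result is consistent motion instead of tearing or judder.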
FreeSync Premium Pro
As its name suggests, Premium Pro is another upgrade of FreeSync technology. In this case, it aims to maintain stability as AMD’s ray-tracing-enabled cards are released.
Much like G-Sync Ultimate, Premium Pro aims to maintain the proper refresh rate of the monitor to display HDR images that are produced by the latest graphics cards.
Why You Shouldn’t Use VSync
As suggested, VSync simply isn’t an optimal solution. The worst part is that it causes input lag, which no gamer wants to deal with.
If available, we would recommend either G-Sync or FreeSync, depending on which graphics card you have. As mentioned earlier, FreeSync appears to be the better choice at this point, with even NVIDIA adding support for it.
It’s important to point out that, if your monitor supports neither G-Sync nor FreeSync, you’ll have to fall back on VSync, which will at least eliminate screen tearing.