You have probably heard of FreeSync and seen it compared to G-Sync and VSync, and all of that can get complicated really quickly. These are all solutions to the same problem, with AMD's FreeSync (and its later iterations) being the focus for today.
Before we dive into what different incarnations of FreeSync bring to the table and where they differ from one another, it’s important to first understand the problem they’re trying to fix, as well as other solutions to the same issue.
Of course, we're talking about the dreaded screen tearing. This effect appears when the graphics card's output falls out of step with the monitor's refresh cycle: either the GPU is producing more frames than the monitor can display, or the monitor refreshes faster than the GPU can supply new frames. Either way, a single refresh can end up showing parts of two different frames at once.
This is particularly annoying when you invested in a high-performing GPU while keeping your old 60Hz monitor, and you’re looking to play an FPS multiplayer game only to have your surroundings tear in two, and cost you some much-valued kills.
So, FreeSync Solves Screen Tearing?
Well, yes. But it's not the only technology that aims to solve tearing, nor was it the first, and to best understand why we have FreeSync today and what its best uses are, we must consider the other solutions as well.
Vertical Synchronization – VSync
VSync is probably the first option that comes to mind, as it has been a staple of PC games' video settings for years.
VSync can be considered the original technology for this problem and although it’s far from ideal, it can produce expected results.
In essence, vertical synchronization forcefully caps your GPU's output at the monitor's refresh rate (60 FPS on a 60Hz display) to stop the monitor from playing catch-up and to have the monitor's refresh rate and the graphics card's output be, well, synced up. This means your high-end GPU won't get the opportunity to perform to the fullest of its capabilities, but it does solve the screen tearing issue.
Well, in that particular situation it does. If the GPU's performance dips below 60 FPS, as can happen when a scene becomes more demanding to render, the monitor simply shows the last rendered frame again until a new one is ready. Each individual repeat is over in a split-second, but strung together they are perceived as stutter.
So an annoyance remained even with VSync, although now it was a slightly different one: stutter instead of tearing. Nvidia was the first to offer a competing solution.
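The double-buffered VSync behavior described above can be sketched in a few lines of Python. This is a simplified timing model, not how any real driver works: the monitor ticks at a fixed 60Hz, and if the GPU hasn't finished the next frame by refresh time, the old frame is shown again.

```python
REFRESH_INTERVAL = 1.0 / 60  # fixed 60 Hz monitor: ~16.7 ms per refresh

def frames_shown(render_time, refreshes=6):
    """Return which frame is on screen at each of the first `refreshes`
    refreshes, given a fixed per-frame GPU render time in seconds."""
    shown = []
    frame = 0                 # frame currently on screen
    next_done = render_time   # when the GPU finishes the next frame
    for i in range(1, refreshes + 1):
        t = i * REFRESH_INTERVAL
        if next_done <= t:    # new frame ready: swap at this refresh,
            frame += 1        # then the GPU starts on the following frame
            next_done = t + render_time
        shown.append(frame)   # otherwise the old frame repeats (stutter)
    return shown

# GPU renders a frame in 11 ms (~90 FPS capable): one new frame per refresh, capped at 60 FPS
print(frames_shown(0.011))   # [1, 2, 3, 4, 5, 6]
# GPU renders a frame in 22 ms (~45 FPS capable): frames repeat, effectively 30 FPS
print(frames_shown(0.022))   # [0, 1, 1, 2, 2, 3]
```

Note how a GPU that misses the 16.7 ms deadline doesn't just lose a little speed: it has to wait for the next whole refresh, which is why VSync framerates tend to fall in steps (60, 30, 20 FPS).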
G-Sync
Initially released in late 2013, G-Sync is Nvidia's hardware solution to screen tearing. What G-Sync does is allow the display's refresh rate to adapt to the graphics card's output, rather than the other way around.
Nvidia achieved this with a form of collision avoidance: when a new frame is ready to be output while the previous frame is still being drawn on screen, the new frame waits for the current refresh to finish before being displayed.
The biggest issue with this is that Nvidia made the monitor manufacturers use the G-Sync module that Nvidia sells.
You may be wondering why that affects you as a consumer. In your actual gaming you won't notice any issue, but it comes at a price: display manufacturers have to pay Nvidia for the module, so they raise their prices to balance the books.
And of course, G-Sync is only available for Nvidia’s GPUs.
Another exclusivity is connectivity: G-Sync originally ran only over DisplayPort 1.2, whereas FreeSync was built on DisplayPort 1.2a's Adaptive-Sync and can now also run over HDMI 1.4 and later.
FreeSync
FreeSync is AMD's solution to screen tearing, and although it arrived about a year and a half after G-Sync, many consider it the better solution. The rationale isn't performance; in that department the two technologies are, as a matter of fact, pretty even-keeled.
For that very reason, it's especially infuriating that Nvidia still charges money for its tech, while AMD has allowed FreeSync, as the name suggests, to be used for free since its launch.
This is further highlighted by Nvidia requiring its proprietary module in G-Sync-ready monitors, while AMD builds on VESA's open Adaptive-Sync standard. That makes FreeSync much more readily available and lowers the prices of monitors that support it.
This openness doesn’t signify that manufacturers can simply label their monitors as “FreeSync Ready” and move on. Like with G-Sync, they need to pass certain requirements set by AMD, but hey, at least that certification doesn’t cost a thing.
It's also interesting to note that there is a way to get G-Sync running on a FreeSync monitor, although that requires some tinkering with the settings and doesn't always return the best results. Of course, this division doesn't mean an Nvidia card can't run on a FreeSync monitor or vice versa; in fact, they run just fine together, but they won't be able to use their signature synchronization technologies.
FreeSync works pretty much the same way G-Sync does in that it dynamically adjusts and synchronizes the monitor refresh rate and frames per second being outputted so that there’s no screen tearing whatsoever.
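That dynamic adjustment can be sketched as a simple clamp: a minimal model (illustrative numbers, not any vendor's actual implementation) in which the monitor refreshes as soon as a frame arrives, as long as the interval stays inside the panel's supported range.

```python
def refresh_interval(frame_time, min_hz=48, max_hz=144):
    """Clamp the GPU's frame time to the monitor's supported refresh
    window, so the panel scans out in step with the GPU (hypothetical
    48-144 Hz range)."""
    return min(max(frame_time, 1.0 / max_hz), 1.0 / min_hz)

# GPU at 100 FPS (10 ms/frame): the monitor simply refreshes at 100 Hz
print(round(1.0 / refresh_interval(0.010)))  # 100
# GPU at 200 FPS (5 ms/frame): clamped to the panel's 144 Hz ceiling
print(round(1.0 / refresh_interval(0.005)))  # 144
```

Within the supported range the monitor is effectively slaved to the GPU, which is why neither tearing nor VSync-style quantized framerates occur.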
Both technologies also fix another long-standing VSync problem: input lag. The delay between the frame the GPU is actually working on and the frame you see on screen has been VSync's biggest point of contention.
Even though it was not specifically designed to be easier on the eyes, FreeSync can also boast reduced flicker, a benefit that shouldn't be ignored as easily as it often is.
FreeSync Premium
The name alone kind of sounds like there's a separate version that costs extra, and it almost feels like a cheap shot from AMD, eerily reminiscent of the detested freemium model in the video game world. However, that is absolutely not the case here.
Introduced to the world at CES 2020, FreeSync Premium looks to build on its predecessor and retain its features while also adding its own unique touches.
One of the additions is low framerate compensation (LFC), which handles the framerate dropping below the monitor's supported range. Say the FPS falls under a monitor whose range bottoms out at 30Hz: LFC raises the monitor's refresh rate to an integer multiple of the frame rate, displaying each frame more than once. So if the game runs at 25 FPS, LFC sets the refresh rate to 50Hz, each frame is shown twice, and the gamer is still protected from screen tearing.
Another cool thing about Premium is that certification requires the monitor to support at least a 120Hz refresh rate at full high-definition resolution (1920 × 1080), so any Premium display is guaranteed to be a high-refresh one.
The downside of FreeSync Premium is that it’s a relatively new technology and not a lot of monitors out there support it. Of course, this is purely an age issue as it’s certain that as time goes on, the number of supported monitors will increase.
FreeSync Premium Pro
It's obvious that AMD struggled with naming this one if this is the best they came up with. It was originally known by the similarly creative name FreeSync 2 HDR, but it seems they needed a way to convey that Premium Pro is a step above Premium.
This edition is aimed at those playing in HDR, which will come in handy when Big Navi finally gets on the ray-tracing bandwagon.
What this HDR (high dynamic range) support means for Pro is that it will deliver smooth HDR performance, while plain FreeSync and FreeSync Premium will be limited by processing bandwidth.
Like Premium, Pro retains all of its predecessor’s features, including Premium’s LFC.
Which One Is The Best For You?
Although each of these incarnations of FreeSync eliminates the core issue of screen tearing, it’s very obvious that FreeSync Premium Pro is the best option right now.
With ray tracing coming to AMD's brand of GPUs, getting a monitor that supports HDR is a must if you're looking to have extraordinary visuals in your games.
Both DisplayPort 1.4 and HDMI 2.1 have been announced to feature native variable refresh rate (VRR) support, so if that's the deciding factor, it might be wise to hold off on a FreeSync monitor as those alternatives gain traction.
Although, it's also worth mentioning that research into how prices will diverge has so far been inconclusive.
For now, the general consensus around the world of technology still seems to be: hold off on getting new tech, just a little bit longer.