As an introduction: monitors typically operate at a fixed refresh rate, whether
that is 60Hz, 120Hz or 144Hz. When running graphically intense content like
games, the frame rate will fluctuate somewhat, and this poses a potential issue
to the user. Traditionally there are two main options for how frames are passed
from the graphics card to the monitor, controlled by a feature called Vsync
(vertical sync), which can be turned on or off.
At the most basic level, Vsync off allows the GPU to send frames to the monitor
as soon as they have been processed, irrespective of whether the monitor has
finished its refresh and is ready to move on to the next frame. This allows you
to run at higher frame rates than the refresh rate of your monitor, but can lead
to a lot of problems. When
the frame rate of the game and refresh rate of the monitor are different, things
become unsynchronised. This lack of synchronisation coupled with the nature of
monitor refreshes (typically from top to bottom) causes the monitor to display a
different frame towards the top of the screen vs. the bottom. This results in a
visible "tear" on the monitor that really bothers some users. Even on a 120Hz or
144Hz monitor, where some users incorrectly claim that there is no tearing, it
is still present. It is generally less noticeable, but it is definitely still there.
Tearing can become particularly noticeable during faster horizontal motion (e.g.
turning, panning, strafing), especially at lower refresh rates.
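As a rough sketch of why a tear appears where it does: if a new frame is swapped in part-way through the panel's top-to-bottom scanout, the switch happens at whatever row is currently being drawn. The model below is a simplification we've written for illustration (it ignores the blanking interval and assumes a constant scanout rate); the function name and values are our own, not from any real API.

```python
def tear_line_y(swap_time_ms: float, refresh_hz: float, height_px: int) -> int:
    """Estimate the row where a tear appears if a frame swap happens
    mid-scanout (simplified: no blanking, constant top-to-bottom scan)."""
    interval_ms = 1000.0 / refresh_hz          # time for one full refresh
    phase = (swap_time_ms % interval_ms) / interval_ms  # fraction scanned out
    return round(phase * height_px)            # current scanout row = tear row

# A swap 10 ms into a 60Hz refresh (16.7 ms) tears about 60% of the way down
# a 1440-pixel-tall panel:
print(tear_line_y(10.0, 60.0, 1440))  # 864
# A swap exactly on a refresh boundary produces no visible tear:
print(tear_line_y(0.0, 60.0, 1440))   # 0
```

This also illustrates why higher refresh rates make tearing less noticeable rather than eliminating it: the tear still occurs, but each torn frame is on screen for a shorter time.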
The traditional solution to this tearing problem for many years has been the
Vsync on option, which essentially forces the GPU to hold a frame until the
monitor is ready to display it, i.e. once it has finished displaying the
previous frame. It also locks the frame rate to a maximum equal to the monitor's
refresh rate. Whilst this eliminates tearing, it also increases input lag,
as there is an inherent delay before frames are sent to the monitor. On a 120Hz
monitor the lag penalty is half that of a 60Hz monitor and on a 144Hz monitor is
even lower. It is still there, though, and some users feel it disconnects them
from game play somewhat. When the frame rate drops below the refresh rate of the
monitor this disconnected feeling increases to a level that will bother a large
number of users. Some frames will be processed by the GPU more slowly than the
monitor is able to display them. In other words the monitor is ready to move
onto a new frame before the GPU is ready to send it. So instead of displaying a
new frame, the monitor displays the previous frame again, resulting in visible
stutter. Stuttering can be a major problem when using the Vsync on option to reduce tearing.
During Vsync ON operation, there can also sometimes be a sudden slow down in
frame rates when the GPU has to work harder. This creates situations where the
frame rate suddenly halves, such as 60 frames per second slowing down to 30
frames per second. During Vsync ON, if your graphics card is not running
flat-out, these frame rate transitions can be very jarring. These sudden changes
in frame rate create sudden changes in lag, and this can disrupt game play,
especially in first-person shooters.
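The sudden halving described above falls out of how double-buffered Vsync works: a finished frame can only be shown on a refresh boundary, so if rendering takes even fractionally longer than one refresh interval, the frame waits for the next boundary. A minimal sketch of that arithmetic (our own illustrative function, not a real driver API):

```python
import math

def vsync_effective_fps(render_ms: float, refresh_hz: float) -> float:
    """With double-buffered Vsync on, a frame is only displayed on a refresh
    boundary, so the delivered rate is refresh_hz divided by the number of
    whole refresh intervals each frame takes to render."""
    interval_ms = 1000.0 / refresh_hz
    intervals_needed = math.ceil(render_ms / interval_ms)
    return refresh_hz / intervals_needed

# A 16 ms frame fits inside one 60Hz interval (16.7 ms) -> full 60 fps
print(vsync_effective_fps(16.0, 60))   # 60.0
# A 17 ms frame just misses the boundary -> rate suddenly halves to 30 fps
print(vsync_effective_fps(17.0, 60))   # 30.0
# The same 17 ms frame on a 120Hz panel only drops to 40 fps (120/3)
print(vsync_effective_fps(17.0, 120))  # 40.0
```

This is also why the article notes the penalty is gentler on 120Hz and 144Hz panels: the quantisation steps between possible frame rates are much smaller.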
Variable Refresh Rates

To overcome these limitations with Vsync, both NVIDIA and AMD have introduced
new technologies dubbed G-sync and FreeSync respectively. G-sync was launched in
mid-2014, with the first screen we tested being the
Asus ROG Swift PG278Q.
FreeSync was not launched until 19th March 2015.
Both options currently require a DisplayPort 1.2 interface from the graphics
card and monitor to operate. At the time of writing
AMD are known to be experimenting with providing the same functionality over
HDMI, but it is still early days and only in proof-of-concept stages.
The idea of both technologies is based on variable refresh rates. These
technologies can be integrated into monitors allowing them to dynamically alter
the monitor refresh rate depending on the graphics card output and frame rate.
The maximum refresh rate of the monitor is still limited in much the same way as
without a variable refresh rate technology, but within its supported range the
refresh rate adjusts dynamically to match the frame rate of the game. By doing
this, the monitor refresh rate is perfectly synchronised with the GPU. You don't
get the screen tearing of having Vsync disabled, nor do you get the stuttering
or input lag associated with using Vsync. You get the benefit of higher frame
rates from Vsync off, but without the tearing, and without the lag and
stuttering caused if you switch to Vsync on.
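The core principle can be sketched in a few lines: within the panel's supported range, the monitor simply refreshes when the frame arrives, so refresh rate equals frame rate. The range limits below are illustrative values we've chosen, not any specific monitor's specification, and real drivers handle the below-range case in more elaborate ways.

```python
def vrr_refresh_hz(frame_ms: float, min_hz: float = 40.0,
                   max_hz: float = 144.0) -> float:
    """Within the supported VRR range the panel refreshes exactly when a
    frame arrives, so refresh rate tracks frame rate; outside the range
    it is clamped to the panel's limits (simplified model)."""
    instantaneous_fps = 1000.0 / frame_ms
    return max(min_hz, min(max_hz, instantaneous_fps))

# A 20 ms frame (50 fps) is matched exactly: the panel refreshes at 50Hz
print(vrr_refresh_hz(20.0))  # 50.0
# A 5 ms frame (200 fps) exceeds the panel's range and is capped at 144Hz
print(vrr_refresh_hz(5.0))   # 144.0
```

Because the refresh always lands on a completed frame, neither the tearing of Vsync off nor the repeat-frame stutter of Vsync on occurs while the frame rate stays inside the range.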
G-Sync vs. FreeSync
Both G-sync and FreeSync operate on this principle of dynamically controlling
the refresh rate. There are a few differences between how the technology is
implemented though. NVIDIA G-sync requires a proprietary G-sync module to be
added to the monitor, which comes at quite a high cost premium. You will notice
as a result that the retail price of compatible G-sync monitors is often £100 -
200 higher than similar competitors because of this module. There is another
limitation with adding a G-sync module in that it is only designed to work with
a single interface currently, and so supporting monitors only offer a single
DisplayPort connection. That makes those monitors somewhat restrictive when it
comes to attaching any other devices or computers. Updated G-sync modules will
apparently allow additional connections to be offered, although we’ve yet to see
any displays released at the time of writing. The screens are also provided
without a scaler, and so hardware aspect ratio control options are not offered.
That's not as important as on some screens given you are restricted to a single
DisplayPort interface anyway, and the PC can handle the scaling for you. It does
also mean that signal processing lag is incredibly low as a result, another
positive for gaming.
G-sync modules also support a native
blur reduction mode
dubbed ULMB (Ultra Low Motion Blur). This allows the user to opt for a strobe
backlight system if they want, in order to reduce perceived
motion blur in
gaming. It cannot be used at the same time as G-sync since ULMB operates at a
fixed refresh rate only, but it's a useful extra option for these gaming
screens. Of course since G-sync/ULMB are an NVIDIA technology, it only works
with specific G-sync compatible NVIDIA graphics cards. While you can still use a
G-sync monitor from an AMD/Intel graphics card for other uses, you can't use the
actual G-sync or ULMB functions.
On the other hand, AMD FreeSync technology costs virtually nothing for a monitor
manufacturer to adopt, and so there is really no price premium for supporting
monitors, hence the name. Most manufacturers already had the relevant components
in their supply chains, but needed the right software to come along to expose
those latent capabilities. With the help of VESA, the DisplayPort Adaptive-Sync
specification was born to do exactly that.
Adaptive-Sync has no unique material or licensing costs, and AMD FreeSync
technology builds on top of that industry standard to give gamers a benefit in
all of their games. No licensing. No proprietary hardware. No incremental
hardware costs. So as the name suggests, the key advantage of FreeSync really is
in the cost!
Additionally, since you don't need a dedicated extra module added to the screen,
you can still offer multiple video inputs on the screen without problems. You
can only use FreeSync over DisplayPort, but there's no issue with including
HDMI, DVI, D-sub etc. to offer the user multiple interface options. You're not
limited to a single DisplayPort input as you currently are with G-sync. Scalers
can also be provided as normal for
hardware aspect ratio control although there may be additional signal processing
lag added depending on the electronics and scalers manufacturers use. They will
need to focus closely on reducing lag as they do with current non-FreeSync
monitors. There is no native blur reduction mode coupled with FreeSync support
so it is down to the display manufacturer whether they add an extra blur
reduction method themselves. FreeSync can only be used from
AMD graphics cards,
and you cannot use FreeSync from an NVIDIA card. You can still use a FreeSync
monitor with an NVIDIA card without problems, just not the actual FreeSync
functionality. We don't want to get into any kind of NVIDIA vs. AMD debate
here. What this really boils down
to is whether you're an NVIDIA or AMD graphics user. At the moment there isn't a
single standard which works from all graphics cards, so you need to pick a
monitor to match your graphics choice. AMD's option is cheaper and more
versatile for manufacturers to adopt, but we don't feel that alone will mean a
much larger selection of FreeSync monitors to choose from. At the end of
the day the monitor manufacturers need to cater for their audience, and with
such a huge market share from NVIDIA they would be mad to ignore G-sync
offerings. Maybe at some point there will be a common approach between NVIDIA
and AMD but with both technologies being so new at the moment, we can't see that
happening for a while. G-sync may be more expensive, and limited when it comes
to connection options at the moment, but don't forget there is the added benefit
of the native ULMB mode included.
More information about G-sync and FreeSync is available from NVIDIA and AMD
respectively.
G-Sync vs. FreeSync
G-sync:
- Included ULMB blur reduction mode
- Low processing lag since no scalers are included
- NVIDIA has a larger graphics card market share than AMD
- Very few bugs or issues with G-sync operation since introduction

FreeSync:
- Very low cost to implement
- Multiple interface options can still be offered
- Scalers can still be provided
- Vsync on/off option for frequencies out of supported FreeSync range
- Some very minor performance benefits over G-sync
- No integrated blur reduction mode
- Possible additional processing lag added by scalers
- Some early teething problems with FreeSync affecting overdrive circuits on monitors
There are plenty of reviews and tests of G-sync online which cover the operation
of G-sync in more detail, including G-sync testing in various games which is
well worth a read.
They've also carried out
various lag tests
which have confirmed that using G-sync doesn't seem to add any noticeable lag,
compared with running with Vsync off.
Above: G-sync options in the NVIDIA control panel
It should be noted that the real benefits of G-sync really come into play when
viewing lower frame rate content; around 45 - 60fps typically delivers the best
results compared with Vsync on/off. At consistently higher frame rates, as you
get nearer to 144fps, the benefits of G-sync are not as great, but still
apparent. There will be a gradual transition period for each user where the
benefits of using G-sync decrease, and it may instead be better to use the
included ULMB mode, which cannot be used at the same time as G-sync. Higher end gaming machines
might be able to push out higher frame rates more consistently and so you might
find less benefit in using G-sync. The ULMB could then help in another very
important area, helping to reduce the perceived motion blur caused by LCD
displays. It's nice to have both G-sync and ULMB available to choose from
certainly on these G-sync enabled displays. Very recently NVIDIA has added the
option to choose how frequencies outside of the supported range are handled.
Previously it would revert to Vsync on behaviour, but the user now has the
choice for Vsync on or off.
FreeSync can support dynamic refresh rates between 9 and 240Hz, but the actual supported
ranges depend on the display, and this does vary. When you connect the display
with the relevant driver package installed, the display is detected as FreeSync
compatible and a pop-up message confirms this.
Within the Catalyst Control Centre there is an added configuration option for
FreeSync at the bottom of the panel. Once enabled, FreeSync ON is often also
confirmed somewhere in the OSD menu of the display.
We don't want to go into too much depth about game play, frame rates and the
performance of FreeSync here as we will end up moving away from characteristics
of the monitor and into areas more associated with the operation of the graphics
card and its output. FreeSync is a combined graphics card and monitor
technology, but from a monitor point of view all it is doing is supporting this
feature to allow the graphics card to operate in a new way. We'd encourage you
to read some of the FreeSync reviews online as they go into a lot more detail
about graphics card rendering, frame rates etc as well.
Within the AMD press material and presentation which we were invited to, they
had carried out some tests of FreeSync and noted a minor improvement in frame
rates when FreeSync was enabled. We're talking extremely minor here, up to about
half a frame difference at best. Still, it's better than any drop in
performance! On the other hand they found a minor drop in frame rate performance
when testing NVIDIA G-sync, down by a couple of frames at most. NVIDIA have
acknowledged this minor performance drop in the past, although said they were working on it.
For those interested, there is some more information in AMD's press material
about the test environments used.
The results obtained by AMD were as follows:

Above: AMD tests of FreeSync and G-sync frame rates
AMD concluded from their tests that enabling
FreeSync maintains a more consistent performance vs. the competition:
Really any difference here is extremely minor and
won't make any practical difference to the user.
Anandtech actually carried out some further tests themselves and found no
discernable difference between the two solutions. Perhaps there's some very
minor difference somewhere, but not something to worry about at all. We only
include it here as you're likely to hear about this in any AMD vs. NVIDIA
debate. In addition to these tests, AMD also checked how
each solution behaves if you operate outside of the supported range. i.e. what
happens if you provide a frame rate above the maximum supported 144Hz, or below
the bottom end of the range supported by the monitor. With AMD FreeSync you have
the option as a user to either have V-sync on or off for operation above the
maximum supported refresh rate. So if you have a powerful enough system you are
able to output more frames if you want. Obviously you're back into the realms of
possible tearing etc with V-sync off, or into the realms of some possible lag
and stutter with V-sync on, but you have the choice as the user at least.
Their tests here confirm that operation, using
V-sync off when out of range of FreeSync. Again, the test environment is
described in AMD's material if you want more info.
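The out-of-range behaviour described above can be summarised as a simple decision: inside the panel's VRR window the refresh tracks the frame rate; above it, the user's Vsync preference decides between tearing and lag; below it, the panel falls back to repeating frames. This is our own illustrative sketch of that logic with made-up names and range values, not AMD driver code.

```python
def freesync_mode(fps: float, vrr_min: float, vrr_max: float,
                  vsync_above_max: bool) -> str:
    """Classify how a frame rate is handled relative to a FreeSync range
    (illustrative model of the behaviour described in AMD's material)."""
    if vrr_min <= fps <= vrr_max:
        return "vrr"          # refresh rate tracks the frame rate exactly
    if fps > vrr_max:
        # User choice above the range: cap with Vsync on, or allow tearing
        return "vsync-on" if vsync_above_max else "vsync-off"
    return "below-range"      # panel repeats frames at its minimum refresh

print(freesync_mode(90, 40, 144, True))    # vrr
print(freesync_mode(200, 40, 144, False))  # vsync-off: tearing possible again
print(freesync_mode(200, 40, 144, True))   # vsync-on: capped, some lag returns
```

The key point is the middle branch: unlike the original G-sync behaviour (which always reverted to Vsync on above the range), FreeSync leaves that trade-off to the user.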
It should be noted that the real benefits of variable refresh rate technologies
really come into play when viewing lower frame rate content, around 40 - 75fps
typically delivers the best results compared with Vsync on/off. At consistently
higher frame rates as you get nearer to 144 fps the benefits of FreeSync (and
G-sync) are not as great, but still apparent. There will be a gradual transition
period for each user where the benefits of using FreeSync decrease, and it may
instead be better to use a
Blur Reduction feature
if it is provided. On FreeSync screens this is not an integrated feature
however, so would need to be provided separately by the display manufacturer.
If you appreciate the article and enjoy reading our work, we would welcome a
donation to the site to help us continue to make quality and detailed reviews
and articles for you.