G Sync vs. FreeSync – Improve Your Gaming Experience


What's the difference between G Sync and FreeSync? We'll give you a clue: for once, it's not about the best gaming monitor resolutions. Gaming monitor manufacturers have recently started to trend away from chasing the next incremental improvement in overall resolution and are instead focusing on better refresh rates.

It’s easy to see why, too.

Not all that long ago, when people were used to seeing TV and movies in standard definition, the leap to HD content – 720p, 1080p, and then later 4K – was almost unbelievable. Here's a good YouTube video explaining SD vs HD.

SD vs HD (from NVIDIA demo video)

Today, however, incremental improvements to overall resolutions aren’t producing those same jaw-dropping differences. A 4K monitor is pretty lifelike, and the leap to 8K isn’t all that remarkable. 

The difference in going from 60 Hz refresh rates to 240 Hz refresh rates, however, is a big deal.
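
To put rough numbers on that jump, here's a quick back-of-the-envelope frame-time calculation (a throwaway Python snippet, purely illustrative):

```python
# How long each refresh cycle lasts at common monitor refresh rates.
for hz in (60, 120, 144, 240):
    frame_time_ms = 1000 / hz  # milliseconds the panel spends on each refresh
    print(f"{hz} Hz -> {frame_time_ms:.2f} ms per frame")

# 60 Hz  -> 16.67 ms per frame
# 240 Hz -> 4.17 ms per frame
```

At 240 Hz a new frame can land roughly every 4 ms instead of every 17 ms, which is exactly the kind of difference you can feel in fast motion.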

It’s a real needle mover, especially for PC gamers who are serious about their experience. These gamers are willing and able to spend thousands of dollars on a gaming PC, and they don’t want it handicapped by a low-performance gaming monitor.

Unfortunately, the frames a PC is capable of putting out and the frames a gaming monitor can actually display very rarely sync up.

This results in screen tearing. And nobody likes that!

Thankfully, though, both AMD and Nvidia (the two “big dogs” in the GPU world) have come up with some pretty innovative solutions to these problems in the form of their monitor synchronization technology.

Nvidia has G Sync technology while AMD has FreeSync technology, and below we dig a little bit deeper into the ins and outs of both of these solutions (as well as give our recommendations for which one you should consider moving forward).

Let’s get right into it!

Quick Overview of Both Sync Technologies

Both G Sync and FreeSync are forms of “adaptive synchronization technology”, a major leap forward in the PC gaming world these days.

For those who are unaware, adaptive synchronization technology (AST) basically means that your monitor’s refresh rate is synchronized with the frame rate your GPU is pumping out – even if the GPU wouldn’t break a sweat doubling that frame rate.
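
A loose way to picture what that means in practice – this is just a toy model, with a made-up 48–144 Hz panel range, not how any driver actually implements it – is that a fixed-refresh panel ticks on its own clock no matter when frames arrive, while an adaptive-sync panel stretches each refresh interval to match the frame, so long as the frame time stays inside the panel's supported range:

```python
# Toy model of adaptive sync, assuming a hypothetical 48-144 Hz panel.
PANEL_MIN_HZ, PANEL_MAX_HZ = 48, 144
MIN_INTERVAL_MS = 1000 / PANEL_MAX_HZ   # ~6.9 ms: the fastest the panel can refresh
MAX_INTERVAL_MS = 1000 / PANEL_MIN_HZ   # ~20.8 ms: the longest it can wait before refreshing anyway

def next_refresh_interval(gpu_frame_time_ms, adaptive_sync):
    """Return how long the panel waits before its next scan-out."""
    if not adaptive_sync:
        # Fixed refresh: the panel ticks at its set rate regardless of when the
        # frame arrives, so a frame landing mid-scan shows up as a tear (or stutter).
        return MIN_INTERVAL_MS
    # Adaptive sync: the refresh interval stretches to match the frame time,
    # clamped to the range the panel actually supports.
    return min(max(gpu_frame_time_ms, MIN_INTERVAL_MS), MAX_INTERVAL_MS)

for ft in (7.0, 11.5, 16.7, 25.0):  # GPU frame times in ms (~143, ~87, ~60 and 40 fps)
    print(f"frame took {ft:4.1f} ms -> panel refreshes after "
          f"{next_refresh_interval(ft, adaptive_sync=True):4.1f} ms")
```

Notice how the 25 ms frame (40 fps) gets clamped: once frame times fall outside the panel's range, the sync tech needs extra tricks to keep things smooth.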

At first it might sound a little bit crazy to put a muzzle on your graphics card.

After all, who wants to spend a pile of money on a GPU only to strap a limiter on it, capping the frames it’s allowed to render?

At the same time, though, there aren’t too terribly many people that enjoy screen tearing when they are playing games (particularly competitive games online, like those in the FPS world).

Video - How To Prevent Screen Tearing

Having AST solutions like G Sync and FreeSync that can match your GPU frame output to your monitor refresh rate guarantees smooth sailing no matter what.

Your whole experience as a gamer improves significantly with this kind of technology running under the hood – though choosing which of these two options to move forward with may not be quite as obvious as you might have assumed it to be.

G Sync Feature Breakdown

 G Sync from Nvidia is a proprietary form of AST, which means you’re going to need to be running Nvidia graphics cards in your computer to take advantage of this technology. 

The technology itself was first introduced all the way back in 2013, making it some of the most established AST on the market – and meaning that a lot of the bugs, kinks, and issues that plague newer setups have been ironed out completely.

On top of that, Nvidia has been able to iterate on G Sync a little more than AMD has with FreeSync, thanks to this head start.

G Sync includes proprietary blur reduction technology, guarantees zero frame tearing whatsoever even at sub-30 Hz refresh rates, and performs flawlessly when paired with any Nvidia GPU built after 2013.
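
The way sub-30 fps output stays tear-free is usually frame repetition: when the GPU's frame rate drops below the panel's minimum refresh, the same frame is scanned out more than once so the effective refresh rate stays inside the supported range. A rough sketch of the idea (the 30–144 Hz range here is just an assumed example):

```python
# Toy illustration of frame repetition, assuming a hypothetical 30-144 Hz panel.
PANEL_MIN_HZ, PANEL_MAX_HZ = 30, 144

def refreshes_per_frame(fps):
    """How many times each frame is repeated so the panel stays in its refresh range."""
    repeats = 1
    while fps * repeats < PANEL_MIN_HZ:
        repeats += 1  # show the same frame again until the effective rate is high enough
    return repeats, min(fps * repeats, PANEL_MAX_HZ)

for fps in (20, 25, 45, 90):
    repeats, effective_hz = refreshes_per_frame(fps)
    print(f"{fps} fps -> each frame shown {repeats}x, panel effectively runs at {effective_hz} Hz")
```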

At the same time, and like every other product from Nvidia, you’re going to pay more to take advantage of monitors that include G Sync technology built right in. That’s just the nature of the beast with this brand (though the price gap has narrowed from north of $200 to just slightly more than $100 in the last few years).

FreeSync Feature Breakdown

FreeSync, on the other hand, is built on the back of open-source technology created in a partnership between VESA (the organization responsible for the DisplayPort standard) and AMD engineers.

Any display that supports the DisplayPort 1.2a Adaptive-Sync protocol can offer FreeSync, with adaptive refresh rates designed to work with AMD cards (and, these days, plenty of Nvidia cards as well).

The beautiful thing here is that, because of the open-source nature of the technology, manufacturers do not have to spend any extra money to include it in their hardware. They don’t have to license anything, they don’t have to invest in extra hardware, and they don’t really have to support this technology on the backend, either.

It’s a bit of a “freebie” for hardware companies to include with their gaming monitors, which is why so many have made the decision not only to support FreeSync but to include it as a backup or secondary AST with monitors that already include G Sync, too.

The open-source nature of FreeSync does create some implementation variance issues across different manufacturers, though.

Manufacturers that make more budget-focused hardware are likely only going to include FreeSync support for 60 Hz refresh rates or so, and may not include any support whatsoever for the FreeSync version of blur reduction.
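
That refresh-range variance matters more than it sounds: the usual rule of thumb is that low-framerate compensation (the frame repetition trick sketched earlier) only works when the panel's maximum refresh is at least roughly double its minimum, so a narrow 48–60 Hz FreeSync range misses out while a 48–144 Hz range qualifies. A quick sanity check along those lines (a rule of thumb, not an official certification test):

```python
def supports_low_framerate_compensation(vrr_min_hz, vrr_max_hz):
    """Rule of thumb: the VRR range needs max >= ~2x min for frame repetition to work."""
    return vrr_max_hz >= 2 * vrr_min_hz

print(supports_low_framerate_compensation(48, 60))    # False - typical budget 60 Hz FreeSync range
print(supports_low_framerate_compensation(48, 144))   # True - the wider range on better panels
```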

At the end of the day, AMD FreeSync can go toe to toe with Nvidia’s G Sync when they are implemented on similar gaming monitors.

The Impact HDR Makes on Both

The introduction of high dynamic range (HDR) into the gaming monitor world has thrown a bit of a monkey wrench into the mix, though.

Nvidia’s G Sync offers support for HDR (as well as extended color capabilities) without displays necessarily having to be G Sync ULTIMATE certified by Nvidia themselves.

Modern gaming displays do need to have full HDR technology built-in, extended color capabilities, and the ability to hit 120 Hz refresh rates at 1080p to take full advantage of FreeSync technology, though.

Those requirements are a little bit more stringent than Nvidia’s (which is a bit of a curveball, considering the history Nvidia and AMD have with their respective approaches to hardware requirements) – and that’s something you’ll want to keep an eye on for sure.

Another thing you need to remember is that while you can run Nvidia graphics cards with FreeSync monitors, you cannot do the same going the other way – AMD GPUs with G Sync enabled gaming monitors – and get the same experience.

You’ll also need to be sure that your Nvidia drivers have been fully updated, and it’s not a bad idea to check whether your FreeSync gaming monitor is officially supported by Nvidia themselves – even if it isn’t technically “certified” by this hardware manufacturer.

These days you also have Nvidia’s Dynamic Super Resolution (DSR), which can enhance the picture on a 1080p (or higher) display by rendering the game at a higher, more detailed resolution like 4K. It does this via supersampling (sometimes called downsampling): the game is rendered at the higher resolution and then shrunk back down to your monitor’s native resolution, effectively letting you get 4K (3840x2160) graphics on a lower-res monitor – although it’s not without its niggles and compatibility issues here and there.
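
Under the hood, the downsampling step is conceptually just averaging blocks of rendered pixels down to the panel's native grid. Here's a minimal NumPy sketch of that idea (purely illustrative – the real driver uses a smarter filter than a plain box average, and the random array simply stands in for a rendered frame):

```python
import numpy as np

# Stand-in for a 4K frame the GPU rendered (2160 x 3840, RGB) destined for a 1080p panel.
rendered_4k = np.random.rand(2160, 3840, 3).astype(np.float32)

def box_downsample(frame, factor=2):
    """Average each factor x factor block of pixels into a single output pixel."""
    h, w, c = frame.shape
    return frame.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

native_1080p = box_downsample(rendered_4k, factor=2)
print(rendered_4k.shape, "->", native_1080p.shape)  # (2160, 3840, 3) -> (1080, 1920, 3)
```

Because each output pixel blends four rendered pixels, edges end up smoother than a native 1080p render would look, which is the visual payoff DSR is chasing.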

In effect, when coupled with G Sync or FreeSync, DSR can give you higher-resolution gaming on a lower-resolution monitor without screen tearing – and that's pretty darn cool.

G Sync vs FreeSync – Which AST Technology Should I Go With?

At the end of the day, picking between these two adaptive synchronization technologies is going to be a uniquely personal decision. Nvidia definitely had the lead on things with G Sync between 2013 and, say, 2016 or even 2017 – but the gap has narrowed significantly.

G Sync monitors are still (slightly) more expensive than FreeSync monitors, though. But that gap is narrowing significantly (and rapidly), too.

The major reason to go with a FreeSync monitor over a G Sync monitor is the open nature of the technology, as well as the ability to use all of the FreeSync capabilities with Nvidia cards so long as their most recent drivers are installed. You can see the debate that's raged on Reddit here, where most people give the edge to G Sync.

You’ll still want to be sure that your FreeSync monitor has a refresh rate higher than 60 Hz (ideally 120 Hz) if you want to make the most of your AMD or Nvidia GPUs.

Budget-focused gaming monitors that hard cap gaming experiences at 60 Hz (and 60 frames per second) are going to seriously put your next-generation PC gaming hardware on a pretty tight leash.

Nobody wants to spend the kind of money that high-end gaming PCs command only to see their investment in hardware artificially neutered because their monitor just couldn’t keep up.

When you get right down to it, the differences today between G Sync and FreeSync technology are almost completely nonexistent from a pure functionality and feature set standpoint.

The price may be a difference-maker for you (though most people are going to be willing to pony up the extra cash for a G Sync enabled monitor if that’s what suits them best), but aside from that it really is kind of a coin toss when it comes to figuring out which of these is right for your situation.

It could all really come down to whether you already have Nvidia gaming hardware or AMD gaming hardware inside of your PC. If you are running an Intel chipset with an Nvidia GPU, the odds are pretty good you’ll see better performance with G Sync.

If, on the other hand, you are running an AMD processor and an AMD GPU, then you’ll probably want to go with FreeSync.

So with G Sync vs FreeSync, after taking these factors into consideration, it’s really up to you!

