G-Sync is one of two standards for variable refresh rate (adaptive sync), the other being FreeSync (and now FreeSync 2). G-Sync is Nvidia's, while FreeSync is AMD's. More monitors support FreeSync because it's an open, royalty-free standard, while G-Sync requires certification and licensing from Nvidia. That makes G-Sync monitors a little more expensive, partly because of the additional built-in hardware the monitor needs to handle it.
Adaptive refresh is better for gaming because it reduces tearing caused by a mismatch between the game's frame rate and the display's refresh rate - there are better explanations than I can give as to why - but a lot of the benefit of newer monitors also comes from the higher refresh rate itself, which allows for more fluid, smoother motion in games and high-frame-rate video, as well as in the UI/UX (just using Windows feels a little smoother).
100Hz, 120Hz, 144Hz (the most common high refresh rate now), and 240Hz all offer a visibly smoother experience than the common 60Hz LCD panels, though it may not be obvious until you actually compare monitors side by side, or try going back to a 60Hz monitor after using a high-refresh one.
The monitor I am using is the Alienware AW3418DW. It's natively 100Hz but overclocks (just a toggle in the monitor's settings) to 120Hz.
Choosing between FreeSync and G-Sync is a whole can of worms, though FreeSync 2 is easier to recommend now: AMD wasn't as strict with FreeSync 1, which led to monitors of widely varying capabilities carrying the "FreeSync" badge. Whichever you choose, you'll have to match your monitor to your graphics card (though there are now some FreeSync monitors that Nvidia cards will work with, as Nvidia has loosened up a bit).
I went with a G-Sync monitor because it paired well with the RTX 2070 Mini in my PC, but I could have just as easily gone AMD for the GPU if adaptive sync were the only consideration.
I don’t have specific monitor recommendations, but can help you narrow it down. Mostly it comes down to use case (whether you want it primarily for gaming, professional graphics use, movies/TV viewing, or mixed use), what size you want, what GPU you plan on using with it, etc. Color accuracy, HDR support, resolution, refresh rate range, viewing angles, response time, and more vary between monitors.
I chose mine based on size and it being fairly color accurate while also having 1440p resolution and a high refresh rate, but it doesn't support HDR or 4K. Try to get everything (4K, high refresh, HDR, larger than 27") and you end up in the $1,500+ monitor market.