How adaptive sync works (and why V-Sync is the wrong fix)
Screen tearing happens when your GPU finishes rendering a frame mid-refresh. Your monitor ends up drawing part of one frame above a visible horizontal tear line and part of another frame below it. It looks awful in fast-moving games, especially in competitive shooters where you’re panning quickly.
The classic fix is V-Sync, which forces your GPU to wait until the monitor finishes a full refresh cycle before sending the next frame. No more tearing — but you pay for it with input lag. At 60 Hz, V-Sync can add up to a full refresh interval of delay, about 16.7 ms, and your character visibly lags behind your inputs. For competitive gaming, that’s worse than the tearing.
Adaptive sync solves this at the hardware level: instead of your GPU waiting for the monitor, the monitor waits for your GPU. When a frame is ready, the monitor refreshes — whether that’s at 73 Hz, 112 Hz, or 138 Hz. No tearing, no added lag. That’s the idea behind G-Sync, FreeSync, and HDMI VRR. They’re three implementations of the same underlying concept, and the differences between them determine whether a $200 price premium is worth paying.
Verified April 2026 — monitor specifications and pricing reflect current market conditions. G-Sync hardware availability will shift as MediaTek-integrated displays reach market.
If you’re optimising your full PC setup beyond just the display, our complete PC optimization guide covers every setting that affects real-world frame rates.
G-Sync, FreeSync, and HDMI VRR: the same idea, different implementations
G-Sync is NVIDIA’s proprietary implementation. Traditionally, it required a dedicated hardware module inside the monitor — a custom NVIDIA chip that managed the variable refresh rate signal independently of the display’s scaler. That module is what made G-Sync monitors expensive. It also gave G-Sync specific advantages: variable overdrive (more on this below), a VRR floor down to approximately 1 Hz, and tightly tuned voltage regulation across the entire refresh range [1].
In August 2024, NVIDIA partnered with MediaTek to integrate G-Sync functionality directly into a combined scaler chip, eliminating the need for a separate module. That should substantially narrow the price gap for new G-Sync monitors, though monitors using the old module design are still common on the market [3].
FreeSync is AMD’s implementation, built on VESA’s open Adaptive-Sync standard. Because there’s no proprietary hardware required, FreeSync costs monitor manufacturers nothing to implement — which means it costs you nothing extra. It comes in three tiers:
- FreeSync: Basic variable refresh rate, no low framerate compensation required
- FreeSync Premium: Minimum 120 Hz at 1080p, Low Framerate Compensation (LFC) mandatory
- FreeSync Premium Pro: All Premium requirements plus HDR support (90% DCI-P3 minimum, formerly called FreeSync 2 HDR) [4]
HDMI 2.1 VRR is a third standard baked into the HDMI 2.1 specification itself — not a GPU brand’s technology, but part of the cable and port standard. It’s what PS5 and Xbox Series X use for adaptive sync. Xbox also natively supports FreeSync, while PS5 uses HDMI 2.1 VRR only [7].
G-Sync Compatible is NVIDIA’s bridge: since 2019, NVIDIA GPUs (GTX 10-series and newer) can use FreeSync monitors via Adaptive-Sync. NVIDIA certifies monitors that pass their validation testing as “G-Sync Compatible.” Non-certified FreeSync monitors may still work with NVIDIA GPUs, but NVIDIA doesn’t guarantee the experience [1].
FreeSync tiers: the badge matters less than the VRR floor
The FreeSync tier printed on a monitor box tells you less than the VRR range in the spec sheet. Here’s why that matters.
Low Framerate Compensation (LFC) is the feature that keeps adaptive sync working when your FPS drops below the monitor’s minimum VRR floor. If your game drops to 25 fps on a monitor with a 30 Hz floor, LFC repeats each frame, multiplying the effective refresh rate back into range — so 25 fps is displayed at 50 Hz — and sync is maintained. FreeSync Premium and Premium Pro both require LFC; basic FreeSync does not [5].
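The multiplier logic behind LFC is easy to sketch. Here’s a minimal illustration — the function name and loop are hypothetical, real scaler firmware varies by vendor, and this ignores edge cases near the VRR ceiling:

```python
def lfc_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Refresh rate the panel runs at once LFC kicks in (illustrative only)."""
    if fps >= vrr_min:
        # Inside the native VRR window: the panel simply refreshes at your fps.
        return min(fps, vrr_max)
    multiplier = 2
    while fps * multiplier < vrr_min:
        # Repeat each frame one more time until we're back inside the window.
        multiplier += 1
    return fps * multiplier

print(lfc_refresh(25, 30, 144))  # 50 — each frame shown twice
print(lfc_refresh(12, 48, 144))  # 48 — each frame shown four times
```

The same sketch shows why a low VRR floor matters: the higher the floor, the earlier this repetition has to kick in.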
The issue is that the minimum VRR floor varies between monitors at every tier. A monitor labeled “FreeSync Premium” might have a 48 Hz floor. A monitor labeled plain “FreeSync” might have a 1 Hz floor. If you routinely game at 60–80 fps, the monitor with the 48 Hz floor gives you a narrower useful VRR window: every dip below 48 fps forces LFC to kick in. The 1 Hz floor model keeps native sync active across virtually your entire performance range.
Always check the actual VRR range in the monitor specifications, not just the tier badge. The practical number is “minimum VRR Hz to maximum refresh Hz.” A 1–144 Hz range is better for variable-fps gaming than 48–144 Hz regardless of which badge is printed on the box.
Variable overdrive: the technical gap that actually matters
This is the one difference between G-Sync and FreeSync that has real, visible consequences — and it’s the one most articles skip explaining.
Overdrive is the voltage boost monitors use to speed up pixel transitions. Too little overdrive and you get ghosting (slow pixels trailing behind fast movement). Too much and you get inverse ghosting (bright coronas ahead of objects). Monitors set overdrive for a fixed refresh rate at the factory.
With variable refresh rate active, your monitor’s refresh rate changes constantly — frame to frame. At 144 Hz, those pixels need to transition in about 7ms. At 60 Hz, they have 16ms. The same overdrive setting that works at 144 Hz is too aggressive at 60 Hz, producing inverse ghosting. The setting tuned for 60 Hz produces regular ghosting at 144 Hz.
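The per-refresh transition budget is just the reciprocal of the refresh rate. A quick sketch of the arithmetic behind those numbers:

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Time a pixel has to finish its transition at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.2f} ms per refresh")
# 60 Hz gives 16.67 ms; 144 Hz gives 6.94 ms. The same fixed overdrive
# voltage cannot be right for both ends of that range.
```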
G-Sync solves this in hardware: the module adjusts overdrive dynamically to match the current refresh rate. At 60 fps, it applies gentle overdrive. At 144 fps, it cranks it up. The result is clean pixel transitions across the entire VRR range [1].
FreeSync monitors use software overdrive that adjusts in steps — or doesn’t adjust at all on budget models. In practice, the gap shows up most at mixed frame rates: if your game swings between 60 and 120 fps, you may see occasional ghosting on FreeSync monitors. Premium FreeSync monitors with better overdrive firmware handle this well. Budget FreeSync models do not. G-Sync Compatible certified monitors fall somewhere in between, with NVIDIA validating that the experience is acceptable before granting certification.
Open-world titles like Cyberpunk 2077 — which routinely swing between 50 and 100 fps in a single session — reveal this gap clearly: on a G-Sync monitor, overdrive adjusts and transitions stay clean; on a budget FreeSync model with fixed overdrive, faint coronas appear around fast-moving characters whenever fps drops sharply.
2026 cost reality: is G-Sync hardware still worth it?
The traditional argument against G-Sync was simple: you pay $100–300 extra for a hardware module that most users can’t distinguish from a good G-Sync Compatible monitor. That argument is mostly still true in early 2026.
Current premiums for monitors still using the dedicated G-Sync module: the Dell AW2524H (500 Hz G-Sync) runs over $300 more than comparable FreeSync 500 Hz alternatives. 1440p 240 Hz G-Sync models carry a $200 premium over FreeSync equivalents [2]. For most buyers, paying that premium to get marginally better overdrive behavior in mixed-framerate scenarios is hard to justify.
The MediaTek integrated scaler (announced August 2024) is changing this going forward. Monitors using the new combined chip will carry G-Sync functionality without the separate module’s cost premium [3]. Watch for 2025 and 2026 monitor releases — if the price gap between G-Sync and FreeSync equivalents narrows to under $50, the value calculation shifts.
When G-Sync hardware is still worth it today: high-end 1440p 360 Hz IPS monitors with backlight strobing, where G-Sync’s precise overdrive control and backlight strobe integration produce visibly better motion clarity than FreeSync alternatives [2]. That’s a narrow use case.
The GPU-side picture is also changing. DLSS 4 and FSR 4 frame generation both interact with adaptive sync — if you’re upscaling, the frame pacing behavior affects how smoothly your VRR operates. Our DLSS vs FSR vs XeSS 2026 comparison covers how each upscaler affects perceived smoothness alongside adaptive sync.
OLEDs and VRR flicker: the warning no one puts in these articles
If you’re considering an OLED gaming monitor with any form of adaptive sync enabled, read this first.
OLED panels modulate brightness differently from LCD panels: their gamma and luminance depend on the refresh rate, so every VRR-driven refresh change nudges the panel’s brightness slightly. In a well-optimized game with stable frame times, this is imperceptible. In a poorly optimized game with frame time spikes (common in open-world titles and early access games), those brightness changes become visible flicker. Some users find it barely noticeable. Others find it genuinely nauseating.
This affects both G-Sync Compatible and FreeSync modes on OLEDs [6]. The one OLED ever manufactured with a full G-Sync hardware module — the Alienware AW3423DW QD-OLED — handled this better due to the module’s tighter voltage regulation, but it’s discontinued. Current OLED monitors do not have this advantage.
The practical workaround: cap your framerate with RTSS or an in-game limiter to within 5–10 fps of your target (e.g., cap at 138 if your monitor is 144 Hz). Stable frame times reduce VRR fluctuations. You can also narrow the VRR range in Custom Resolution Utility (CRU) — for example, setting a 90–144 Hz floor so LFC triggers less aggressively. Neither fix eliminates flicker entirely in bad frame-time scenarios, but both reduce it significantly [6].
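If you script your setup, the cap rule of thumb is trivial to encode. A hypothetical helper (the 6 fps default margin is an illustrative middle of the 5–10 fps guideline, not an official value):

```python
def suggested_fps_cap(max_refresh_hz: int, margin: int = 6) -> int:
    """Cap a few fps under max refresh so frame times stay inside the VRR window."""
    return max_refresh_hz - margin

print(suggested_fps_cap(144))  # 138, matching the example in the text
print(suggested_fps_cap(240))  # 234
```

Feed the result into RTSS or your in-game limiter; the exact margin matters less than keeping frame times consistently below the refresh interval.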
Console gaming: why HDMI 2.1 VRR is what actually matters
The G-Sync vs FreeSync debate is almost irrelevant for console gamers. Both PS5 and Xbox Series X implement variable refresh rate through HDMI 2.1 VRR — a standard defined in the HDMI spec itself, independent of GPU brand technology [7].
Xbox goes further and also supports AMD FreeSync natively over HDMI, giving Xbox owners VRR compatibility with a wider range of monitors. PS5 sticks to HDMI 2.1 VRR only — it has an AMD GPU, but Sony chose not to license FreeSync branding.
The catch for dual-use PC and console setups: G-Sync Compatible typically requires DisplayPort (FreeSync also runs over HDMI on AMD cards), while console VRR requires HDMI 2.1. If you want both — a single monitor for your PC (NVIDIA GPU, G-Sync Compatible) and your PS5 (HDMI VRR) — you need a monitor with both a DisplayPort input (for the PC) and an HDMI 2.1 input (for the console). Most modern 1440p and 4K gaming monitors have both. Verify before buying if this is your setup.
Which adaptive sync should you use?
| Your situation | Recommendation | Why |
|---|---|---|
| NVIDIA GPU, any monitor budget | Any certified G-Sync Compatible FreeSync monitor | Identical experience to full G-Sync for 95% of users, no price premium |
| AMD GPU | FreeSync Premium or Premium Pro | Native support, LFC required for consistent low-fps behavior |
| Competitive shooter player (160+ fps) | FreeSync or G-Sync Compatible — any tier | You sit near the top of the VRR range, where variable overdrive differences are invisible |
| Mixed-use player (60–120 fps games) | G-Sync Compatible certified or Premium FreeSync with good overdrive | Variable frame rate swings expose fixed-overdrive weaknesses |
| OLED monitor buyer | FreeSync Premium + FPS limiter enabled | Reduces VRR flicker from frame time spikes; no full G-Sync module available new |
| Console only (PS5 or Xbox) | HDMI 2.1 VRR support — ignore G-Sync/FreeSync branding | Consoles use HDMI VRR; brand names are irrelevant |
| PC + console dual setup | FreeSync monitor with DisplayPort + HDMI 2.1 | DisplayPort for PC G-Sync Compatible, HDMI 2.1 for console VRR |
| Budget buyer | Any FreeSync monitor — skip tier badge, check VRR floor | Free to implement, widely available; VRR floor (not tier) is the spec that matters |
For a deeper look at all the settings that affect your gaming performance — not just adaptive sync — our game settings explained guide covers resolution, refresh rate, frame caps, and how they interact.
How to enable adaptive sync on your GPU and monitor
G-Sync or G-Sync Compatible (NVIDIA):
- Open NVIDIA Control Panel → Display → Set up G-SYNC
- Check “Enable G-SYNC/G-SYNC Compatible”
- Choose Full screen mode or Windowed and full screen mode
- Apply and confirm on your monitor’s display [8]
If the option doesn’t appear, go to Manage 3D Settings → Global → Monitor Technology → set to “G-SYNC Compatible.” Also enable VRR in your monitor’s on-screen display (OSD) — this is often called “Adaptive Sync,” “FreeSync,” or “VRR” in the monitor menu.
FreeSync (AMD):
- Enable VRR/FreeSync in your monitor’s OSD first
- Open AMD Radeon Software → Display → toggle AMD FreeSync on
V-Sync interaction: With G-Sync or FreeSync active, disable in-game V-Sync. V-Sync fights adaptive sync for frame timing control and adds latency. The exception: NVIDIA recommends enabling V-Sync in the NVIDIA Control Panel (not in-game) as a frame cap when your fps exceeds your monitor’s max refresh rate — this prevents tearing above the VRR range without adding significant lag.
Console VRR (PS5 / Xbox): On PS5, enable VRR in Settings → Screen and Video → Video Output → VRR. On Xbox, go to Settings → General → TV and display options → Video modes → toggle Variable refresh rate.
FAQ
Can I use G-Sync with an AMD GPU? No. Both full G-Sync and G-Sync Compatible mode require an NVIDIA GPU (GTX 10-series or newer for G-Sync Compatible). AMD GPUs use FreeSync natively.
Does FreeSync work with NVIDIA graphics cards? Yes — through G-Sync Compatible mode. Enable it in NVIDIA Control Panel as described above. NVIDIA-certified G-Sync Compatible monitors are guaranteed to work; non-certified ones usually do too but aren’t officially supported.
Is a higher FreeSync tier always better? Not automatically. FreeSync Premium and Premium Pro are better specifications on paper, but the actual VRR floor (minimum Hz) printed in the spec sheet matters more than the tier badge for gaming smoothness. A Standard FreeSync monitor with a 1 Hz floor outperforms a Premium monitor with a 48 Hz floor for most gaming scenarios.
Should I enable G-Sync or FreeSync for competitive gaming? Yes, with V-Sync disabled. Adaptive sync eliminates tearing without adding latency. If you’re consistently hitting fps above your monitor’s max refresh rate, cap your framerate to your max refresh rate using RTSS or an in-game cap — this is better than letting V-Sync clamp your fps.
Does adaptive sync affect input lag? Adaptive sync itself adds negligible input lag — often under 1ms. This is far less than V-Sync (up to 16ms at 60 Hz). The game’s own latency pipeline matters far more. Our PC optimization guide covers everything that genuinely reduces input lag end to end.
Sources
[1] Nvidia G-SYNC VRR Technology — PCMonitors.info
[2] Is G-SYNC Worth It? — DisplayNinja
[3] Nvidia’s new partnership with MediaTek has just killed the module which made G-Sync monitors so damned expensive — PC Gamer
[4] AMD FreeSync, FreeSync Premium, and FreeSync Premium Pro: What’s the Difference? — HowToGeek
[5] FreeSync vs FreeSync Premium vs FreeSync Premium Pro — GPUMag
[6] What Is VRR Brightness Flickering And Can You Fix It? — DisplayNinja
[7] What Is HDMI VRR on the PlayStation 5 and Xbox Series X? — HowToGeek
[8] Nvidia G-SYNC Setup Documentation — NVIDIA Official (nvidia.com/content/Control-Panel-Help)
