Open your TV’s game mode, click through a gaming monitor’s OSD, or scroll through your console’s display settings — at some point you’ll see the toggle: Variable Refresh Rate (VRR). You enable it, something improves, but the exact mechanism stays a mystery.
VRR solves two specific problems: screen tearing and judder. Tearing is that horizontal split-screen glitch where the top half of your game shows one frame and the bottom half shows the next — because your GPU finished rendering at a moment your display wasn’t ready. Judder is the reverse: your display refreshes on schedule but the GPU isn’t done, so the previous frame stays on screen an extra beat, creating a subtle stutter.
VRR fixes both by making your display follow your GPU rather than its own internal clock. This guide covers the exact mechanism, maps out which VRR standard applies to your setup, and identifies the one scenario where turning it off is the right call.
How VRR Actually Works
A standard display refreshes at a fixed rate — 60Hz, 144Hz, 240Hz — on an internal clock that ticks whether your GPU is ready or not. If a new frame arrives between ticks, you get tearing. If the clock ticks and no new frame has arrived, the old frame is held for another cycle, producing judder.
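To make the timing concrete, here is a toy Python sketch of that fixed clock. The frame times are invented and the model is deliberately crude (V-Sync off, one scanout per tick), but it surfaces both failure modes:

```python
# Deliberately simplified model of a fixed 60Hz display with V-Sync off.
# Each refresh "tick" scans out whatever is in the frame buffer: a frame
# that finishes between ticks swaps the buffer mid-scanout (tearing), and
# a tick with no new frame re-shows the old one (judder).
# All frame times below are invented for illustration.

TICK_MS = 1000 / 60  # one refresh every ~16.7 ms
frame_done_ms = [3.0, 21.0, 24.0, 58.0, 76.0]  # hypothetical GPU frame completions

for i in range(1, 6):
    start, end = (i - 1) * TICK_MS, i * TICK_MS
    arrivals = sum(start < t <= end for t in frame_done_ms)
    if arrivals == 0:
        print(f"refresh at {end:5.1f} ms: no new frame, old one held (judder)")
    else:
        print(f"refresh at {end:5.1f} ms: {arrivals} frame(s) landed mid-scan (tearing)")
```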
VRR replaces that fixed clock with a frame-timing handshake. Your GPU sends a signal with every completed frame. Instead of refreshing on its own schedule, the display waits for that signal — the moment a frame arrives, the screen refreshes. Whether the next frame comes 4ms or 14ms later, the display adapts.
The result: your display’s refresh rate tracks the GPU’s output exactly, frame by frame.
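Here is the same toy model rewired for VRR, again with invented frame times and a hypothetical 40–144Hz panel window, showing the display pacing itself off the GPU:

```python
# The same toy model under VRR: the display refreshes the moment a frame
# arrives, provided the gap since the last refresh fits the panel's
# supported window (a hypothetical 40-144Hz panel, so roughly 6.9-25 ms
# between refreshes). Frame times are invented; frames slower than the
# 40Hz floor hand over to LFC, covered next.

MIN_INTERVAL_MS = 1000 / 144  # can't refresh faster than the panel's maximum
frame_done_ms = [3.0, 15.0, 18.0, 33.0, 50.0]

last_refresh = 0.0
for t in frame_done_ms:
    if t - last_refresh < MIN_INTERVAL_MS:
        t = last_refresh + MIN_INTERVAL_MS  # frame came too soon: wait out the minimum
    hz = 1000 / (t - last_refresh)
    print(f"frame ready at {t:5.1f} ms -> panel refreshes, running at {hz:5.1f}Hz")
    last_refresh = t
```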
Every VRR display operates within a defined range — typically 40–144Hz on a 144Hz monitor, or 48–240Hz on a 240Hz panel. Inside that range, synchronisation is seamless. If your frame rate drops below the floor (say, 35 FPS on a monitor with a 40Hz minimum), Low Framerate Compensation (LFC) activates: the display shows each frame twice, effectively doubling to 70Hz, keeping the VRR sync active rather than dropping out entirely.
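A minimal sketch of the LFC arithmetic, assuming that same hypothetical 40–144Hz window (real drivers handle the switchover with more nuance):

```python
# Low Framerate Compensation in miniature: below the panel's VRR floor,
# each frame is shown an integer number of times so the effective refresh
# rate lands back inside the supported window. Window values are examples;
# assumes fps > 0.

def lfc_refresh(fps: float, vrr_min: float = 40.0, vrr_max: float = 144.0) -> float:
    """Return the refresh rate the display actually runs at."""
    if fps >= vrr_min:
        return min(fps, vrr_max)       # in range: track the GPU 1:1
    multiplier = 2
    while fps * multiplier < vrr_min:  # repeat frames until back in range
        multiplier += 1
    return fps * multiplier

print(lfc_refresh(35))  # 70.0 -> each frame shown twice
print(lfc_refresh(18))  # 54.0 -> each frame shown three times
print(lfc_refresh(90))  # 90.0 -> normal VRR, no compensation
```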
One important point: VRR doesn’t make your GPU faster. It makes your display wait for whatever the GPU produces. A demanding game still needs a powerful GPU — VRR eliminates the visual artefacts caused by mismatched timing, not the underlying performance gap.

The Three VRR Standards — Which One Do You Have?
VRR isn’t one technology — it’s three overlapping standards sharing the same core idea: display follows GPU. (The table below has four rows because “G-Sync Compatible” is NVIDIA’s label for FreeSync/Adaptive-Sync monitors it has validated, not a separate standard.) Knowing which one your monitor or TV supports tells you exactly what hardware and cables you need.
| Standard | Created by | Works with | Connection needed |
|---|---|---|---|
| G-Sync | NVIDIA | NVIDIA GPUs only | DisplayPort |
| G-Sync Compatible | NVIDIA | NVIDIA GPUs + most FreeSync monitors | DisplayPort or HDMI |
| FreeSync / Adaptive-Sync | AMD / VESA | AMD GPUs + NVIDIA (since 2019) | DisplayPort or HDMI |
| HDMI 2.1 VRR | HDMI Forum | PS5, Xbox Series X\|S, modern PCs | HDMI 2.1 only |
The biggest practical shift for PC gamers came in 2019: NVIDIA adopted VESA Adaptive-Sync — the open standard behind FreeSync — under the “G-Sync Compatible” label. Most FreeSync monitors now work with NVIDIA cards for adaptive sync without needing the dedicated hardware module inside certified G-Sync monitors. As of 2026, your GPU brand shouldn’t restrict your monitor choice.
Console gamers use HDMI 2.1 VRR, the HDMI Forum’s implementation. Both PS5 and Xbox Series X|S support it, but with a key practical difference: Xbox Series X outputs at up to 120Hz system-wide with VRR active, covering any game regardless of developer support. The PS5 leans on per-game developer implementation: if a game’s code doesn’t support VRR, your only fallback is the console’s option to apply VRR to unsupported games, which works inconsistently from title to title.
Should You Enable VRR? A Player-Type Guide
VRR’s benefit varies significantly by how you play. Here’s the decision broken down by player type.
| Player Type | Recommendation | Reason |
|---|---|---|
| Casual / story gamer | Always on | Frame rates fluctuate constantly in story games — VRR eliminates tearing during every dip |
| Open-world / RPG player | Always on | Dense scenes cause the largest FPS swings; VRR’s benefit is most visible here |
| Competitive / esports player | Test both | If you consistently hit 200+ FPS on a 144Hz monitor, VRR overhead may matter — see section below |
| Console gamer (Xbox Series X) | Enable in system settings | Xbox outputs 120Hz system-wide with VRR — enable once in console settings, applies to all games |
| Console gamer (PS5) | Enable per game | PS5 requires per-game developer support — check if your specific title lists VRR in its features |
Verified April 2026. Console VRR behaviour may change with system software updates — check your console’s display settings for the latest options.
When to Turn VRR Off
For most players, VRR should stay permanently enabled. There’s one genuine exception: competitive gaming at consistently very high frame rates.
The logic: if your PC runs Counter-Strike 2 at 300 FPS on a 144Hz monitor, your GPU is already delivering frames faster than the display can refresh — VRR isn’t synchronising anything meaningful because you’re well above the monitor’s ceiling. In that situation, the VRR handshake protocol adds a fractional processing overhead for zero visual benefit. At the elite competitive level, even fractions of a millisecond matter.
The practical threshold: if you consistently run 50% or more above your monitor’s refresh ceiling (220+ FPS on 144Hz, 360+ FPS on 240Hz), disabling VRR and running V-Sync off gives you the lowest possible input lag. Below that threshold — or if frame rates fluctuate at all — VRR delivers a net advantage every time.
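If you want to sanity-check your own numbers, the heuristic reduces to a single comparison. The 1.5x factor is this guide's rule of thumb, not a vendor specification:

```python
# Rule-of-thumb from above, as a check you can run on your own numbers.
# If your frame rate fluctuates, use your typical low rather than the average.

def vrr_recommended(avg_fps: float, refresh_hz: float) -> bool:
    """False only when you sit far enough above the panel's ceiling
    that VRR has nothing left to synchronise."""
    return avg_fps < 1.5 * refresh_hz

print(vrr_recommended(300, 144))  # False -> consider VRR off, V-Sync off
print(vrr_recommended(120, 144))  # True  -> keep VRR on
```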
For a full breakdown of every display and GPU setting that affects gaming performance, the PC optimisation and game settings hub covers FPS caps, resolution, and sync settings in one place.
Frequently Asked Questions
Does VRR reduce input lag?
Generally yes — and here’s the mechanism. Preventing screen tearing on a fixed-refresh display requires V-Sync, which forces the GPU to delay each frame until the display’s next clock tick. That wait adds input lag. VRR eliminates it: frames go to the display the instant they’re rendered. The result is lower average input lag than V-Sync and smoother output than V-Sync off. The exception is the competitive scenario above, where your frame rate stays well above the monitor’s ceiling and VRR’s overhead can exceed its benefit.
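As a rough illustration of that wait on a 60Hz display (frame times invented; real latency also depends on scanout time and the game's render queue):

```python
# Back-of-envelope version of the V-Sync penalty described above.
# With V-Sync on a 60Hz display, a finished frame sits in the buffer
# until the next fixed tick; under VRR it is scanned out immediately.

TICK_MS = 1000 / 60

def vsync_wait(frame_done: float) -> float:
    """ms the frame waits in the buffer before the next fixed tick."""
    next_tick = (frame_done // TICK_MS + 1) * TICK_MS
    return next_tick - frame_done

for t in [3.0, 21.0, 58.0]:
    print(f"frame at {t:4.1f} ms: V-Sync adds {vsync_wait(t):4.1f} ms, VRR adds ~0 ms")
```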
Can I use FreeSync with an NVIDIA GPU?
Yes. Since 2019, most FreeSync monitors work with NVIDIA GPUs via G-Sync Compatible mode. Enable it in NVIDIA Control Panel under “Set up G-SYNC.” You need an NVIDIA GPU from the GTX 10 series or newer and a DisplayPort or HDMI connection — no G-Sync certified monitor required. The Game Settings Explained guide covers which NVIDIA Control Panel options to configure alongside VRR.
What’s the difference between VRR and V-Sync?
V-Sync locks frame output to the display’s fixed refresh cycle — tear-free, but with input lag and judder when the GPU drops below the target frame rate. VRR reverses the relationship: the display adapts to the GPU’s output rather than forcing the GPU to wait for the display. The result is tear-free visuals without V-Sync’s input lag penalty.
Sources
- HDMI Forum. “Variable Refresh Rate (VRR).” hdmi.org.
- Wikipedia contributors. “Variable refresh rate.” en.wikipedia.org.
- ViewSonic. “Why VRR Matters: A Gamer’s Guide to Variable Refresh Rate Monitors.” viewsonic.com.
- BenQ ZOWIE. “Variable Refresh Rate (VRR) & how does it work?” zowie.benq.com.
I've been playing video games for over 20 years, spanning everything from early PC titles to modern open-world games. I started Switchblade Gaming to publish the kind of accurate, well-researched guides I always wanted to find — built on primary sources, tested in-game, and kept up to date after patches. I currently focus on Minecraft and Pokémon GO.
