A monitor’s refresh rate is an important specification you should pay close attention to if you’re buying a monitor for PC gaming or use with a modern console. This is especially true if you’re a competitive gamer looking for an edge.
What Does “Refresh Rate” Mean?
The term “refresh rate” is used to describe how many times a monitor updates in a single second. This is measured in hertz (Hz), with most regular monitors designed for office use having a refresh rate of 60Hz, though higher refresh rates are becoming more common.
All displays use this metric, whether or not you see it quoted on the box. This includes smartphones and tablets, most of which use 60Hz displays. Manufacturers are quick to point out higher refresh rate models that use 90Hz displays (like Google’s Pixel 5), though some manufacturers like Apple hide this number behind marketing terms like “ProMotion” which is used to describe the iPad Pro’s 120Hz display.
Even televisions now feature higher refresh rates thanks to a push for 120Hz gaming from Microsoft’s Xbox Series consoles and Sony’s PlayStation 5. These gaming machines use the ample bandwidth provided by the HDMI 2.1 standard to run some games at 4K with HDR at 120Hz.
What Qualifies as a “High” Refresh Rate?
A standard desktop monitor, budget smartphone, or entry-level television will have a refresh rate of around 60 to 75Hz. This is fine for most activities, including browsing the web, swiping through social media, or playing games in a non-competitive setting.
Generally speaking, anything at 120Hz or above qualifies as a “high” refresh rate display, since it exceeds the established 60Hz standard. There is no hard definition of “high,” however, and some may interpret it differently.
120Hz gaming has been thrust into the spotlight with the arrival of a new generation of consoles in 2020. Most televisions manufactured around the consoles’ launch still ship with 60Hz panels, but expect to see more models with panels that refresh at 120Hz (and the HDMI 2.1 ports that are necessary for 4K gaming at higher refresh rates).
The next step up for PC gamers is 144Hz monitors. There are many theories as to why 144Hz became the magic number, including marketing, the fact that 144 is a multiple of 24 (24p being the cinematic frame rate), and the bandwidth limitations of the DVI connection. Many 144Hz monitors can be “overclocked” to 165Hz by simply raising the refresh rate in your display settings.
At the high end are 240Hz and 360Hz monitors like the ASUS ROG Swift PG259QN. At this stage, many gamers can’t tell the difference between the two, though the lower latency at the higher end may still be beneficial.
High Frame Rates Require High Refresh Rates
Since a monitor’s refresh rate determines how many times the screen updates each second, it is closely tied to frame rate (measured in frames per second, or fps). If you’re playing a game at 120fps on a 60Hz monitor, your display can only show you half of the frames your GPU is producing.
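The cap described above is simple arithmetic. Here is a minimal sketch (not tied to any real graphics API, with illustrative numbers) of how many unique frames per second you can actually see:

```python
def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    """Frames per second you actually see: the display can never show
    more unique frames than its refresh rate allows."""
    return min(rendered_fps, refresh_hz)

# 120fps on a 60Hz monitor: only half the rendered frames ever reach your eyes.
print(displayed_fps(120, 60))   # 60
# The same 120fps on a 144Hz monitor: every rendered frame is shown.
print(displayed_fps(120, 144))  # 120
```

In practice the dropped frames aren't evenly discarded (a fixed-refresh monitor simply samples whatever frame is current at each refresh), but the ceiling itself holds regardless.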
For high frame rates to be “worth it,” you’ll need a display that can keep up with your GPU, and that means buying a display with a high refresh rate. Conversely, if your computer can’t churn out high frame rates in the games you play, a high refresh rate gaming monitor might not be worth the money.
Many gamers turn down graphical settings including resolution, texture quality, and post-processing effects like antialiasing to get the best frame rate possible. This is especially true in competitive gaming circles, where higher frame rates may yield an advantage over the competition.
Since higher refresh rates normally command higher price tags, many gamers opt for smaller 24-inch and 27-inch displays to keep the price down. Many of these monitors don’t exceed 1080p or 1440p in terms of resolution, though if you have a large budget you can get your hands on ultrawide 240Hz monitors like the Samsung Odyssey G9.
Higher Refresh Rates Mean a More Responsive Screen
A monitor that refreshes at 60Hz is capable of displaying a new image every 1/60 of a second. If you double the refresh rate, you can produce a new image every 1/120 of a second. This relies on your computer or console’s ability to deliver a consistent frame rate, of course.
Higher frame rates mean lower frame times (the time it takes to display a new frame). A 60Hz monitor running at 60fps displays a new frame every 16.667 milliseconds (there are 1,000 milliseconds in a second, and 1000/60 = 16.667). A 120Hz monitor running at 120fps cuts this in half, with a new frame every 8.333 milliseconds.
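The frame-time math above generalizes to any refresh rate. A quick sketch, assuming the display runs at its full refresh rate:

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Time between frames in milliseconds: 1,000 ms divided by the refresh rate."""
    return 1000 / refresh_hz

for hz in (60, 120, 144, 240, 360):
    # e.g. 60Hz -> 16.667 ms, 120Hz -> 8.333 ms, 240Hz -> 4.167 ms
    print(f"{hz}Hz -> {frame_time_ms(hz):.3f} ms per frame")
```

Note how the gains shrink as refresh rates climb: going from 60Hz to 120Hz saves over 8 ms per frame, while going from 240Hz to 360Hz saves less than 1.5 ms.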
Doubling the visible frame rate and halving the frame time makes a perceivable difference in how smooth the action appears on-screen. Not everyone can immediately see or feel the benefit, but most people notice it when they go back to a 60Hz monitor, especially after playing at 144Hz or beyond.
Imagine you’re playing a competitive shooter on a 60Hz monitor. You get feedback on what’s happening on-screen every 1/60 of a second, including any actions you or your competitors make. You also have your monitor’s response time to factor in, which could add a few milliseconds. A 240Hz monitor could theoretically deliver four times as many frames every single second, providing you with more feedback about what’s happening and a smoother playing experience to boot.
The Linus Tech Tips YouTube channel took a look at this phenomenon in their video on the 240Hz rate’s effect on gaming.
There are of course other factors, like how long it takes your computer to process your input and how quickly your GPU can get a new frame ready. The monitor’s refresh rate is just one part of the equation, but it’s also one of the easiest changes you can make in terms of improving the player’s experience.
This is why competitive gamers are so keen to max out their frame rates, even at the expense of graphical fidelity. The more feedback you receive and the more fluid your actions appear on-screen, the better.
Of course, this doesn’t just affect gaming: everything feels better at higher refresh rates. Even dragging windows around your desktop or scrubbing through a timeline in a video editor will be noticeably smoother, with less “wobble” and flicker.
Variable Refresh Rate Is Now Standard
Variable refresh rate (VRR) technology like NVIDIA’s G-SYNC, AMD’s FreeSync, and the HDMI 2.1 VRR standard was developed to eliminate screen tearing. Tearing occurs when the GPU delivers a new frame while the monitor is mid-refresh, so the screen shows part of the old frame and part of the new one, resulting in an unsightly horizontal tear.
By adjusting the monitor’s refresh timing to match the GPU’s output (and repeating frames when necessary), only complete frames are ever shown, and tearing no longer occurs. Fortunately, variable refresh rate technology is now standard on the vast majority of monitors, whether they support high refresh rates or not.
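The difference between a fixed refresh and VRR can be sketched conceptually. This is an illustration of the timing logic only, with made-up numbers, not a model of any real monitor or GPU:

```python
FIXED_REFRESH_MS = 1000 / 60  # a 60Hz monitor's refresh window: ~16.7 ms

def fixed_refresh_result(render_ms: float) -> str:
    """Without VRR: a frame that isn't ready within the refresh window
    either tears (sent mid-refresh) or is held back (with V-Sync on)."""
    if render_ms <= FIXED_REFRESH_MS:
        return "frame ready in time: displayed cleanly"
    return "frame missed the window: tearing (or a repeated frame with V-Sync)"

def vrr_result(render_ms: float) -> str:
    """With VRR: the monitor simply waits and refreshes when the frame is ready,
    so a complete frame is always shown."""
    return f"monitor refreshes after {render_ms:.1f} ms: no tearing"

print(fixed_refresh_result(20.0))  # 20 ms render time misses the 16.7 ms window
print(vrr_result(20.0))            # VRR just refreshes a little later
```

Real VRR implementations operate within a supported range (a monitor might only sync between, say, 48Hz and its maximum refresh rate), which is why frame repetition comes into play at very low frame rates.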
VRR works within your monitor’s maximum refresh rate, such as 120Hz or 240Hz, adjusting the refresh rate on the fly to match your frame rate. Make sure the VRR technology in your monitor matches your graphics card’s capabilities to avoid disappointment.
Choosing a High Refresh Rate Monitor
You should match your monitor’s refresh rate to your computer’s performance. Unless you’re planning on upgrading your computer soon, buying a monitor with a refresh rate that your computer will never achieve could be a waste of money (unless you live for the silky smooth desktop interface).