60Hz vs. 120Hz: Can You Really Tell the Difference?


Higher-end display options in the form of 120 and 240Hz screens have become increasingly affordable and commonplace in recent years. So given a choice between a 60Hz, 120Hz, and 240Hz display—which one should you choose, and does it even matter?

TV and Monitor Jargon Explained

First, let’s get the jargon out of the way.

Hertz, abbreviated Hz, is a unit of frequency. In the context of display technology, it describes how many times your screen refreshes each second. A higher number means new information reaches your screen sooner, allowing you to see and respond to changes more quickly.

Another important metric is FPS, or frames per second.

As its name suggests, FPS measures the number of frames delivered to the display each second. Since a video is essentially a series of pictures (or frames), a higher FPS can result in a smoother experience. This is especially true in scenarios where there’s fast-paced motion or you’re manipulating objects on the screen, like gaming or scrolling through websites.

Most movies and TV shows are shot at 24FPS, which means that for film playback alone, you technically don't need a display that refreshes more than 24 times per second. Computers, however, almost universally output at 60Hz—making that the bare minimum display manufacturers deliver these days.
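The relationship between refresh rate and responsiveness is easiest to see as frame time—how long each refresh cycle lasts. A minimal sketch (the `frame_time_ms` helper is just for illustration):

```python
# Frame time: the interval between screen refreshes at a given refresh rate.
def frame_time_ms(hz: float) -> float:
    """Milliseconds between refreshes for a display running at `hz`."""
    return 1000.0 / hz

for rate in (60, 120, 240):
    print(f"{rate}Hz -> {frame_time_ms(rate):.2f} ms per frame")
# 60Hz  -> 16.67 ms per frame
# 120Hz -> 8.33 ms per frame
# 240Hz -> 4.17 ms per frame
```

Note the diminishing returns: going from 60Hz to 120Hz shaves off about 8.3ms per frame, while going from 120Hz to 240Hz only saves another 4.2ms—which is part of why the first jump is so much easier to notice.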

60Hz vs. 120Hz: Can You Tell the Difference?

The best way to know if you can tell the difference between 60Hz and 120Hz is to compare them in quick succession. If you don't own a high refresh rate display yet, though, that may not be possible. Still, you can try Blur Busters' UFO test to see the difference between 30FPS and 60FPS. Do note, however, that the jump from there to 120FPS will not necessarily be as stark.

Blind tests have shown that the average user is likely to notice a perceptible difference—at least in gaming-related applications. A study conducted by Hardware.info all the way back in 2013 found that an overwhelming majority of gamers (nearly 9 out of 10) were able to distinguish between 60Hz and 120Hz.

In 2019, Nvidia also found a positive correlation between higher refresh rates and player performance. As a graphics hardware manufacturer, the company does have a vested interest in arriving at this conclusion. That said, independent tests of the same nature have found similar results.


In games, it’s clear that going from a 60Hz output up to 120Hz is extremely noticeable, but going much beyond that can be difficult to distinguish. Unless you’re a professional esports player, chances are you will be just as satisfied with a 120Hz or 144Hz display as with a more expensive 240Hz one. Either will likely be a much better experience than a 60Hz display.

60Hz vs. 120Hz: Distinguishable in Non-Gaming Scenarios?

As with any new technology, high refresh rate panels were extremely difficult to manufacture when they first appeared. For years, the only way to get a good high refresh rate experience was to pay a steep premium for a top-of-the-line gaming monitor.

These days, though, the manufacturing processes and technology have become widespread enough that high refresh rate displays can be found on other consumer electronics, including smartphones, laptops, and even tablets.

Apple was one of the first companies to adopt high refresh rates on mobile hardware. Its iPad Pro lineup has had 120Hz displays since 2017, under the company’s ‘ProMotion’ branding. While Apple hasn’t marketed the technology heavily outside of its press events, reviewers and consumers alike have praised its addition. In the years since, high refresh rate displays have become ubiquitous on smartphones—even mid-range ones.

Discerning users can almost immediately notice a difference after switching to a higher refresh rate display. Smartphone reviewers have even stated that 90Hz and 120Hz displays are “integral to… a speedy user experience.”

However, not all high refresh rate experiences are created equal. While the tech is quite easy to find these days, it still requires competent hardware to deliver fluid experiences.

For instance, at the very low end of the smartphone spectrum, you’re unlikely to notice the benefit of a high refresh rate display as much, because the processor will struggle to keep up in more demanding scenarios. In these cases, you’re better off purchasing a phone equipped with a better processor.

Similarly, if your computer struggles to deliver a consistent 60FPS in games, buying a 120Hz display isn’t going to improve your experience dramatically. You’d be much better off just fixing the root cause by upgrading your graphics card, processor, or other aspects of your build.
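The logic here can be reduced to a simple rule of thumb: the rate you perceive is capped by whichever is lower, the GPU's frame rate or the panel's refresh rate. A simplified sketch (it ignores variable refresh rate and tearing, and `effective_rate` is a hypothetical helper):

```python
# The display can only show frames it actually receives, so the perceived
# rate is bounded by both the GPU's output and the panel's refresh rate.
def effective_rate(gpu_fps: float, display_hz: float) -> float:
    """Upper bound on the frame rate the viewer actually sees."""
    return min(gpu_fps, display_hz)

print(effective_rate(45, 120))   # 45 — the GPU, not the panel, is the bottleneck
print(effective_rate(200, 60))   # 60 — here a faster panel would actually help
```

In the first case, upgrading the graphics card raises the ceiling; in the second, upgrading the display does.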

Related: How to Fix Low Game FPS in Windows

Sony and Microsoft Bring 120Hz to the Masses

For several years, gaming consoles offered a standard 60Hz output. Even then, the vast majority of games only managed to deliver half as many frames per second.

This is because, unlike gaming PCs and enthusiast-grade hardware, consoles are often sold at slim margins or even at a loss. Console manufacturers have to keep the upfront cost affordable. As a result, they have historically shipped with limited hardware capabilities—leaving game developers to meet a baseline performance target.

Over the past few generations of consoles, most games targeted 30FPS—unless you chose to sacrifice visual fidelity for the increased frames. Still, the most recent revisions of the PS4 and Xbox One series of consoles came close to delivering true 60FPS output in several games.

Now, with the launch of the PS5 and Xbox Series X, both Sony and Microsoft have committed to delivering experiences that go beyond even 60Hz. Both consoles support the new HDMI 2.1 standard, which means they have enough video output bandwidth to deliver 4K resolutions at 120Hz.
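A back-of-the-envelope calculation shows why the newer standard matters. Uncompressed 24-bit 4K video at 120Hz needs roughly 24 Gbit/s of raw pixel data—comfortably more than HDMI 2.0's roughly 14.4 Gbit/s of effective bandwidth, but well within HDMI 2.1's roughly 42 Gbit/s (those link figures are approximate, and real links carry extra overhead for blanking intervals and encoding that this sketch ignores):

```python
# Rough raw pixel data rate for uncompressed video (illustrative helper,
# not a spec-accurate tool: blanking and encoding overhead are ignored).
def bandwidth_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    """Pixel data rate in Gbit/s for the given resolution and refresh rate."""
    return width * height * hz * bits_per_pixel / 1e9

print(f"4K @ 60Hz:  {bandwidth_gbps(3840, 2160, 60):.1f} Gbit/s")   # ~11.9
print(f"4K @ 120Hz: {bandwidth_gbps(3840, 2160, 120):.1f} Gbit/s")  # ~23.9
```

4K at 60Hz squeezes under the HDMI 2.0 ceiling; doubling the refresh rate pushes it well past, which is why 4K at 120Hz requires HDMI 2.1.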

Related: PS5 vs. Xbox Series X: The Battle of the Specs

Simple compatibility does not guarantee that most games will output at 120FPS on these consoles, similar to how most games did not deliver 60FPS in the past. However, a multitude of previous-gen games already run at 120FPS. This is largely due to the massive increase in hardware performance of this console generation.

If you own one of these consoles, you’re essentially leaving performance on the table by not pairing it with a high refresh rate display. If your TV or monitor is a few years old, chances are it does not support the latest HDMI 2.1 spec and will fall back to displaying 4K at 60Hz.

So What Display Should You Buy?

At the end of the day, the decision between a 60Hz and 120Hz display depends on your use case. If you have a high-end gaming computer or one of the latest generation consoles, the decision is fairly straightforward. All evidence suggests that a 120Hz screen will deliver an immediate and significant improvement to your experience.

However, for basic office tasks or web browsing, the difference will be extremely hard to spot. In these situations, you’d probably be better off purchasing a brighter or higher resolution screen instead. For movies and TV shows, on the other hand, consider a High Dynamic Range (HDR)-equipped display.
