Display technology has come a long way in the past decade. If you want a television for video games, whether on a next-gen console or a PC, your needs are quite different from those of the average shopper.
The importance of HDMI 2.1
The next generation of high-end consoles and graphics cards has arrived. Sony and Microsoft are battling it out with the PlayStation 5 and Xbox Series X, both of which feature HDMI 2.1 ports. NVIDIA has also launched its RTX 30-series cards with full HDMI 2.1 support.
So what’s the deal with this new standard?
The High Definition Multimedia Interface (HDMI) allows your TV to connect to consoles, Blu-ray players, and many PC graphics cards. HDMI 2.0b tops out at 18 Gbps bandwidth, which is sufficient for 4K content at 60 frames per second.
HDMI 2.1 allows speeds of up to 48 Gbps. That's enough bandwidth for 4K at 120 fps (with HDR) or 8K at 60 fps. It also supports uncompressed audio and a host of other features, such as variable refresh rate (VRR) and auto low-latency mode (ALLM), to minimize input lag.
Keep in mind, however, that HDMI 2.1 is only worthwhile if a TV has a 120Hz panel. Some TVs, like the Samsung Q60T, advertise support for HDMI 2.1 but only have a 60Hz panel. That means you can't enjoy 120 fps gameplay, because the screen can only display 60 frames per second.
Do you need all that extra bandwidth? If you want to get the most out of the new consoles, you do. However, it's unclear how many next-gen games will support 4K resolution at 120 fps. Microsoft has announced that a handful of Xbox Series X titles will support 4K/120, including the multiplayer component of Halo Infinite (delayed until 2021), the eye-candy platformer Ori and the Will of the Wisps, Dirt 5, and Gears 5.
Most games from the previous generation ran at 30 fps, including big-budget first-party titles, like The Last of Us Part II, and third-party mainstays, like Assassin's Creed. Microsoft improved on this with the Xbox One X by optimizing some games to run at 60 fps instead.
Both the PS5 and Xbox Series X will target 4K at 60 fps as a baseline. If you want your purchase to stand the test of time, buy a 120Hz display with HDMI 2.1 compatibility, even if its ports cap out at 40 Gbps (as on some 2020 LG and Sony TVs and receivers). 40 Gbps is sufficient for a 4K signal at 120 fps with full 10-bit HDR support.
NVIDIA has also unlocked 10-bit support on its 30-series cards. This allows 40 Gbps displays to handle 4K at 120 fps with 10-bit color, without resorting to chroma subsampling (i.e., discarding some color information to save bandwidth).
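As a rough sanity check of these bandwidth figures, here's a back-of-the-envelope calculation in Python. It multiplies pixels, frame rate, and bits per pixel, and deliberately ignores blanking intervals and link-encoding overhead, so real HDMI requirements are somewhat higher; treat it as a ballpark sketch, not a spec calculation.

```python
def bandwidth_gbps(width, height, fps, bits_per_channel, channels=3):
    """Raw uncompressed video bandwidth in gigabits per second."""
    bits_per_second = width * height * fps * bits_per_channel * channels
    return bits_per_second / 1e9

# 4K at 60 fps, 8-bit color: comfortably within HDMI 2.0b's 18 Gbps
print(round(bandwidth_gbps(3840, 2160, 60, 8), 1))    # 11.9

# 4K at 120 fps, 10-bit HDR: fits a 40 Gbps HDMI 2.1 port
print(round(bandwidth_gbps(3840, 2160, 120, 10), 1))  # 29.9

# 8K at 60 fps, 10-bit: exceeds even 48 Gbps uncompressed, which is
# why 8K signals rely on compression or chroma subsampling
print(round(bandwidth_gbps(7680, 4320, 60, 10), 1))   # 59.7
```

The numbers line up with the article's claims: 4K/120 with 10-bit color needs roughly 30 Gbps of raw pixel data, so a 40 Gbps port has headroom even after overhead.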
If you're sticking with your PlayStation 4 or Xbox One for a while, or don't need 120 fps gameplay, HDMI 2.0b is fine for now. It's also plenty if you get the cheaper Xbox Series S, which targets 1440p rather than full 4K.
Over the next few years, more and more models will support HDMI 2.1, and you’ll have more options, which means more opportunities to save money.
Variable refresh rate, auto low-latency mode, and quick frame transport
Some of the new HDMI 2.1 features are also available through the older HDMI 2.0b standard and have been implemented on TVs that do not explicitly support HDMI 2.1.
Variable refresh rate (VRR, or HDMI VRR) is a technology that rivals NVIDIA G-Sync and AMD FreeSync. While those are primarily intended for PC gamers, HDMI VRR is aimed at consoles. Currently, only Microsoft has committed to the feature for the Xbox Series X and S, but the PlayStation 5 is expected to support it as well.
VRR is designed to prevent screen tearing, an unsightly side effect of a console that can't keep up with the screen's refresh rate. If the console isn't ready to send a full frame, it sends a partial one instead, producing a visible "tear" across the image. When the refresh rate is in sync with the frame rate, tearing is virtually eliminated.
Auto Low Latency Mode (ALLM) is a smart way of turning off image processing to reduce latency while playing games. When the TV detects an ALLM signal, it automatically disables features that may introduce lag. With ALLM, you don't have to remember to switch to Game Mode to get the best performance.
Quick Frame Transport (QFT) works alongside VRR and ALLM to further reduce latency and screen tearing. QFT transmits each frame from the source at a higher rate than older HDMI standards, making games more responsive.
All devices in the HDMI chain, including AV receivers, must support these features for them to work.
Let’s talk about latency
Latency is the time it takes for the display to react to your input. For example, if you press a button on the controller to jump, the latency is the time it takes for your character to jump on screen. Lower latency can give you an edge in competitive multiplayer games or make fast-paced single-player games more responsive.
This delay is measured in milliseconds. In general, a latency of 15 ms or less is imperceptible. Some high-end TVs get this down to around 10 ms, but anything below 25 ms is usually good enough. How much this matters depends entirely on the type of games you play.
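One way to put these millisecond figures in perspective is to convert them into frames of delay at a given refresh rate. This short sketch (an illustration, not a formal measurement method) shows why the same 15 ms costs more at 120 Hz than at 60 Hz:

```python
def latency_in_frames(latency_ms, refresh_hz):
    """How many display refresh cycles a given input latency spans."""
    frame_time_ms = 1000 / refresh_hz  # duration of one refresh cycle
    return latency_ms / frame_time_ms

# 15 ms is just under one frame at 60 Hz (each frame lasts ~16.7 ms)...
print(round(latency_in_frames(15, 60), 2))   # 0.9

# ...but nearly two frames at 120 Hz (each frame lasts ~8.3 ms)
print(round(latency_in_frames(15, 120), 2))  # 1.8
```

In other words, the faster the panel, the more a few milliseconds of processing delay matters in terms of on-screen frames.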
Response time refers to how quickly pixels change. It's the time it takes a pixel to switch from one color to another, typically quoted as "gray-to-gray" performance. It's also measured in milliseconds, and it's not unusual for high-end displays to have a pixel response of 1 ms or better. OLED displays, in particular, have an almost instantaneous response time.
Many high-end and flagship TVs have good latency and response times. Budget TVs can be hit or miss, so be sure to do your research before you buy. RTINGS tests latency and ranks every model it reviews by input lag, if you want to see how the one you're considering compares.
FreeSync and G-Sync
Variable refresh rate technology eliminates screen tearing by matching the monitor's refresh rate to the frame rate of the source; on a PC, that source is the graphics card (GPU). Nvidia and AMD each have proprietary technologies that address this problem.
G-Sync is Nvidia's variable refresh rate technology, and it requires a dedicated hardware chip in the display. It only works with Nvidia graphics cards, however. If you have an Nvidia GTX or RTX card that you want to use with your new TV, make sure the TV supports G-Sync.
There are currently the following three levels of G-Sync:
G-Sync: Provides tear-free play in standard dynamic range (SDR).
G-Sync Ultimate: Designed for use with HDR brightness up to 1000 nits.
G-Sync Compatible: Displays that lack the dedicated chip, but have been validated to work with standard G-Sync.
FreeSync is AMD's equivalent technology and works with AMD's Radeon family of GPUs. There are three tiers of FreeSync, as well:
FreeSync: Removes screen tearing.
FreeSync Premium: Adds low framerate compensation (LFC) to smooth out dips in frame rate. It requires at least a 120Hz display at 1080p or higher.
FreeSync Premium Pro: Adds support for HDR content (at least 400 nits of brightness).
Many TVs that support G-Sync also work with FreeSync (and vice versa). Currently, very few TVs explicitly support G-Sync; LG's flagship OLED lineup is among them. FreeSync is cheaper to implement because it doesn't require any additional hardware, so it's widely used on more affordable displays.
Since AMD makes the GPUs in both the Xbox Series X/S and the PlayStation 5, FreeSync support might be more important for console gamers this generation. Microsoft has confirmed FreeSync Premium Pro support (in addition to HDMI VRR) for the Xbox Series X and S, but it's unclear what Sony is using.
Think about where you will be playing
There are currently two main types of panels on the market: LED-lit LCD screens (including QLEDs) and self-emissive OLEDs. LCD panels can get much brighter than OLEDs, because OLED is a self-emissive organic technology that is more susceptible to permanent image retention at high brightness.
If you'll be gaming in a very bright room, you may find that an OLED just isn't bright enough. Most OLED panels are subject to automatic brightness limiting (ABL), which reduces overall screen brightness in bright scenes. LCD panels aren't affected by this and can get much brighter.
If you mainly play during the day in a room full of windows and ambient light, an LCD screen might be the better choice. In a light-controlled room at night with subtle lighting, however, an OLED will give you the best picture quality.
Typically, OLEDs provide excellent picture quality thanks to their (theoretically) infinite contrast ratio. QLED models (LED-lit LCD screens with a quantum-dot film) have a higher color volume, which means they can display more colors and get brighter. It's up to you to decide which best suits your budget and gaming environment.
OLED panels are susceptible to permanent image retention, or "burn-in." It's caused by static content, such as dashboards or TV channel logos, remaining on screen for extended periods of time. For gamers, this also applies to HUD elements, such as health bars and mini-maps.
For most people, this won't be a problem. If you vary the content you watch, you probably won't experience burn-in. Likewise, if you play a variety of games, it won't be much of an issue.
Burn-in might be a concern for people who play the same game for months on end, especially if it's heavy on HUD elements. One way to reduce the risk is to enable HUD transparency or turn the HUD off entirely. Of course, that's not always possible or desirable.
Many OLED TVs now include burn reduction measures, like LG’s Logo Luminance feature, which darkens the screen when static content is displayed for two minutes or more. This should help keep burn-in at bay.
For PC gamers who use a TV as their monitor (with taskbars and desktop icons on the screen), an OLED probably isn't the best choice. Any static image presents a burn-in risk. Unless you're using the screen exclusively for games or movies, you might want to consider a high-end LCD instead.
Not all burn-in is noticeable in actual use. Many people only discover it when running test patterns, such as solid color slides. Unfortunately, most warranties, particularly manufacturers' warranties, don't cover burn-in. If you're concerned and still want an OLED, consider getting an extended warranty from a store like Best Buy that explicitly covers the issue.
HDR, the HDR Gaming Interest Group, and Dolby Vision
HDR gaming is set to become mainstream with the release of the PlayStation 5 and Xbox Series X/S. With both platforms supporting HDR in one form or another, you'll want to make sure your next TV is at least HDR10 compatible, in order to get richer, brighter, and more detailed images.
The HDR Gaming Interest Group (HGIG) was formed with the goal of standardizing HDR gaming through the HGIG format. Games must be certified for HGIG support. The format is expected to take off with the arrival of next-gen games, so it’s probably worth looking for an HDR TV that supports HGIG.
Xbox Series X and S will also support Dolby Vision HDR, which is yet another format. Unlike HDR10, which uses static metadata, Dolby Vision uses dynamic scene-by-scene metadata. Currently, content mastered in Dolby Vision can reach a maximum brightness of 4,000 nits, although no mainstream display can yet reach these levels.
To use Dolby Vision on your new Xbox, you must have a compatible TV. Manufacturers like LG, Vizio, Hisense, and TCL all produce TVs with Dolby Vision support. Samsung, however, has shunned the format in favor of HDR10+. If you're buying a next-gen Xbox, keep in mind that games will need to explicitly support Dolby Vision.
The next generation of games
It's been an eventful year for most of us, so the arrival of next-gen consoles and graphics cards is even more exciting than usual. Now isn't a bad time to upgrade your TV either, especially if you've put off buying a 4K set until now.
The price of LG OLEDs has dropped significantly in recent years. Quantum-dot films are now found in budget $700 LCD sets, which means you can get a bright, colorful picture without spending thousands.
Soon there will be even more price cuts, more 120Hz panels, mini-LED televisions, and the widespread adoption of HDMI 2.1.