High dynamic range (HDR) video is taking off. Some of your favorite movies are already available with enhanced color and brightness, and look even better than their original theatrical releases.
But some remasters have caused critics to cry foul, sparking a debate over technical ability and artistic intent.
What are the advantages of HDR?
Before considering whether the term “fake HDR” is even justified, it’s important to understand what HDR video is. As the name suggests, high dynamic range video has increased dynamic range compared to standard dynamic range (SDR) content.
Dynamic range is the amount of information visible in an image or video between the brightest highlights and the deepest shadows. Modern HDR video is delivered at 10 bits per channel, as opposed to eight bits per channel in SDR. This means SDR can display 256 shades of each primary color, while HDR can display 1,024.
This puts more color information on the screen, closer to what we would see in real life. More shades of each color also make the unsightly banding "stripes" in gradients less noticeable. The difference is most apparent in fine details, such as skies, clouds, or other areas with subtle variations in color.
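The shade counts above follow directly from bit depth; a quick sketch of the arithmetic, per channel and for the full RGB picture:

```python
# Number of distinct shades per color channel is 2 raised to the bit depth.
def shades_per_channel(bit_depth: int) -> int:
    return 2 ** bit_depth

sdr_shades = shades_per_channel(8)    # 256 shades of each primary
hdr_shades = shades_per_channel(10)   # 1,024 shades of each primary

# Total displayable colors: one shade count per channel (red, green, blue).
sdr_colors = sdr_shades ** 3   # about 16.7 million colors
hdr_colors = hdr_shades ** 3   # about 1.07 billion colors
```

Four times as many shades per channel works out to 64 times as many total colors, which is why gradients look so much smoother.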
HDR also raises peak luminance, or brightness. The vast majority of HDR-compatible televisions support the baseline HDR10 standard, which specifies that content should be mastered at 1,000 nits, as opposed to the traditional 100 nits (recently revised to around 200) for standard dynamic range content.
This means that bright objects, like the sun, a flashlight, or a muzzle flash, can really pop when viewed on an HDR-compatible display. The extra brightness brings these elements much closer to how they appear in real life, creating a more immersive viewing experience.
HDR video is something you have to see to really appreciate, but its improvement over SDR can be huge.
What is “fake HDR”?
The term "fake HDR" has been thrown around on YouTube, Reddit, and other platforms following a few high-profile Blu-ray releases. It refers to studios mastering their HDR releases at peak brightness levels too low to make images truly stand out.
According to Vincent Teoh, a professional display calibrator and reviewer, Star Wars: The Last Jedi's 4K Blu-ray hits a peak brightness of 250 nits, with the sun rated at just 200.
Teoh also found that the Blade Runner 2049 4K Blu-ray barely exceeds 200 nits, making it “an SDR movie in an HDR container”.
These HDR versions still use 10-bit color depth (12-bit in some cases), so they still provide a higher-quality picture than SDR. However, because they lack the bright highlight peaks found in many other productions, some perceive these versions as "fake HDR".
As another benchmark, a super-bright LCD display like the Vizio P-Series Quantum X can achieve a peak brightness of over 2,000 nits. Even LG's relatively "dark" OLED panels handle around 700 nits. Some Blu-ray critics and collectors believe these "fake HDR" releases are hampered by disappointing peak brightness.
This doesn't mean these movies look bad; the image simply doesn't "jump" off the screen the way it does in other releases. Since these are major releases from some of Hollywood's biggest studios, it's clear that the colorists and directors knew exactly what they were doing. The restrained use of HDR highlights is intentional.
Whether this validates the term “fake HDR”, however, remains a matter of opinion. The Blu-ray packaging doesn’t include any information about maximum luminance, and most buyers wouldn’t understand the terminology anyway.
So moviegoers have to rely on critics like Teoh, who have access to HDR mastering tools, to get the whole story.
HDR standards and creative intent
Two factors have contributed to the situation described above: the technical limitations of modern screens, and creative intent.
HDR video has yet to be fully standardized. The closest thing to a baseline standard is HDR10, which now enjoys broad support from TV manufacturers and movie studios. While HDR10 content is meant to be mastered at a peak brightness of 1,000 nits, not all TVs can reach those levels.
A screen that cannot hit those high targets will tone-map an image that exceeds its capabilities. Bright elements will still have an impact, thanks to the contrast between highlights and shadows. But directors are then relying on every display's ability to map tones correctly, which adds an element of risk: will every screen get it right?
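Every TV implements tone mapping differently, and the real curves are proprietary. As a rough illustration only, a simple knee-and-compress curve might work like this; the knee point and linear roll-off here are arbitrary choices for the sketch, not any manufacturer's actual algorithm:

```python
def tone_map(nits: float, display_peak: float, content_peak: float = 1000.0) -> float:
    """Pass brightness values through unchanged up to a knee point,
    then squeeze everything above it into the display's remaining headroom.
    (Toy model: real TVs use far more sophisticated curves.)"""
    knee = display_peak * 0.75  # arbitrary knee at 75% of the display's peak
    if nits <= knee:
        return nits
    # Compress the range [knee, content_peak] into [knee, display_peak].
    excess = nits - knee
    available = display_peak - knee
    span = content_peak - knee
    return knee + available * (excess / span)
```

On a hypothetical 400-nit TV, a 200-nit highlight passes through unchanged, while a 1,000-nit sun gets squeezed down to the panel's 400-nit ceiling; detail in the brightest highlights survives, but their punch is reduced.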
The alternative is to master your movie so that it does not exceed the capabilities of most screens. An image graded more conservatively, with bright elements limited to 200 or 300 nits, will appear less punchy and vibrant, but you get a fairly consistent picture across a wide range of screens.
The Wild West of HDR standards has also created a format war between competing technologies, such as Dolby Vision and HDR10+. These newer HDR standards use dynamic metadata to help TVs adjust on a per-scene or frame-by-frame basis. The older HDR10 standard lacks dynamic metadata, so your TV has to decide for itself.
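To see why dynamic metadata matters, consider a deliberately simplified sketch: with static metadata, the TV tone-maps every scene against the whole movie's peak brightness, while per-scene metadata lets scenes that already fit the display pass through untouched. The scene values and the linear compression below are invented for illustration, not taken from any real standard:

```python
def compress(nits: float, scene_peak: float, display_peak: float) -> float:
    # Toy model: scale linearly only when the scene exceeds the display.
    if scene_peak <= display_peak:
        return nits
    return nits * display_peak / scene_peak

scenes = [("dim interior", 150.0), ("sunlit desert", 1000.0)]
display_peak = 400.0

# Static metadata (plain HDR10): every scene is judged against the
# whole movie's peak, so the dim scene gets darkened unnecessarily.
movie_peak = max(peak for _, peak in scenes)
static = [compress(peak, movie_peak, display_peak) for _, peak in scenes]

# Dynamic metadata (HDR10+ / Dolby Vision style): each scene carries
# its own peak, so the dim scene is left alone.
dynamic = [compress(peak, peak, display_peak) for _, peak in scenes]
```

In this toy example, static tone mapping drags the 150-nit interior down to 60 nits, while per-scene metadata preserves it at 150; both approaches squeeze the 1,000-nit desert to the display's 400-nit peak.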
Then there is the issue of creative intent. Some directors may simply decide they don't want to use HDR to dazzle viewers with bright highlights. For these professionals, the benefits of HDR lie in color volume and accuracy, not the additional luminance offered by the latest televisions. It should be noted, however, that many directors do use HDR and peak brightness to their fullest extent.
However, it's hard to argue with someone's creative vision. Black-and-white films were still produced long after color became the norm. Some directors still shoot on 35mm film or in a 4:3 aspect ratio.
Are these decisions wrong? Are viewers wrong to wonder what a movie would look like if it had been shot with all the technical bells and whistles available at the time of its making?
Food for thought, indeed!
Movies that are definitely HDR
If a movie has been released on Blu-ray in HDR10, Dolby Vision, or a competing format, that's about as good as it gets until the studio decides it's time for a remaster. If you are upgrading from regular DVDs or Blu-rays, the jump to 4K and a wider color gamut is always a good incentive.
Choosing your favorite movies according to their technical specifications is like choosing your favorite books according to the typeface. It can certainly impact the overall layout, but the underlying story, dialogue, and other elements remain the same and are just as enjoyable.
If you buy Blu-rays for their HDR capabilities, you might want to save your money and steer clear of the ones that don't meet your expectations. Unfortunately, few people have access to the professional tools Teoh uses, so reliable information is scarce at this point.
For now, you'll just have to enjoy the "good" HDR productions, like Mad Max: Fury Road (nearly 10,000 nits), The Greatest Showman (over 1,500 nits), and Mulan on Disney+ (over 900 nits).
Are you looking for a new TV on which to watch your HDR movies? Watch out for these six common mistakes.