The TV industry is always changing; we see new and improved technologies being released every year. This may make you think that it is time to upgrade, but you need to be able to make an informed decision. Take HDR, for instance.
Over the past few years, as CRT sets have given way to thinner TVs, and as we have witnessed plasma's rise and fall, terms like HD, Full HD, and Ultra HD have hit the market and begun to confuse consumers. The latest to join this debate is HDR. This article looks at how HDR works on TVs (and, increasingly, on smartphones) and attempts to answer the currently prevalent question: is HDR worth it?
What is HDR?
HDR stands for 'High Dynamic Range.' The term is also used in photography; it is a popular feature on many smartphone cameras, such as the iPhone's, although there it describes a technique for combining multiple exposures rather than a display technology. HDR got big on TVs in 2017, and we soon saw a range of HDR formats in the market. Now, many HDR devices are available, and consumers have many more HDR choices to make. Throughout 2018 and 2019, HDR became even more accessible.
Benefits of Using HDR
Using HDR means you get a greater and more perceptible contrast between the brightest and darkest parts of the image or video on display. HDR exploits the fact that your eyes can perceive a far wider range between bright whites and deep blacks than traditional (and older) SDR (standard dynamic range) TVs can display. In short, HDR attempts to create an image that is closer to what the human eye itself sees.
HDR delivers richer, more sumptuous colors while simultaneously bringing more realism and depth. This matters because you do not lose detail in tricky settings, like a sunset: HDR can preserve the progression from dark to light in ways that SDR simply cannot. That means more fidelity in the shadows, and bright points of light rendered with more detail and color. HDR thus lives up to its promise of being a visual treat.
A 'nit' is the unit used to measure brightness; one nit is one candela per square meter, so it is a measure of absolute luminance. What matters for HDR viewing, however, is the range between a set's peak brightness and its black level. That being said, most HDR sets have a backlight system that can reach about 1,000 nits of peak brightness, or even more in some cases. Compare this to standard HD TVs, which typically offer only around 100 nits, the level that Blu-ray and standard TV content is mastered for.
Also, HD TVs use Rec. 709 (BT.709), a specification typically paired with 8-bit video. HDR upgrades this to 10- or 12-bit Rec. 2020 (BT.2020), a specification able to represent roughly 60 times more color combinations than 8-bit video, with smoother shade gradations. Take these numbers with a grain of salt, though; on their own they only describe a standard defined by the ITU (International Telecommunication Union), not what a particular set actually displays.
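That "roughly 60 times" figure falls out of simple bit-depth arithmetic. Here is an illustrative sketch (the exact counts in practice are smaller, since broadcast video reserves part of each channel's range):

```python
# Number of representable colors for a given per-channel bit depth:
# each of the three channels (R, G, B) has 2**bits levels.
def color_combinations(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

sdr_8bit = color_combinations(8)    # 16,777,216 (~16.7 million colors)
hdr_10bit = color_combinations(10)  # 1,073,741,824 (~1.07 billion colors)

print(hdr_10bit // sdr_8bit)  # prints 64, the source of the "60 times" claim
```

Going from 8 to 10 bits adds 2 bits per channel across three channels, i.e. a factor of 2^6 = 64; 12-bit video would multiply the count by 2^12 = 4,096 instead.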
Another thing worth mentioning is that HDR is not tied to resolution, so you will find HDR-enabled TVs that are 1080p (Full HD) rather than 2160p (Ultra HD). Similarly, there are phones and tablets with HDR displays at a wide range of resolutions.
When it comes to cables, HDR TVs need nothing special. However, if you are watching Ultra HD Blu-ray, then we recommend an 18Gbps ('Premium High Speed') HDMI cable. But unless you have really old HDMI cables, you are probably covered already. The cable does not have to be expensive, either; the basic ones available on Amazon work just fine.
The two popular HDR standards out there right now, and hence the ones that you need to be familiar with, are HDR10 and Dolby Vision. Let's look at how they compare.
HDR10 is the generic form of HDR, which means that it really just refers to the baseline specifications for HDR content. It uses static metadata and is the standard used by the Xbox One S and PS4 Pro, with an HDR firmware update available for all previous PS4 models.
In short, HDR10 is a 10-bit video stream that will be supported on most HDR-compatible devices. It is also a part of the specification defined by the Blu-ray Disc Association for Ultra HD Blu-rays. Moreover, it is HDR10 support that is included in any Ultra HD Premium certification.
Dolby Vision is an alternative (and somewhat better, depending on the user's expectations) HDR standard. What makes it different is that it is designed for an end-to-end HDR process and uses dynamic metadata. This metadata tells the display how bright the picture should be, but where HDR10 provides just one set of values for the whole title, Dolby Vision can do this for every frame.
It can hence deliver 12-bit color depth, and it supports a backlight system four times more powerful than that of standard HDR TV sets. And since a Dolby Vision decoder is not limited to Dolby Vision content, you can also use it to view HDR10 content.
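The static-versus-dynamic distinction can be pictured as a simple data structure. This is an illustrative sketch only, with made-up values; it does not reflect any real decoder's API or the actual field names in the standards:

```python
# HDR10: static metadata -- one set of mastering values that applies
# to the entire title, from first frame to last.
hdr10_metadata = {
    "max_luminance_nits": 1000,
    "min_luminance_nits": 0.05,
}

# Dolby Vision: dynamic metadata -- values supplied per frame (or per
# scene), so the display can tone-map a dim interior and a bright
# sunset differently instead of compromising on one setting for both.
dolby_vision_metadata = [
    {"frame": 0, "max_luminance_nits": 120},  # dim interior scene
    {"frame": 1, "max_luminance_nits": 950},  # bright sunset scene
]
```

With static metadata, a very bright scene and a very dark scene are tone-mapped with the same assumptions; per-frame metadata lets the TV adapt as the content changes.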
Is HDR Really Worth It?
Now you know the essentials of HDR, but whether or not investing in it is worth it will depend not only on your unique needs but on certain practicalities as well. Chief among them: you need an HDR-compatible display to view HDR content, plus an HDR source. You can either buy a 4K Blu-ray player or stream HDR content from the likes of Netflix and Amazon.
Overall, it is worthwhile to research any product you are looking to buy before you actually buy it, especially when it comes to TVs. Remember, though, that the introduction of an official HDR standard in the form of Ultra HD Premium has minimized the danger of buying a rubbish TV that merely claims to be HDR compatible, so that is one less thing for you to worry about!