What is HDR?

High Dynamic Range, or HDR, is one of the most exciting display technologies to arrive in recent years. Alongside higher refresh rates and resolutions, it can have a dramatic effect on how the TV shows, movies, and games you watch and play look and feel. Where a display's traditional dynamic range describes the extremes of brightness and contrast it can reproduce, high dynamic range expands those extremes, giving TVs and monitors access to much more vibrant colors, much brighter whites, and much darker blacks.

At its most basic level, HDR just makes things look better. Bright areas of the screen look brighter, dark areas look darker, and colorful areas look more vibrant. HDR-capable displays can reproduce richer reds, greens, and blues, with more subtle gradations in between. And this doesn't just mean shadowed areas are extra dark and highlights are extra bright; it means more detail can be discerned in those areas, too.

All of this means transmitting a lot more data from the source material and down the length of the cable. That's why the latest cable standards, like HDMI 2.1 and DisplayPort 2.0/2.1, are rated to transmit HDR data alongside high-resolution, high-refresh-rate signals, while older cable designs cannot.

You’ll also find support for newer, more advanced HDR standards mostly on high-end TVs and monitors, although some of those are filtering through to more affordable TVs and monitors as the technology matures.

What Does HDR Mean?

HDR stands for High Dynamic Range, and it is distinguished from standard dynamic range, or SDR, by its use of a wider gamma curve: the information embedded in an image or video that tells the display how bright it should be at any given point.

SDR's conventional gamma curve is based on the limits of cathode ray tube (CRT) technology, the mainstay display technology throughout much of the 20th century. With newer display technologies now powering the most capable monitors and televisions, high dynamic range is possible.
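
To make the idea of a gamma curve concrete, here is a minimal Python sketch comparing a conventional SDR decode (assuming a roughly 100-nit reference display and a simple power-law gamma, both illustrative assumptions) with the PQ curve from SMPTE ST 2084 that most HDR formats build on, which maps the same 0-1 signal onto a range of up to 10,000 nits.

# How the same 0-1 video signal maps to screen brightness (in nits) under
# a simple SDR gamma curve versus the HDR "PQ" curve (SMPTE ST 2084).

def sdr_gamma_to_nits(signal, peak_nits=100.0, gamma=2.4):
    """Simple SDR decode: power-law gamma scaled to an assumed 100-nit display."""
    return peak_nits * (signal ** gamma)

def pq_to_nits(signal):
    """PQ EOTF (SMPTE ST 2084): maps a 0-1 signal onto 0-10,000 nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

for s in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {s:.2f}: SDR ~{sdr_gamma_to_nits(s):7.1f} nits, PQ ~{pq_to_nits(s):7.1f} nits")

Run it and you'll see that the top of the signal range is reserved for far brighter highlights under PQ than under the SDR curve.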

Like the term "HD," HDR has its own liberal and colloquial uses, which can make it tricky to nail down what HDR actually means for real-life displays. To fix that, industry organizations came up with their own standards for certifying displays as HDR-capable. In almost all cases, certification comes down to the ability to display a wide gamut of colors and to reach a high brightness level and contrast ratio, so that colors and whites really pop and blacks look deep and inky.

So what is HDR, and what does it mean for you? It depends on who you ask.

VESA's DisplayHDR 400 through DisplayHDR 1000 tiers certify that a display can handle HDR with a peak brightness, in nits, matching the number in the name, while HDR10 and Dolby Vision have slightly different specifications. HDR10 specifies 10-bit color alongside high brightness and strong contrast, whereas Dolby Vision can handle up to 12-bit color for significantly broader color support.
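
To put those bit depths in perspective, the number of colors a signal can describe grows very quickly with each extra bit per channel. The quick Python arithmetic below is just that, arithmetic; it says nothing about what any particular panel can physically reproduce.

# Distinct shades per channel and total R/G/B combinations by bit depth.
for bits in (8, 10, 12):         # typical SDR, HDR10, and Dolby Vision signal depths
    per_channel = 2 ** bits      # shades of red, green, or blue
    total = per_channel ** 3     # every combination of the three channels
    print(f"{bits}-bit: {per_channel:,} shades per channel, about {total:,} colors")

That works out to roughly 16.7 million colors at 8-bit, 1.07 billion at 10-bit, and 68.7 billion at 12-bit.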

A successor to HDR10, HDR10+ also implements dynamic metadata. Both it and Dolby Vision can adjust the strength of HDR's color and contrast enhancement on a frame-by-frame basis, making for a much richer image regardless of the content.
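
To picture the difference between static and dynamic metadata, here is a loose, hypothetical sketch. HDR10 carries one set of static values for the whole program (fields such as MaxCLL and MaxFALL), while HDR10+ and Dolby Vision can carry per-scene adjustments. The Python structures below are simplified illustrations, not the actual bitstream formats.

from dataclasses import dataclass, field

@dataclass
class StaticHDRMetadata:
    """HDR10-style static metadata: one set of values for the whole program."""
    max_cll: int   # Maximum Content Light Level, in nits
    max_fall: int  # Maximum Frame-Average Light Level, in nits

@dataclass
class SceneMetadata:
    """Simplified per-scene tone-mapping hint (illustrative only)."""
    start_frame: int
    peak_nits: int
    average_nits: int

@dataclass
class DynamicHDRMetadata:
    """HDR10+/Dolby Vision-style metadata: values can change scene by scene."""
    scenes: list = field(default_factory=list)

movie_hdr10 = StaticHDRMetadata(max_cll=1000, max_fall=400)
movie_dynamic = DynamicHDRMetadata(scenes=[
    SceneMetadata(start_frame=0, peak_nits=200, average_nits=80),      # dim interior
    SceneMetadata(start_frame=4320, peak_nits=1000, average_nits=300), # sunlit exterior
])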

What HDR Formats Are There?

There are four main HDR formats used in modern film and television, each supported by a wide range of content and compatible displays:

HDR10 – One of the older and simpler HDR standards, HDR10 is an open standard that provides static HDR metadata, so the HDR parameters of the content cannot be changed scene by scene in the way more modern HDR standards allow. It can still deliver excellent brightness and more vibrant colors than SDR, though it is not backward compatible with SDR displays.

Dolby Vision – The first commercial HDR standard, Dolby Vision was launched in 2014 and offers dynamic HDR metadata, so it can provide new parameters on a scene-by-scene basis. This allows a display to adjust the brightness or contrast of a scene as defined by the creators of the film or TV show it's displaying, making for a much more nuanced HDR picture. A more advanced version, called Dolby Vision IQ, debuted in 2020 and also accounts for ambient light in the adjustments made to the on-screen picture. Although advanced, Dolby Vision is proprietary, so some companies, like Samsung, don't support it at all.

HDR10+ – Designed by Amazon and Samsung as a successor to HDR10 and more of a competitor to Dolby Vision, HDR10+ offers dynamic, scene-by-scene HDR metadata for much more nuanced and expansive HDR content. As an open, royalty-free standard, HDR10+ has become a default HDR format for HDMI 2.1 connections, although on compatible displays and source material, Dolby Vision can still be selected instead.

HLG – Hybrid log-gamma, or HLG, is an HDR format developed by the UK's BBC and Japan's NHK, designed to bring HDR to broadcast TV. It's backward compatible with SDR UHD TVs, but it won't work with older SDR displays. It's used by streaming services like BBC iPlayer, YouTube, and Freeview Play.

For the most part, the HDR standards share similar specifications, with comparable support for various resolutions, color bit depths, and brightness levels. However, you'll notice a greater effect from any of them on displays that have the raw capability to leverage it: displays with higher brightness and greater contrast will show far more vibrant colors and a starker difference between light and dark than TVs with more limited brightness.

TVs that use Mini-LED backlights with many local dimming zones, or OLED TVs and monitors that can turn off individual pixels, will also avoid the blooming effect that can occur on some displays in scenes with particularly stark contrast between bright and dark areas.

What Do You Need For HDR?

HDR is another bandwidth-consuming component of a video signal, just like resolution, refresh rate, and chroma subsampling. Increasing one component will likely require you to decrease the quality of another, especially when you're pushing a high-resolution 4K monitor. How much video quality you can carry comes down to bandwidth, and that in turn determines which standard of HDMI or DisplayPort cable you'll need. For example, a 4K 30Hz HDR10 signal at 4:4:4 subsampling requires the same bandwidth as a 4K 60Hz HDR10 signal at 4:2:0 subsampling; in that case, you're trading refresh rate for color. At the extreme end, with 4K 60Hz and HDR10 or Dolby Vision, you'll need one of the more advanced HDMI or DisplayPort cables, though the latest HDMI 2.1 and DisplayPort 2.0/2.1 cables can handle it all with ease.
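
As a back-of-the-envelope illustration, you can estimate the raw data rate of a signal from its resolution, refresh rate, bit depth, and chroma subsampling. The Python sketch below counts active pixels only and ignores blanking intervals and link-encoding overhead, so real cable requirements run somewhat higher; the numbers are illustrative rather than official certification figures.

# Rough uncompressed data rate for a video signal (active picture only;
# real links add blanking intervals and encoding overhead on top).
SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def data_rate_gbps(width, height, refresh_hz, bits_per_sample, subsampling):
    """Raw data rate in Gbps for the active picture area."""
    samples_per_second = width * height * refresh_hz * SAMPLES_PER_PIXEL[subsampling]
    return samples_per_second * bits_per_sample / 1e9

# 4K 30Hz 10-bit at 4:4:4 vs. 4K 60Hz 10-bit at 4:2:0: the same raw rate.
print(f"4K 30Hz 4:4:4: {data_rate_gbps(3840, 2160, 30, 10, '4:4:4'):.1f} Gbps")
print(f"4K 60Hz 4:2:0: {data_rate_gbps(3840, 2160, 60, 10, '4:2:0'):.1f} Gbps")
print(f"4K 60Hz 4:4:4: {data_rate_gbps(3840, 2160, 60, 10, '4:4:4'):.1f} Gbps")

Both of the first two combinations land around 7.5 Gbps of raw picture data, which is why trading refresh rate against chroma subsampling can keep a signal within an older cable's limits.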

To watch or play HDR content, you need several key pieces of supporting equipment. First and foremost, an HDR display is a necessity, as it's what makes it possible to see the benefits of HDR technology. You also need an HDR-supporting source, such as a UHD Blu-ray player, streaming service, or games console. Bear in mind, however, that the PS5 only supports HDR10, while the Xbox Series X and S support both HDR10 and Dolby Vision.

Home PCs with recent-generation graphics cards and an HDR monitor can also take advantage. Support has improved over the years, particularly with Windows 11, which introduced automated HDR features (Auto HDR) for certain games.

But no matter your hardware setup, you also need to get the HDR signal from the source to the display. That's where high-quality cables from companies like Cable Matters can really make a difference: the extra bandwidth required to carry the HDR data is demanding, and any small issue in cable quality may cost you the HDR image.

To run HDR content at 4K 60Hz, you need an HDMI 2.1, DisplayPort 1.4, or DisplayPort 2.1 cable. These not only support the latest HDR technologies but also higher refresh rates and 8K resolution.
