Last Updated February 13, 2019
An increasing number of manufacturers are introducing HDR-enabled TVs and projectors. These devices display images with higher brightness, more vibrant and lifelike colors, and superior contrast compared to SDR TVs and projectors.
JMGO N7L 1080p portable smart projector with HDR10 decode
HDR (high dynamic range) and SDR (standard dynamic range) describe a device's ability to reproduce rich images and video. An image can be characterized by its brightness, color levels, and black levels, among other things. SDR devices support a narrower range of these characteristics than HDR devices.
An HDR device can produce higher levels of white and deeper levels of black than an SDR device. HDR devices also support a wider color gamut, i.e. they can reproduce more shades of color than SDR devices. A viewer may be able to distinguish between a dark shade of grey and black on an HDR device, where an SDR device would show them as the same. Simply put, HDR devices display richer and more realistic images than SDR devices.
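The "more shades of color" claim comes down to arithmetic on color depth, which is discussed for each standard below (10-bit for HDR10, 12-bit for Dolby Vision). A quick sketch of how many shades each bit depth yields per channel, and how many total colors across red, green, and blue:

```python
# Distinct shades per channel (2^bits) and total displayable colors
# (shades^3 across R, G, B) for common bit depths:
# 8-bit SDR, 10-bit HDR10/HDR10+, 12-bit Dolby Vision.
for bits in (8, 10, 12):
    shades = 2 ** bits    # levels per color channel
    colors = shades ** 3  # combinations across the three channels
    print(f"{bits}-bit: {shades:>5} shades/channel, {colors:>14,} colors")
```

Running this shows the jump from about 16.7 million colors at 8 bits to over a billion at 10 bits, which is why banding in gradients (sky, sunsets) is far less visible on HDR displays.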
There are several HDR standards in existence, each backed by its own set of promoters. These include HDR10, Dolby Vision, and the recently introduced Hybrid Log-Gamma (HLG). Each of these standards has specific features and characteristics.
HDR10
HDR10 is an open HDR standard overseen by the Consumer Technology Association (CTA), a body comprising over two thousand technology companies. It is also known as the HDR10 Media Profile and was introduced in 2015. The standard defines the minimum requirements a device must meet to qualify as HDR10 capable. HDR10 devices are required to support the Rec. 2020 color space and a color depth of 10 bits. They must also be capable of transmitting and decoding “Mastering Display Color Volume” static metadata that defines the characteristics of the whole video.
HDR10 devices can theoretically support a maximum brightness of up to 4,000 nits, although most models on the market are rated at 1,000 nits. As it is an open standard, any manufacturer can use it without paying a licensing fee. A wide range of display devices, including TVs and projectors, as well as playback devices from renowned manufacturers, support HDR10. HDR10 is not backward compatible with SDR.
Dolby Vision HDR
Dolby Vision is a proprietary standard owned and developed by Dolby Laboratories. Any manufacturer that wants to incorporate the Dolby Vision standard into its TVs requires approval from Dolby and must pay a licensing fee or royalty on a per-device basis.
LG's Dolby Vision TV
Dolby Vision devices are required to support 12-bit color depth and conform to the Rec. 2020 color space. These devices can have a peak brightness of up to 10,000 nits, though most devices available today peak at around 4,000 nits.
One of the biggest advantages of the Dolby Vision standard is its dynamic metadata capability. It gives content creators greater control over their content by letting them define information for each shot or frame in a video. They can adjust the colors, lighting, and other aspects of each shot individually.
HDR10, on the other hand, only allows static metadata, wherein content creators define the colors, black levels, and other aspects for an entire video rather than on a per-shot basis. Dolby Vision is backward compatible with HDR10 and also supports content created with static metadata for that standard.
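The static vs dynamic distinction can be pictured as one metadata record for the whole video versus one record per scene. The sketch below is purely illustrative: the field names are simplified stand-ins, not the actual SMPTE ST 2086 or Dolby Vision bitstream syntax.

```python
from dataclasses import dataclass

@dataclass
class StaticMetadata:           # HDR10-style: one record for the whole video
    max_luminance_nits: float   # mastering display peak brightness
    min_luminance_nits: float   # mastering display black level
    max_content_light: float    # brightest pixel anywhere in the video

@dataclass
class SceneMetadata:            # HDR10+/Dolby Vision-style: one record per scene
    scene_index: int
    peak_luminance_nits: float  # how bright this particular scene gets

# With static metadata, the TV tone-maps every scene against the same
# worst-case values for the entire video.
static = StaticMetadata(4000.0, 0.005, 1800.0)

# With dynamic metadata, a dim scene carries its own values and is not
# tone-mapped as if it were the brightest scene in the film.
dynamic = [SceneMetadata(0, 120.0), SceneMetadata(1, 1800.0)]
```

This is why dynamic metadata matters most on displays dimmer than the mastering monitor: per-scene values let the TV compress highlights only where a scene actually needs it.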
HDR10+
HDR10+ (or HDR10 Plus) is a revision of the HDR10 standard developed jointly by a few prominent display manufacturers in partnership with movie studios, together known as the HDR10+ Alliance. Introduced in 2017, it aims to overcome the static metadata limitation of the HDR10 standard. HDR10+ supports dynamic metadata, allowing content creators to define the appearance of each scene or frame of a movie or video individually, just like the Dolby Vision standard.
Samsung HDR10+ TV
HDR10+ capable devices have a color depth of 10 bits with a peak brightness of up to 4,000 nits. Like HDR10, HDR10+ is an open standard, and manufacturers don't have to pay a royalty fee for either of them. Since its introduction, many companies have announced support for the standard.
Hybrid Log-Gamma HDR
Hybrid Log-Gamma (HLG) is yet another HDR standard introduced in recent years. The primary difference is that while standards like HDR10 and Dolby Vision are used for streaming and playback, Hybrid Log-Gamma focuses on broadcasting, i.e. it was created as a solution for high-quality cable TV, satellite TV, and live TV.
These services require a separate standard because they are limited by bandwidth, and the use of metadata can be tricky for live coverage. Another factor is that quite a few users still have legacy equipment, so a solution was needed that could work with both older and newer TVs and projectors used for broadcasting. Hence, HLG is designed to be backward compatible with SDR displays. The requirements for HLG capability include a 10-bit color depth and a nominal peak luminance of 1,000 nits, but it will also work with TVs and display devices that have a lower brightness rating.
HLG can be considered a complementary standard to HDR10 and Dolby Vision. While streaming and recorded content is displayed in either HDR10 or Dolby Vision, HLG is focused on broadcasting. This means you get to see not only movies and online content in high quality but also sports and live TV. Multiple video services support Hybrid Log-Gamma, including the BBC and DirecTV, and several manufacturers have released firmware updates to add HLG support to their display devices.
Choosing an HDR Display
While the different HDR technologies can be a bit confusing for the average consumer, choosing a TV, projector, or other display device is relatively easy, because in many cases manufacturers can add support for additional HDR formats with a firmware update.
Most TVs will either support HDR10/HDR10+ or Dolby Vision depending on the technology their manufacturers are promoting. For example, Samsung is a proponent of the HDR10+ technology while LG displays feature the Dolby Vision technology. However, some manufacturers like Panasonic have introduced models that support both Dolby Vision and HDR10+.
Panasonic GZ2000 With HDR10+ and Dolby Vision Support
Dolby Vision is the most popular standard among content creators, as it offers greater control with its dynamic metadata and 12-bit color depth, though this may change with the introduction of HDR10+. The amount of HDR10+ content has been increasing gradually since the standard's implementation.
In return for the licensing fee, Dolby also ensures that every TV or projector model it licenses displays content exactly as intended. HDR10/HDR10+ TVs have to be calibrated and optimized by their manufacturers, so a model from one manufacturer may show an image with slight variations compared to a similarly specified model from another manufacturer.
HLG is a broadcasting standard, and many TV manufacturers are adding support for the standard via firmware updates. If a TV supports either HDR10 or Dolby Vision but is not yet compatible with HLG, its manufacturer may add support for it via a firmware update later.
Consumers should also consider which standards their playback devices and consoles support. Some may support HDR10 while others support Dolby Vision. A buyer can check the compatibility of their playback devices when buying a TV or projector.
The final choice between the three will depend on the viewer. Some viewers may not notice significant differences between HDR10 and Dolby Vision, while others may prefer one over the other. Buying a TV doesn't mean the viewer will be limited to one of these formats; consumers can consider a TV or projector that supports all three HDR standards for a more comprehensive viewing experience.