HDR and Dolby Vision: all you need to know about these display technologies

Less publicized than UHD-4K, HDR (High Dynamic Range) and Dolby Vision are just as revolutionary for our TVs, and probably more spectacular to the eye than the transition from HD to Ultra High Definition.

Why is this? Because they make it possible to display an image with greater differences in brightness between its darkest and lightest parts than with standard TVs (SDR = Standard Dynamic Range). And because the human eye is extremely sensitive to variations in light intensity, the result in the picture is immediately perceptible: greater intra-image contrast, more gradation in the brightness scale, therefore also in the colours, and more visible detail in both the brightest and darkest parts of the picture.
By providing images with higher peak brightness and greater subtlety in contrast, UHD-4K Blu-ray players, UHD-4K TVs, and HDR and Dolby Vision compatible projectors are approaching the perceptual potential of the human eye. Offering a wider color space, HDR 10 and Dolby Vision standards enable images to be displayed with unparalleled vividness and realism thanks to greater brightness and a wider range of colour tones.

HDR = High Dynamic Range

Photographers are certainly familiar with this term: the dynamic range of an image is the range measured between its lightest and darkest parts. The wider the dynamic range, the more contrast and legibility there is in both dark and light areas of the image. Blacks and whites are more nuanced, and colors are richer and more subtle.
The introduction of a wide dynamic range in video makes it possible to distinguish many details from the darkest to the brightest areas of the image. This requires, however, that all devices used for image capture (cameras), post-production (editing, digital effects…) and broadcasting (broadcast services, televisions…) be compatible.

The legacy of the cathode ray tube…


It is the capabilities of cathode ray tube (CRT) displays, with their limitations, that were initially used as a standard for the picture quality of colour television. The resolution, dynamic range, color gamut and even the display frequency of filmed and broadcast video images were intimately linked to this now obsolete technology.
The arrival of high definition (HD 720p, then HD 1080p) raised image resolution to 1920 x 1080 pixels and improved colorimetry with the adoption of the Rec.709 colour space (8-bit colour coding, i.e. 256 values per channel, covering approximately 35% of the spectrum visible to the human eye), paving the way for the Deep Color and x.v.Colour technologies implemented on HD and Full HD LCD and plasma TVs (HDTV and HDTV 1080p).
Ultra High Definition (UHD or 4K) then went further, quadrupling the resolution (UHD = 3840 x 2160 pixels) and adopting an even wider colour space (Rec.2020, 10-bit colour coding, i.e. 1024 values per channel) covering 75% of the spectrum visible to the human eye.
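The arithmetic behind these figures is easy to verify; a quick sketch in Python:

```python
# Sanity-checking the figures above: per-channel values and pixel counts.
sdr_values = 2 ** 8          # Rec.709: 8-bit coding  -> 256 values per channel
uhd_values = 2 ** 10         # Rec.2020: 10-bit coding -> 1024 values per channel
full_hd_pixels = 1920 * 1080  # Full HD frame
uhd_pixels = 3840 * 2160      # UHD frame

print(sdr_values, uhd_values)        # 256 1024
print(uhd_pixels // full_hd_pixels)  # 4 -> UHD quadruples the pixel count
```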
However, this meant breaking away from the 100 candelas/m² ceiling that has defined the peak brightness of television images all these years, a limit inherited from the physical capabilities of cathode ray tubes, whereas the human eye can adapt to brightness levels ranging from 0.0001 to 10,000 candelas/m²! (The candela per square metre, or nit, is the unit of luminance, i.e. the amount of light emitted by a surface per unit area.)
UHD does exactly that with the SMPTE ST 2084 EOTF standard, which defines a wider dynamic range for Ultra HD Premium (HDR 10 compatible) TVs based on the actual contrast sensitivity of the human eye, allowing a luminance range from 0 to 10,000 candelas/m² with a minimum of 10-bit luminance sampling.
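The ST 2084 transfer function (also called PQ, for Perceptual Quantizer) maps a normalized signal value to an absolute luminance between 0 and 10,000 cd/m². A minimal sketch of the published formula:

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: map a normalized signal in [0, 1]
    to an absolute luminance in cd/m² (0 to 10,000)."""
    # Constants as published in the ST 2084 specification
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32

    e = signal ** (1 / m2)
    y = (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)
    return 10000.0 * y  # absolute luminance in cd/m²

print(pq_eotf(0.0))  # 0.0     -> absolute black
print(pq_eotf(1.0))  # 10000.0 -> the 10,000 cd/m² ceiling
```

Unlike the relative gamma curve of SDR, each code value corresponds to a fixed, absolute luminance, which is what lets HDR 10 content carry mastering intent from the studio to the living room.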

Different standards?

Ultra HD Premium


This is the highest official certification for a UHD TV. It is issued by the Ultra HD Alliance to Ultra High Definition televisions offering at least the following features:

  • Display resolution: 3840 x 2160 pixels.
  • Colour depth: 10 bits.
  • Colour space: Rec.2020 compatible HDMI input and the ability to display 90% or more of the DCI-P3 colour space (colour space used for digital cinema, covering approximately 85% of the spectrum visible to the human eye).
  • HDR compatibility: Maximum brightness of at least 1000 cd/m² with a black level of 0.05 cd/m² or less for LED TVs, or maximum brightness of at least 540 cd/m² with a black level of 0.0005 cd/m² or less for OLED TVs (SMPTE ST 2084 EOTF standard). The chosen standard is the open standard known as HDR 10.
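These brightness and black-level floors translate directly into minimum contrast ratios, which a couple of lines of Python make explicit:

```python
# Minimum contrast ratios implied by the Ultra HD Premium thresholds above
led_contrast = 1000 / 0.05     # LED: peak brightness / black level
oled_contrast = 540 / 0.0005   # OLED: lower peak, but far deeper blacks

print(f"LED:  {led_contrast:,.0f}:1")   # 20,000:1
print(f"OLED: {oled_contrast:,.0f}:1")  # 1,080,000:1
```

This is why the label accepts two different brightness floors: OLED's near-perfect blacks buy it a far higher contrast ratio despite a lower peak brightness.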


HDR10

It is an open standard, called HDR 10 because it uses a 10-bit colour quantization scale. Any TV manufacturer can implement it as they wish (it is sometimes referred to as “HDR 1000” by some LED LCD TV manufacturers).
It is the HDR standard chosen by the UHD Alliance for the Ultra HD Premium label and for the calibration of films marketed on UHD Blu-ray media. It competes with Dolby Vision, a proprietary HDR standard developed by Dolby Laboratories, which uses 12-bit quantization.

HDR10+


This HDR standard was developed by Samsung to compete with Dolby Vision. Like Dolby Vision, HDR10+ uses dynamic metadata embedded in the video stream so that a compatible display device (TV, video projector) can optimize its rendering scene by scene. This open standard is supported by 20th Century Fox, Panasonic Corporation and Samsung, among others.

Dolby Vision

Dolby Vision is an HDR technology developed by Dolby Laboratories that goes beyond the HDR 10 standard. It offers 12-bit quantization and, above all, a true workflow (the process of implementing the standard) with rigorous control from master calibration in post-production to viewing, the only guarantee that the image broadcast to the final viewer will truly conform to the original desired by the director.

Unlike the HDR 10 standard, which is open, Dolby Vision is a proprietary standard. This means that Dolby Vision certified hardware must be used for post-production, broadcasting (broadcast, physical media) and display. This is a constraint throughout the entire image production and broadcast chain, but it is what guarantees that the final display, on the TV set or via a compatible video projector, will be absolutely consistent with the director’s vision. The Dolby Vision standard allows
the film’s creator(s) to precisely calibrate the brightness, contrast and colours of each shot and to add metadata to the “calibrated” image, which will ultimately be interpreted by the Dolby Vision-compatible television set to display the image as intended by its creators. The Dolby Vision processor (built into the television or video projector) can even adapt the picture rendering to the specific technical characteristics of the display device in which it is integrated.

Hybrid Log Gamma (HLG)

This is a royalty-free HDR standard developed jointly by the BBC and NHK for broadcasting TV programmes with High Dynamic Range images. The standard has been adopted by the DVB (Digital Video Broadcasting) consortium, the ITU (International Telecommunication Union) and the HDMI Forum (HDMI 2.0b standard).
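HLG's appeal for broadcasters is that its transfer curve is backward-compatible with SDR displays: the lower half behaves like a conventional gamma curve, while highlights are compressed logarithmically. A sketch of the HLG OETF as published in ITU-R BT.2100:

```python
import math

def hlg_oetf(e: float) -> float:
    """Hybrid Log-Gamma OETF (ITU-R BT.2100): map normalized scene
    light e in [0, 1] to a normalized signal in [0, 1]."""
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # square-root segment, close to SDR gamma
    return a * math.log(12 * e - b) + c  # logarithmic segment for highlights

print(round(hlg_oetf(1 / 12), 3))  # 0.5 -> the two segments join here
print(round(hlg_oetf(1.0), 3))     # 1.0 -> full-scale signal
```

Because the curve carries no absolute luminance metadata (unlike PQ), the same HLG signal can be shown on an SDR set with acceptable results, which is exactly what live broadcasting needs.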

HDR, HDR Pro

The mentions HDR and its derivatives HDR Pro, HDR Plus… (not to be confused with the HDR 10 standard) do not constitute an official quality label or a clearly defined standard (HDR simply stands for High Dynamic Range). They indicate that a TV set has a wider dynamic range than a standard (SDR) TV set, which is limited to 100 nits (or 100 candelas/m²). Brands whose TVs do not meet all the criteria for the Ultra HD Premium label, but which still offer a peak brightness greater than 100 nits, therefore use these designations to distinguish themselves from standard TVs.

In their technical data sheets, some manufacturers add the maximum brightness of the television, expressed in nits, in addition to HDR.
