Video: HDR Formats and Trends

As HDR continues its slow march into mainstream use, its different forms, both in broadcast and streaming, can be hard to keep track of and even to differentiate. This talk from the Seattle Video Tech meetup aims to tease out these details. Whilst HDR has long been held up as a perfect example of ‘better pixels’, and many have said they would prefer to deploy HD video plus HDR rather than introduce UHD and HDR at the same time, few have followed through.

Brian Alvarez from Amazon Prime Video starts with a very brief look at how HDR has been created to sit on top of the existing distribution formats: HLS, DASH, HEVC, VP9, AV1, ATSC 3.0 and DVB. In each case, the HDR signal itself is based on either HLG or PQ.

Brian takes some time to discuss the differences between the two approaches to HDR. First off, he looks at HLG, an ARIB standard which is freely available, though still subject to licensing. This standard is, technically, backwards compatible with SDR but, most importantly, doesn’t require metadata, which is a big benefit in the live environment and simplifies broadcast. PQ is next, and we hear how its approach differs from HLG’s, with the suggestion that it gives better visual performance. In the PQ ecosystem, Brian works through the many standards explaining how they differ, and we see that the main differences are in colour space and bit-depth.
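To make the PQ side of this concrete, here is a short Python sketch of the PQ EOTF as published in SMPTE ST 2084 / ITU-R BT.2100, which maps a non-linear code value to absolute luminance. This is an illustration to accompany the talk, not something taken from Brian’s slides:

```python
# PQ (SMPTE ST 2084) EOTF constants, as published in the standard
M1 = 2610 / 16384        # ≈ 0.1593
M2 = 2523 / 4096 * 128   # ≈ 78.8438
C1 = 3424 / 4096         # ≈ 0.8359
C2 = 2413 / 4096 * 32    # ≈ 18.8516
C3 = 2392 / 4096 * 32    # ≈ 18.6875


def pq_eotf(signal: float) -> float:
    """Map a non-linear PQ signal in [0, 1] to absolute luminance in cd/m² (nits)."""
    e = signal ** (1 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)
    return 10000.0 * y  # PQ is defined with a 10,000-nit peak


print(pq_eotf(0.0))  # 0 nits
print(pq_eotf(1.0))  # 10000 nits
```

Note how a 50% signal comes out at only around 92 nits: unlike a relative, display-referred curve such as HLG, PQ allocates most of its code values to darker tones where the eye is most sensitive.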

The next part of the talk looks at the now-famous Venn diagrams (by Yoeri Geutskens) showing which companies and products support each variant of HDR. These allow us to visualise and understand, for instance, the adoption of HDR10 vs HLG, how much broadcast TV is in PQ and HLG, and how the film industry is producing exclusively in PQ. Brian comments and gives context to each of the scenarios as he goes.

Finally, a Q&A session covers displays, end-to-end metadata flow, whether customers can tell the difference, the drive for HDR adoption and monitors for grading HDR.

Watch now! / Download the Slides

Speaker

Brian Alvarez
Principal Product Manager,
Amazon Prime Video

Video: The ST 2094 Standards Suite For Dynamic Metadata

Lars Borg explains the problems the SMPTE ST 2094 suite of standards sets out to solve. Looking at the different types of HDR and Wide Colour Gamut (WCG), we quickly see how many permutations there are and how many ways there are to get things wrong.

ST 2094 carries the metadata needed to manage colour, dynamic range and related data. To explain what’s needed, Lars takes us through the details of the HDR implementations, touching on workflows and explaining how the capabilities of your display affect the video.
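To see why per-scene (dynamic) metadata matters, here is a deliberately simplified Python sketch, not from the talk and not the actual ST 2094 tone-mapping model. A 1,000-nit display showing content mastered at 4,000 nits must compress highlights; with only static, programme-wide metadata it has to assume every scene could reach the mastering peak, whereas per-scene metadata lets a dim scene pass through untouched:

```python
def tone_map(nits: float, scene_peak: float, display_peak: float) -> float:
    """Toy tone mapper: pass through if the scene fits the display,
    otherwise scale linearly (a stand-in for a real roll-off curve)."""
    if scene_peak <= display_peak:
        return nits  # scene fits the display: no compression needed
    return nits * display_peak / scene_peak


MASTER_PEAK = 4000.0   # brightest pixel in the whole programme (static metadata)
DISPLAY_PEAK = 1000.0  # what this TV can actually produce

pixel = 400.0          # a pixel in a dim scene whose own peak is only 500 nits

with_static = tone_map(pixel, MASTER_PEAK, DISPLAY_PEAK)  # darkened to 100 nits
with_dynamic = tone_map(pixel, 500.0, DISPLAY_PEAK)       # rendered as-is: 400 nits
```

Real ST 2094 applications carry per-scene or per-frame parameters far richer than a single peak value, but the principle is the same: the closer the metadata tracks the content, the less the display has to guess.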

We then look at midtones and dynamic metadata before a Q&A.

This talk is as valuable for understanding the whole HDR and WCG ecosystem as it is for understanding ST 2094 itself.

Watch now!

Speaker

Lars Borg
Principal Scientist,
Adobe