Video: HDR Formats and Trends

As HDR continues its slow march into mainstream use, its different forms, both in broadcast and streaming, can be hard to keep track of, let alone differentiate. This talk from the Seattle Video Tech meetup aims to tease out the details. Whilst HDR has long been held up as a perfect example of ‘better pixels’, and many have said they would prefer to deploy HD video plus HDR rather than introduce UHD and HDR at the same time, few have followed through.

Brian Alvarez from Amazon Prime Video starts with a very brief look at how HDR has been created to sit on top of the existing distribution formats: HLS, DASH, HEVC, VP9, AV1, ATSC 3.0 and DVB. In each case it does so in a form based on either HLG or PQ.

Brian takes some time to discuss the differences between the two approaches to HDR. First off, he looks at HLG, an ARIB standard that is freely available, though still subject to licensing. This standard is, technically, backwards compatible with SDR but, most importantly, requires no metadata, which is a big benefit in the live environment and simplifies broadcast. PQ is next, and we hear how its approach differs from HLG’s, with the suggestion that it gives better visual performance. Within the PQ ecosystem, Brian works through the many standards, explaining how they differ; we see that the main differences are in colour space and bit depth.
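The split Brian describes ultimately comes down to the transfer function each system uses. As a minimal sketch (not from the talk), here are the two reference curves with their published constants from SMPTE ST 2084 (PQ) and ITU-R BT.2100 (HLG); the function and variable names are mine:

```python
import math

# SMPTE ST 2084 (PQ) EOTF: non-linear signal (0..1) -> absolute luminance in nits.
# PQ is anchored to absolute brightness, which is why it scales to 10,000 nits.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Map a PQ code value (0..1) to display luminance in cd/m^2 (nits)."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# BT.2100 HLG OETF: scene-linear light (0..1) -> relative signal (0..1).
# HLG is scene-referred and relative, which is why it needs no metadata.
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def hlg_oetf(light: float) -> float:
    """Map normalised scene light (0..1) to an HLG signal value (0..1)."""
    if light <= 1 / 12:
        return math.sqrt(3 * light)     # square-root segment (SDR-compatible)
    return A * math.log(12 * light - B) + C  # logarithmic segment for highlights

print(round(pq_eotf(1.0)))  # full-scale PQ is defined as 10000 nits
print(hlg_oetf(1 / 12))     # the two HLG branches meet at signal 0.5
```

The square-root lower segment is what makes HLG roughly watchable on an SDR display, while PQ’s absolute luminance mapping is where its metadata-driven ecosystem (HDR10, Dolby Vision and friends) comes from.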

The next part of the talk looks at the now-famous Venn diagrams (by Yoeri Geutskens) showing which companies/products support each variant of HDR. This allows us to visualise and understand the adoption of HDR10 vs HLG, for instance, to see how much broadcast TV is in PQ and HLG, to see how the film industry is producing exclusively in PQ, and much more. Brian comments and gives context to each of the scenarios as he goes.

Finally a Q&A session talks about displays, end-to-end metadata flow, whether customers can tell the difference, the drive for HDR adoption and a discussion on monitors for grading HDR.

Watch now! / Download the Slides

Speaker

Brian Alvarez
Principal Product Manager,
Amazon Prime Video

Video: User-Generated HDR is Still Too Hard

HDR and wide colour gamuts are difficult enough in professional settings – how can YouTube get it right with user-generated content?

Steven Robertson from Google explains the difficulties YouTube has faced in dealing with HDR, both in its original productions and in user-generated content (UGC). These difficulties stem from the Dolby PQ way of looking at the world, with fixed brightnesses going all the way up to 10,000 nits, and from the wider colour gamuts (WCG) of Display P3 and BT.2020.
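One concrete way colours go wrong at this stage (my illustration, not Steven’s) is when BT.2020-encoded RGB is displayed as if it were BT.709, or vice versa, without converting between the primaries. A rough sketch of the linear-light conversion, using the published BT.709-to-BT.2020 matrix (see ITU-R BT.2087); the helper name is mine:

```python
# Linear-light RGB conversion from BT.709 primaries to BT.2020 primaries.
# Skipping this step (or applying it twice) is one way UGC ends up with
# washed-out or over-saturated colours.
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rgb709_to_rgb2020(rgb):
    """Convert one linear-light BT.709 RGB triple to BT.2020 coordinates."""
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in M_709_TO_2020)

# Pure BT.709 red sits well inside the BT.2020 gamut, so it picks up
# green and blue components when re-expressed in BT.2020 coordinates.
print(rgb709_to_rgb2020((1.0, 0.0, 0.0)))
```

Note that each matrix row sums to one, so white maps to white; it is saturated colours, not neutrals, that reveal a gamut mix-up, which is part of why such errors slip through.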

Viewing conditions have been a challenge right from the beginning of TV, but ever more so now, with screens of many different shapes and sizes available and very varied abilities to show brightness and colour. Steven spends some time discussing the difficulty of finding a display suitable for colour grading and previewing your work on – particularly for individual users without a large production budget.

Interestingly, we then see that one of the biggest difficulties lies in visual perception, which makes colours seen after bad colours look much better than they are. HDR can deliver colours that are extremely bright and extremely wrong. Steven shows real examples from YouTube where the brain has been tricked into thinking colour and brightness are correct when they clearly are not.

Whilst it’s long been known that HDR and WCG are inextricably linked with human vision, this is a great insight into tackling this at scale and the research that has gone on to bring this under automated control.

Watch now!
Free registration required

This talk is from Streaming Tech Sweden, an annual conference run by Eyevinn Technology. Videos from the event are available to paid attendees but are released free of charge after several months. As with all videos on The Broadcast Knowledge, this is available free of charge after registering on the site.

Speaker

Steven Robertson
Software Engineer, YouTube Player Infrastructure
Google

Video: Deployment of Ultra HD Services Around the Globe

In some parts of the industry UHD is entirely absent. Thierry Fautier is here to shine a light on the progress being made around the globe in deploying UHD.

Thierry starts off by defining terms – important because ‘Ultra HD’ actually hides several, often unmentioned, formats behind the ‘UHD’ label. This also shows how all the different aspects of UHD – which include colour (WCG), HDR, audio (NGA) and frame rate, to name only a few – fit together.

There’s then a look at the stats: where is HDR deployed? How is UHD typically delivered? And the famed HDR Venn diagram showing which TVs support which formats.

As ever, live sports is a major testing ground, so the talk examines lessons learnt from the 2018 World Cup, including a BBC case study. Not unrelated, there is a discussion of the state of UHD streaming, including CMAF.

This leads nicely onto Content-Aware Encoding (CAE), which was also in use at the World Cup.

Watch now!
Free registration required

Speaker

Thierry Fautier
President-Chair, Ultra HD Forum
VP Video Strategy, Harmonic

Video: Intro to 4K Video & HDR

With all the talk of IP, you’d be wrong to think SDI is dead. 12G for 4K is alive and well in many places, so there’s plenty of appetite to understand how it works and how to diagnose problems.

In this double-header, Steve Holmes from Tektronix takes us through the ins and outs of HDR and also SDI for HDR at the SMPTE SF section.

Steve starts with his eye on the SMPTE standards for UHD SDI video, looking at video resolutions and seeing that a UHD picture can be made up of four HD pictures, which gives rise to two well-known formats: ‘quad split’ and ‘2SI’ (2-sample interleave).
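The difference between the two mappings is easy to show in miniature. This sketch (mine, not from the talk) carries a frame as four sub-images both ways: quad split sends each quadrant down its own link, while 2SI sends every other sample in each direction, so each sub-image is a soft full-picture preview:

```python
# Sketch of the two UHD-over-4xHD mappings: a 3840x2160 frame carried as
# four 1920x1080 sub-images. Frames here are plain lists of rows of samples.

def quad_split(frame):
    """Quadrant mapping: each sub-image is one corner of the UHD picture."""
    h, w = len(frame), len(frame[0])
    hh, hw = h // 2, w // 2
    return [
        [row[:hw] for row in frame[:hh]],   # top-left
        [row[hw:] for row in frame[:hh]],   # top-right
        [row[:hw] for row in frame[hh:]],   # bottom-left
        [row[hw:] for row in frame[hh:]],   # bottom-right
    ]

def two_sample_interleave(frame):
    """2SI mapping: each sub-image takes every other sample in x and y,
    so each carries the whole picture at quarter resolution."""
    return [
        [row[dx::2] for row in frame[dy::2]]
        for dy in (0, 1) for dx in (0, 1)
    ]

# Toy 4x4 "frame" of sample values to show the difference.
frame = [[10 * y + x for x in range(4)] for y in range(4)]
print(quad_split(frame)[0])             # [[0, 1], [10, 11]]
print(two_sample_interleave(frame)[0])  # [[0, 2], [20, 22]]
```

This is why losing one link of a quad-split feed blanks a quadrant, whereas losing one link of a 2SI feed only costs resolution – a practical difference when diagnosing 12G/quad-link problems.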

Colour is the next focus, with a discussion of the different colour spaces UHD is delivered in (spoiler: they’re all in use), what these look like on the vectorscope, and a look at the different primaries. Finishing up with a roundup and a look at inter-link timing, there’s a short break before hitting the next topic: HDR.

High Dynamic Range is an important technology which is still gaining adoption and is often provided in 4K programmes. Steve defines the two places HDR is important: in the acquisition and in the display of the video. He then provides a handy lookup table of terms such as HDR, WCG, PQ, HDR10, DMCVT and more.

Steve gives us a primer on what HDR is in terms of brightness in ‘nits’ (candelas per square metre), how these relate to real life and how we talk about them with respect to displays. We then look at HDR on the waveform monitor and at features, such as false colour, which allow engineers to visualise and check HDR.
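At heart, the false-colour feature Steve demonstrates is a lookup from measured luminance to an overlay band. A toy version follows; the thresholds and band names are invented for illustration and are not taken from any real scope:

```python
# Toy false-colour classifier: map measured luminance (nits) to an overlay
# band, the way a waveform monitor highlights exposure ranges. The limits
# and names below are made up for this sketch, not vendor values.
BANDS = [
    (0.01, "below black"),
    (100, "SDR range"),
    (1000, "HDR highlights"),
    (float("inf"), "extreme highlights"),
]

def false_colour_band(nits: float) -> str:
    """Return the name of the first band whose upper limit covers `nits`."""
    for limit, name in BANDS:
        if nits <= limit:
            return name
    return BANDS[-1][1]

print(false_colour_band(50))   # SDR range
print(false_colour_band(400))  # HDR highlights
```

A real monitor does the same classification per pixel and paints the result over the image, which is what makes out-of-range highlights jump out during grading.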

The topic of gamma, EOTFs and colour spaces comes up next and is well explained, building on what came earlier. Before the final demo and Q&A, Steve talks about different ways to grade pictures when working in HDR.

A great intro to the topics at hand – just like Steve’s last one: Uncompressed Video over IP & PTP Timing

Watch now!

Speaker

Steve Holmes
Former Senior Applications Engineer,
Tektronix