Has UHD been slow to roll out? Not so, we hear in this talk, which explains the work done to date by the BBC and associated organisations such as the EBU in standardising, testing and broadcasting UHD.
Simon Thompson from BBC R&D points out that HD took decades to move from an IBC demo to an on-air service, whereas UHD channels surfaced only two years after the first IBC demonstration of UHD video. UHD has had a number of updates since its initial, resolution-focused definition, which created UHD-1 (2160 lines high) and UHD-2, often called 8K. Later, HDR with Wide Colour Gamut (WCG) was added, allowing the image to much better replicate the brightnesses the eye is used to and almost all naturally-occurring colours; it turns out that HD TV (using Rec. 709 colour) cannot reproduce many colours commonly seen at football matches.
In fact, the design brief for HDR UHD was specifically to keep images looking natural, which allows better control over the artistic effect. In terms of HDR, the aim was to offer a greater range than the human eye can see in any one adaptation state. The human eye can see an incredible range of brightnesses, but it does this by adapting to different brightness levels, for instance by changing the pupil size. In a fixed state, the eye can only access a subset of its sensitivity without further adapting. The aim of HDR is to let the ambient brightness settle the eye into one adaptation state, then allow the TV to show any brightness the eye can perceive in that state.
Simon explains the two HDR formats: Dolby's PQ, widely adopted by the film industry, and the Hybrid Log-Gamma format, usually favoured by broadcasters who show live programming. PQ, we hear from Simon, covers the whole range of the human visual system, meaning that any PQ stream can describe images from 1 up to 10,000 nits. To make this work properly, the mapping to a given display needs to know the average brightness level of the video, which is not available until the end of the recording. PQ therefore requires sending metadata, and the result depends on the ambient light level in the room.
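As a concrete illustration of what the PQ curve defines, here is a minimal Python sketch of the SMPTE ST 2084 EOTF, which maps a 0..1 PQ signal value to an absolute luminance in nits. The constants are those published in the standard; the example values at the end are purely illustrative and not from the talk.

```python
import numpy as np

# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384            # 0.1593017578125
M2 = 2523 / 4096 * 128       # 78.84375
C1 = 3424 / 4096             # 0.8359375
C2 = 2413 / 4096 * 32        # 18.8515625
C3 = 2392 / 4096 * 32        # 18.6875

def pq_eotf(signal):
    """Map a non-linear PQ signal (0..1) to display luminance in nits.

    One fixed curve spans the full range up to 10,000 nits for all content,
    which is why PQ workflows carry metadata telling less capable displays
    how to tone-map.
    """
    n = np.asarray(signal, dtype=float)
    p = np.power(n, 1 / M2)
    y = np.power(np.maximum(p - C1, 0.0) / (C2 - C3 * p), 1 / M1)
    return 10000.0 * y  # absolute luminance in cd/m^2 (nits)

# A PQ signal of ~0.58 decodes to roughly 200 nits, close to the
# 203-nit HDR reference white recommended in ITU-R BT.2408.
print(pq_eotf(0.58))
print(pq_eotf(1.0))   # 10,000 nits: the top of the PQ range
```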
Hybrid Log-Gamma, by contrast, works on the fly. It doesn't attempt to cover the whole range of the human visual system and needs no metadata, which lends itself well to delivering HDR for live productions. To learn more about the details of PQ and HLG, check out this video.
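For comparison, here is a similar sketch of the HLG OETF from ITU-R BT.2100, again using the standard's published constants. The curve itself carries the HDR information, so no per-programme metadata is needed.

```python
import numpy as np

# ITU-R BT.2100 HLG OETF constants
A = 0.17883277
B = 1 - 4 * A                 # 0.28466892
C = 0.5 - A * np.log(4 * A)   # 0.55991073

def hlg_oetf(scene_light):
    """Map normalised scene-linear light (0..1) to an HLG signal (0..1).

    Below 1/12 the curve is a square-root gamma, much like SDR; above it,
    a logarithmic segment compresses the highlights. The display's own
    system gamma then adapts the picture to its peak brightness.
    """
    e = np.asarray(scene_light, dtype=float)
    # Guard the log so the unused branch never sees a negative argument
    log_part = A * np.log(np.maximum(12 * e - B, 1e-12)) + C
    return np.where(e <= 1 / 12, np.sqrt(3 * e), log_part)

# The two segments meet at 1/12 scene light, which encodes to 0.5 signal.
print(hlg_oetf(1 / 12))   # 0.5
```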
Simon outlines the extensive testing and productions done in UHD and looks at the workflows possible. The trick has been finding the best way to produce both an SDR and an HDR output at the same time. In the latest workflow Simon highlights, all 70 cameras were racked in HDR by operators watching the SDR down-mix. The aim here is to ensure that the SDR version looks perfect, as it still serves over 90% of the viewership. Ultimately, though, the goal is a 100% HDR production with the SDR derived from it without any active monitoring, in the spirit of the sketch below. The video ends with a look at the challenges yet to be overcome in UHD and HDR production.
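To give a flavour of what deriving SDR from an HDR production involves, here is a deliberately simplified, hypothetical down-mapping from an HLG signal to SDR. It is only a toy: real broadcast converters (see the guidance in ITU-R BT.2408) roll off highlights and handle colour far more carefully than this hard clip.

```python
import numpy as np

# HLG constants as in the OETF sketch above (ITU-R BT.2100)
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * np.log(4 * A)

def hlg_inverse_oetf(signal):
    """HLG signal (0..1) back to normalised scene-linear light (0..1)."""
    s = np.asarray(signal, dtype=float)
    return np.where(s <= 0.5,
                    (s ** 2) / 3.0,
                    (np.exp((s - C) / A) + B) / 12.0)

# HLG reference (diffuse) white sits at 75% signal (ITU-R BT.2408)
REF_WHITE = hlg_inverse_oetf(0.75)   # ~0.265 in scene-linear terms

def toy_sdr_downmix(hlg_signal):
    """Toy SDR derivation: scale so HLG reference white lands on SDR peak
    white, hard-clip the specular highlights above it, then re-encode with
    a simple 1/2.4 display gamma.
    """
    scene = hlg_inverse_oetf(hlg_signal)
    sdr_linear = np.clip(scene / REF_WHITE, 0.0, 1.0)
    return np.power(sdr_linear, 1 / 2.4)

# Reference white (75% HLG) fills the SDR range; brighter highlights clip.
print(toy_sdr_downmix(0.75))   # 1.0
print(toy_sdr_downmix(1.0))    # 1.0 (highlight detail lost to the clip)
```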
Watch now!
Speaker
Simon Thompson, Senior R&D Engineer, BBC R&D