Video: Usage of Video Signaling Code Points Automating UHD & HD Production-to-Distribution Workflows

As complicated as SD-to-HD conversion seemed at the time, it was nothing compared to the plethora of combinations available now. Back then, dealing with BT.601 and BT.709 colour spaces, aspect ratios and even NTSC/PAL conversions kept everyone busy. With multiple frame rates, different HDR formats and wide colour gamut (WCG) being just some of the current options, this talk considers whether it would be better to bring in a ‘house format’ rather than simply declaring your company a ‘ProRes HQ’ house and accepting any content, HDR or SDR, in ProRes without being more specific about the lifecycle of your videos.

In this talk, Chris Seeger from NBCUniversal and Yasser Syed from Comcast discuss their two-year effort to document common workflow video format combinations, talking to companies ranging from content providers to broadcasters to service distributors. The result is a joint ITU-ISO document, now in its second edition, which provides a great resource for building new workflows today.

Yasser makes the point that, in recent years, the volume of scripted content has increased significantly. This can motivate broadcasters to find quicker and more efficient ways of dealing with media in what can be a high-value set of workflows, increasingly fed by a variety of video formats.

Discussing signalling is important because it is what binds workflows together. Multiple sources arrive, each of which needs to be identified correctly and then converted. This video looks at the video codecs and the identifying metadata needed for contribution and distribution, a job best done automatically. All combinations are possible but, to take advantage of the best content, converting everything into a single, HDR-friendly mezzanine format is the way forward.
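To make the signalling idea concrete, here is a minimal Python sketch, not from the talk itself, of how the ITU-T H.273 (CICP) code points carried in a bitstream identify a source's colour system. The code point values are those published in H.273; the classify_source() helper and its labels are our own illustration:

```python
# Common ITU-T H.273 signalling code points (values from the standard).
COLOUR_PRIMARIES = {1: "BT.709", 5: "BT.601 (625)", 6: "BT.601 (525)", 9: "BT.2020/BT.2100"}
TRANSFER_CHARACTERISTICS = {1: "BT.709", 14: "BT.2020 (10-bit)", 16: "PQ (SMPTE ST 2084)", 18: "HLG (ARIB STD-B67)"}
MATRIX_COEFFICIENTS = {1: "BT.709", 5: "BT.601 (625)", 6: "BT.601 (525)", 9: "BT.2020 non-constant luminance"}

def classify_source(primaries: int, transfer: int, matrix: int) -> str:
    """Return a human-readable label for a (primaries, transfer, matrix) triplet."""
    p = COLOUR_PRIMARIES.get(primaries, f"unknown({primaries})")
    t = TRANSFER_CHARACTERISTICS.get(transfer, f"unknown({transfer})")
    m = MATRIX_COEFFICIENTS.get(matrix, f"unknown({matrix})")
    hdr = "HDR" if transfer in (16, 18) else "SDR"
    return f"{hdr}: primaries={p}, transfer={t}, matrix={m}"

# Example: a BT.2100 HLG source signalled in an HEVC VUI carries 9/18/9.
print(classify_source(9, 18, 9))
```

With these three numbers read automatically from each incoming file or stream, a workflow can route content to the correct conversion into the house mezzanine format without an operator inspecting it.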

Watch now!
Speakers

Yasser Syed
Comcast Distinguished Engineer,
Comcast
Chris Seeger
Director, Advanced Content Production Technology,
NBCUniversal, Inc.

Video: UHD and HDR at the BBC – Where Are We Now, and Where Are We Going?

Has UHD been slow to roll out? Not so, we hear in this talk which explains the work to date in standardising, testing and broadcasting in UHD by the BBC and associated organisations such as the EBU.

Simon Thompson from BBC R&D points out that HD took decades to move from an IBC demo to an on-air service, whereas UHD channels surfaced only two years after the first IBC demonstration of UHD video. UHD has had a number of updates from its initial, resolution-focused definition, which created UHD-1 (2160 lines high) and UHD-2, often called 8K. Later, HDR with Wide Colour Gamut (WCG) was added, which allows the image to much better replicate the brightnesses the eye is used to and almost all naturally-occurring colours; it turns out that HD TV (using Rec.709 colour) cannot reproduce many colours commonly seen at football matches.
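As a rough illustration of that gamut point, the sketch below (our own, assuming numpy, not taken from the talk) derives the standard RGB-to-XYZ matrices from the published BT.709 and BT.2020 chromaticities and shows that the pure BT.2020 green primary would need negative, i.e. unrealisable, BT.709 components:

```python
import numpy as np

def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_w):
    """Derive an RGB-to-XYZ matrix from chromaticity coordinates (standard method)."""
    def xyz(xy):
        x, y = xy
        return np.array([x / y, 1.0, (1 - x - y) / y])
    P = np.column_stack([xyz(xy_r), xyz(xy_g), xyz(xy_b)])
    S = np.linalg.solve(P, xyz(xy_w))  # scale each primary so white sums correctly
    return P * S

# Chromaticities as published in BT.709 and BT.2020, both with D65 white.
M709  = rgb_to_xyz_matrix((0.64, 0.33), (0.30, 0.60), (0.15, 0.06), (0.3127, 0.3290))
M2020 = rgb_to_xyz_matrix((0.708, 0.292), (0.170, 0.797), (0.131, 0.046), (0.3127, 0.3290))

# The pure BT.2020 green primary, expressed as linear BT.2020 RGB.
green_2020 = np.array([0.0, 1.0, 0.0])
green_709 = np.linalg.solve(M709, M2020 @ green_2020)  # convert via XYZ
print(green_709)            # negative components => outside the BT.709 gamut
print((green_709 < 0).any())  # True
```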

In fact, the design brief for HDR UHD was specifically to keep images looking natural while allowing better control over the artistic effect. In terms of HDR, the aim was to offer a greater range than the human eye can access in any one adaptation state. The human eye can see an incredible range of brightnesses, but it does this by adapting to different brightness levels, for instance by changing the pupil size. In a fixed state, the eye can only access a subset of that sensitivity without further adapting. The aim of HDR is to let the ambient brightness hold the eye in one adaptation state, then allow the TV to show any brightness the eye can perceive in that state.

Simon explains the two HDR formats: Dolby’s PQ, widely adopted by the film industry, and the Hybrid Log-Gamma format, usually favoured by broadcasters who show live programming. PQ, we hear from Simon, covers the whole range of the human visual system, meaning that any PQ stream can describe images from 0.0001 up to 10,000 nits. To make this work properly, the display needs to know the average brightness level of the video, which is not available until the end of the recording. It also requires metadata to be sent and is dependent on the ambient light levels in the viewing room.
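For reference, the PQ transfer function itself is compact enough to sketch. The constants below are those published in SMPTE ST 2084; the snippet is our illustration rather than anything shown in the talk:

```python
# PQ (SMPTE ST 2084) EOTF: non-linear signal in [0, 1] -> absolute luminance in nits.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(e: float) -> float:
    """Decode a normalised PQ code value to display luminance (cd/m^2)."""
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(1.0))   # 10000.0 nits: the very top of the PQ range
print(pq_eotf(0.58))  # ~203 nits, roughly the BT.2408 HDR reference white
```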

Hybrid Log-Gamma, by contrast, works on the fly. It doesn’t attempt to cover the whole range of the human visual system and needs no metadata. This lends itself well to delivering HDR for live productions. To learn more about the details of PQ and HLG, check out this video.
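The HLG curve is similarly compact: a square-root (camera-style gamma) segment at the bottom and a logarithmic segment for highlights, hence the name. The constants are the published ARIB STD-B67/BT.2100 values; again, the snippet is our own illustration:

```python
import math

# HLG OETF (ARIB STD-B67 / BT.2100): scene linear light in [0, 1] -> signal value.
a = 0.17883277
b = 1 - 4 * a
c = 0.5 - a * math.log(4 * a)

def hlg_oetf(e: float) -> float:
    """Encode normalised scene linear light to an HLG signal value."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)      # gamma segment, backwards-compatible with SDR
    return a * math.log(12 * e - b) + c  # log segment for highlights

print(hlg_oetf(1 / 12))  # 0.5: crossover between the gamma and log segments
print(hlg_oetf(1.0))     # ~1.0 (to within floating point)
```

Because the curve is fixed and scene-referred, an HLG signal needs no per-programme metadata, which is exactly why it suits live production.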

Simon outlines the extensive testing and production work done in UHD and looks at the workflows possible. The trick has been finding the best way to produce both an SDR and an HDR output from the same production. In the latest version Simon highlights, all 70 cameras were racked in HDR by operators looking at the SDR down-mix. The aim is to ensure that the SDR version looks perfect, as it still serves over 90% of the viewership. Ultimately, though, the goal is a 100% HDR production with SDR derived from it without any active monitoring. The video ends with a look at the challenges yet to be overcome in UHD and HDR production.

Watch now!
Speaker

Simon Thompson
Senior R&D Engineer
BBC R&D

Video: 5 Myths About Dolby Vision & HDR debunked

There seems to be no let-up in the number of technologies coming to market and, whilst some, like HDR, have been slowly advancing on us for many years, the technologies that enable them, such as Dolby Vision, HDR10+ and the metadata-handling technologies further upstream, are more recent. So it’s no surprise that there is some confusion over what’s possible and what’s not.

In this video, Bitmovin and Dolby reveal the truth behind 5 myths surrounding the implementation and financial impact of Dolby Vision and HDR in general. Bitmovin’s Sean McCarthy sets the scene with an overview of their research into the market. He explains why quality remains important: simply put, to either keep up with competitors or be a differentiator. Sean then gives an overview of the ‘better pixels’ principle, underlining that improving the pixels themselves, with technologies such as wide colour gamut (WCG) and HDR, is often more effective than higher resolution.

David Brooks then explains why HDR looks better, covering the biology and psychology behind the effect as well as the technology itself. The trick with HDR is that there are no extra brightness values for the pixels; rather, the brightness of each pixel is mapped onto a larger range. This mapping is the strength of the technology: altering it gives different results, ultimately allowing you to run SDR and HDR workflows in parallel. David also explains how HDR can be mapped down to low-brightness displays.
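As a toy illustration of that idea, emphatically not Dolby Vision’s actual algorithm, the sketch below maps content mastered for a 1,000-nit display onto a 400-nit one, passing mid-tones through unchanged and rolling off the highlights. All parameter names and values here are invented for the example:

```python
def map_to_display(l: float, src_peak: float = 1000.0, dst_peak: float = 400.0,
                   knee: float = 0.75) -> float:
    """Map a mastered luminance (nits) to a dimmer display's luminance (nits).

    Below `knee * dst_peak` the image passes through unchanged; above it,
    highlights are compressed so that src_peak lands exactly on dst_peak.
    """
    k = knee * dst_peak
    if l <= k:
        return l
    # Squeeze the range [k, src_peak] into [k, dst_peak] with a soft roll-off.
    t = (l - k) / (src_peak - k)
    return k + (dst_peak - k) * (t * (2 - t))  # simple ease-out curve

for nits in (100, 300, 600, 1000):
    print(nits, "->", round(map_to_display(nits), 1))
```

The pixels’ code values never change; only the mapping does, which is why the same master can serve displays of very different capabilities.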

The last half of this video is dedicated to the myths. Each myth gets several slides of explanation; for instance, one suggests that the workflows are very complex, so Hagan Last walks through a number of scenarios showing how dual (or even three-way) workflows can be achieved. The other myths, and the questions at the end, cover resolution, licensing cost, metadata, managing dual SDR/HDR assets and live workflows with Dolby Vision.

Watch now!
Speakers

David Brooks
Senior Director, Professional Solutions,
Dolby Laboratories
Hagan Last
Technology Manager, Content Distribution,
Dolby Laboratories
Sean McCarthy
Senior Technical Product Marketing Manager,
Bitmovin
Moderator: Kieran Farr
VP Marketing,
Bitmovin

Video: Hybrid SDI/ST 2110 Workflows

It’s no secret that SDI is still the way to go for some new installations. For all the valid interest in SMPTE’s ST 2110, the cost savings are only realised either on a large scale or when a system needs continuous flexibility (such as an OB truck) or future scalability. Even installations that have gone IP still have some SDI lying around somewhere. Currently, there are few situations with an absolute ‘no SDI’ policy because there are few business cases which can afford it.

Looking at current deployments of broadcast 2110, we see large, often public, broadcasters who are undergoing a tech refresh for a building and can’t justify a massive investment in SDI, or who are aiming to achieve specific savings, such as Discovery’s Eurosport Transformation Project, an inspirational, international project to do remote production for whole buildings. We also have OB trucks, which benefit significantly from reduced cabling, higher-density routing and flexibility. For a more detailed view of 2110 in trucks, watch this video from NEP. In these scenarios, there is nearly always SDI still involved. Some equipment doesn’t yet work fully in 2110, some doesn’t work at all, and while there are IP versions of some products, the freelance community still needs to learn how to use the new products and workflows. If you have a big enough project, you’ll hit the ‘vendor not yet ready’ problem; if you have an OB truck or similar, you are likely to have to deal with the freelance experience issue. Both problems are diminishing, but they are still real and need to be dealt with.

Kevin Salvidge from Leader joins the VSF’s Wes Simpson to share his experience of these mixed SDI/IP workflows, many of which are in OB trucks and so also include mixed HDR workflows. He starts by talking about PTP and GPS, discussing how timing needs to be synced between locations. He then takes a closer look at the job of the camera shaders, who make sure all the cameras have the same colour, exposure etc. Kevin talks about how live production in HDR and SDR works, touching on the problem of ‘immediacy’. Shaders need to swap between cameras quickly and are used to the instant switch that SDI can provide. IP can’t offer quite the same immediacy, so, Kevin says, some providers have added delays to the SDI switches to match the IP switch times within the same truck. This helps set expectations and stops operators pressing two or more times to get a switch made.

Kevin finishes his talk on the topic of synchronising analogue timing signals with PTP. He shows us the different tools you can use to monitor these signals, such as a display of PTP timing against black & burst (B&B) timing, a readout of data from the PTP grandmasters to check the BMCA (Best Master Clock Algorithm) is working correctly, PTP delay time, packet inter-arrival time, path delay and traffic-shaping monitoring. He then closes with a Q&A covering the continued prevalence of SDI, what ‘eye patterns’ mean in the IP world and increasing HDR roll-outs.
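For the curious, the offset and path-delay figures such tools display come straight from the IEEE 1588 sync exchange. Here is a minimal sketch of the arithmetic, with the timestamps in the example invented for illustration:

```python
# From the four timestamps of a PTP sync exchange (IEEE 1588), the mean path
# delay and the slave clock's offset from the master fall out of two equations.

def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """t1: Sync sent by master; t2: Sync received by slave;
    t3: Delay_Req sent by slave; t4: Delay_Req received by master."""
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2
    offset_from_master = ((t2 - t1) - (t4 - t3)) / 2
    return offset_from_master, mean_path_delay

# Example: the slave clock runs 1.5 us ahead and the network adds 10 us each way.
offset, delay = ptp_offset_and_delay(t1=0.0, t2=11.5e-6, t3=20.0e-6, t4=28.5e-6)
print(f"offset: {offset * 1e6:.1f} us, path delay: {delay * 1e6:.1f} us")
```

Note that the arithmetic assumes a symmetric path; asymmetry between the two directions shows up directly as offset error, which is one reason monitoring packet inter-arrival times and path delay matters.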

Watch now!
Speaker

Kevin Salvidge
European Regional Development Manager
Leader Europe Ltd.
Wes Simpson Moderator: Wes Simpson
President, Telcom Product Consulting
Owner, LearnIPVideo.com