The Broadcast Knowledge exists to help individuals up-skill, whatever their starting point. Videos like this, which give an introduction to a large number of topics, are far too rare. For those starting out, or who need to revise a topic, this really hits the mark, particularly as it covers many new topics.
John Mailhot takes the lead on SMPTE ST 2110, explaining that it’s built on separate media (essence) flows. He covers how synchronisation is maintained and gives an overview of the many parts of the SMPTE ST 2110 suite, talking in more detail about the audio and metadata parts of the standard.
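To make the synchronisation idea concrete: in ST 2110 every sender and receiver locks to a common PTP clock, and each flow’s RTP timestamps are derived from that shared clock, which is what lets a receiver re-align video, audio and metadata that arrive as separate streams. A minimal sketch (Python, illustrative only; not from the talk) of deriving the expected RTP timestamp from PTP time:

```python
def rtp_timestamp(ptp_seconds: float, media_clock_hz: int) -> int:
    """Expected 32-bit RTP timestamp for a sampling instant expressed
    as seconds since the PTP epoch, following the ST 2110-10 model."""
    return int(ptp_seconds * media_clock_hz) % 2**32

# Media clocks differ per essence (90 kHz for ST 2110-20 video,
# 48 kHz for ST 2110-30 audio), but both count from the same PTP
# epoch, so timestamps from separate flows can be related in time.
video_ts = rtp_timestamp(100_000.0, 90_000)
audio_ts = rtp_timestamp(100_000.0, 48_000)
```

Because the clock is common to every device, no flow needs to be embedded inside another to stay in sync — the timestamps alone carry the alignment.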
Eric Gsell discusses digital archiving and the considerations that come with deciding which formats to use. He explains colour space, the CIE model and the colour spaces we use, such as 709, 2100 and P3, before turning to file formats. With the advent of HDR video and displays capable of much brighter output, Eric takes some time to explain why this could pose a problem for visual health, as we don’t yet fully understand how displays and the eye interact with this type of material. He finishes by explaining the different ways of measuring the light output of displays and their standardisation.
Yvonne Thomas talks about the cloud, starting by explaining the difference between platform as a service (PaaS), infrastructure as a service (IaaS) and similar cloud terms. As cloud migrations are forecast to grow significantly, Yvonne looks at the drivers behind this and the benefits it can bring when used in the right way. Using the cloud, Yvonne shows, can be an opportunity to improve workflows and to add more feedback and iterative refinement into your products and infrastructure.
Looking at video deployments in the cloud, Yvonne introduces the video codecs AV1 and VVC, both, in their own way, successors to HEVC/H.265, as well as the two transport protocols SRT and RIST, which exist to reliably send video with low latency over lossy networks such as the internet. To learn more about these protocols, check out this popular talk on RIST by Merrick Ackermans and this SRT overview.
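Both SRT and RIST get their reliability from ARQ: the receiver watches sequence numbers and, when it spots a gap, sends a negative acknowledgement (NACK) so the sender retransmits only the missing packets. A toy sketch of the gap-detection step (illustrative of the principle only; it ignores details such as sequence-number wrap-around and NACK pacing that the real protocols handle):

```python
def missing_packets(received_seqs):
    """Return the sequence numbers a receiver would NACK:
    the gaps between the lowest and highest numbers seen so far."""
    seen = set(received_seqs)
    return [s for s in range(min(seen), max(seen) + 1) if s not in seen]

# Packets may arrive out of order; only genuine gaps are reported.
missing_packets([10, 11, 13, 16, 12])  # packets 14 and 15 were lost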
Rounding off the primer is Linda Gedemer from Source Sound VR, who introduces immersive audio, measuring sound output (SPL) from speakers, and the interesting problem of front speakers in cinemas. These have long been placed behind the screen, which has meant the screens have to be perforated to let the sound through — and that perforation interferes with the sound itself. Now that cinema screens are changing to solid screens, not completely dissimilar to large outdoor video displays, the speakers are having to move. But with them out of the line of sight, how can we keep the sound in the right place for the audience?
This video is a great summary of many of the key challenges in the industry and works well for beginners and those who just need to keep up.
In some parts of the industry UHD is entirely absent. Thierry Fautier is here to shine a light on the progress being made around the globe in deploying UHD.
Thierry starts off by defining terms – important because Ultra HD actually hides several, often unmentioned, formats behind the term ‘UHD’. This also shows how all of the different aspects of UHD, which include colour (WCG), HDR, audio (NGA) and frame rate to name only a few, fit together.
There’s then a look at the stats: where is HDR deployed? How is UHD typically delivered? And the famed HDR Venn diagram showing which TVs support which formats.
As ever, live sports is a major testing ground, so the talk examines some lessons learnt from the 2018 World Cup, including a BBC case study. Relatedly, there is a discussion of the state of UHD streaming, including CMAF. This leads nicely onto Content-Aware Encoding (CAE), which was also in use at the World Cup.
With all the talk of IP, you’d be wrong to think SDI is dead. 12G for 4K is alive and well in many places, so there’s plenty of appetite to understand how it works and how to diagnose problems.
In this double-header, Steve Holmes from Tektronix takes us through the ins and outs of HDR and also SDI for HDR at the SMPTE SF section.
Steve starts with his eye on the SMPTE standards for UHD SDI video, looking at video resolutions and showing that a UHD picture can be made up of four HD pictures, which gives rise to two well-known formats: ‘quad split’ and ‘2SI’ (2 Sample Interleave).
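The difference between the two mappings can be sketched in a few lines. This is illustrative Python only — real links carry Y'CbCr sample pairs, and the exact line and sample offsets for 2SI are defined in SMPTE ST 425-5:

```python
def quad_split(frame):
    """Quadrant division: each sub-image is one corner of the full frame."""
    h, w = len(frame), len(frame[0])
    tl = [row[:w // 2] for row in frame[:h // 2]]  # top-left
    tr = [row[w // 2:] for row in frame[:h // 2]]  # top-right
    bl = [row[:w // 2] for row in frame[h // 2:]]  # bottom-left
    br = [row[w // 2:] for row in frame[h // 2:]]  # bottom-right
    return [tl, tr, bl, br]

def two_sample_interleave(frame):
    """2SI: sub-images take alternate *pairs* of samples on alternate
    lines, so each one is a complete quarter-resolution picture."""
    h, w = len(frame), len(frame[0])
    def sub(line_off, pair_off):
        return [[s for i in range(pair_off, w, 4) for s in frame[r][i:i + 2]]
                for r in range(line_off, h, 2)]
    return [sub(0, 0), sub(0, 2), sub(1, 0), sub(1, 2)]
```

The practical consequence: if one link of a quad-split signal fails you lose a whole corner of the picture, whereas with 2SI each link still carries a full (quarter-resolution) image — one reason 2SI is often preferred.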
Colour is the next focus, with a discussion of the different colour spaces UHD is delivered in (spoiler: they’re all in use), what these look like on the vectorscope, and a look at the different primaries. Finishing up with a roundup and a look at inter-link timing, there’s a short break before hitting the next topic: HDR.
High Dynamic Range is an important technology which is still gaining adoption and is often provided in 4K programmes. Steve defines the two places HDR matters — the acquisition and the display of the video — then provides a handy lookup table of terms such as HDR, WCG, PQ, HDR10, DMCVT and more.
Steve gives us a primer on what HDR is in terms of brightness in ‘nits’, how these relate to real life and how we talk about them with respect to displays. We then look at HDR on the waveform monitor and at the features of waveform monitors, such as false colour, which allow engineers to visualise and check HDR.
The topic of gamma, EOTFs and colour spaces comes up next and is well explained, building on what came earlier. Before the final demo and Q&A, Steve talks about different ways to grade pictures when working in HDR.
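To make the EOTF idea concrete, here is the PQ (SMPTE ST 2084) EOTF, which maps a normalised signal value to absolute luminance in nits; the constants come from the standard, but the code is a sketch for illustration, not broadcast-grade:

```python
# PQ (ST 2084) constants, defined as exact rationals in the standard
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_eotf(signal: float) -> float:
    """Map a non-linear PQ code value in [0, 1] to luminance in nits."""
    p = max(signal, 0.0) ** (1 / M2)
    return 10_000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

pq_eotf(1.0)   # peak of the PQ curve: 10,000 nits
pq_eotf(0.58)  # about 200 nits, around HDR 'reference white'
```

Unlike traditional gamma, which is relative to whatever the display’s peak happens to be, PQ is absolute: a given code value always means the same number of nits, which is why grading and monitoring for HDR need this curve.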
Webinar date: Thursday May 30th 2019
Time: 16:00 BST / 11 am EDT / 8 am PDT
Experienced advice is on hand in this webinar for those producing in HDR and UHD. Productions are always trying to raise the quality of acquisition in order to deliver better quality to viewers, to enhance creative possibilities and to maximise financial gain by future-proofing their archives. But this push always brings challenges in production, and the move to UHD and HDR is no different.
HDR and UHD are not synonymous, but they often go hand-in-hand. This is partly because the move to UHD is a move to improve quality, yet time and again we hear that increasing resolution in and of itself is not always an improvement. Rather, the ‘better pixels’ mantra seeks to improve the video using a combination of resolution, frame rate, HDR and Wide Colour Gamut (WCG). So, when it’s possible, HDR and WCG are often combined with UHD.
In this webinar, we hear about the challenges met on the way to success by director and producer Pamela Ann Berry and The Farm Group. Register to hear them share their tips and tricks for better UHD and HDR production.