Video: Broadcast Fundamentals: High Dynamic Range

Update: Unfortunately CVP chose to take down this video within 12 hours of this article going live. But there’s good news if you’re interested in HDR. Firstly, you can find the outline and some of the basics of the talk explained below. Secondly, at The Broadcast Knowledge there are plenty of talks discussing HDR! Here’s hoping CVP bring the video back.

Why is High Dynamic Range like getting a giraffe on a tube train? HDR continues its ascent. Super Bowl LIV was filmed in HDR this year, Sky in the UK has launched HDR and many of the big streaming services support it, including Disney+, Prime and Netflix. So as it slowly takes its place, we look at what it is and how it’s achieved in the camera and in production.

Neil Thompson, a Sony Independent Certified Expert, takes a seat in the CVP Common Room to lead us through HDR from the start and explain how giraffes are part of the equation. Dynamic Range makes up two thirds of HDR, so he starts by explaining what it is with an analogy to audio. When you turn up the speakers so far that they start to distort, that’s the top of your range. The bottom is silence – or rather, what you can hear over the quiet hiss that all audio systems have. Similarly in cameras, the brightest level a sensor can distinguish is the top of your range, and the noisy blacks are the bottom. In video, if you go too bright, all pixels become white even if the subject’s brightness varies – the equivalent of the audio distortion.

With the basic explanation out of the way, Neil moves on to describing the size of the dynamic range (DR), which can be expressed in stops, as a contrast ratio or as a signal-to-noise ratio. He compares ‘stops’ to a bucket of water with some sludge at the bottom, where the range is between the top of the sludge and the rim of the bucket. One stop, he explains, is a halving of the range. With the bucket analogy, if you can go half way down the bucket and still hit clear water, you have 1 stop of dynamic range. If you can go a quarter of the way down and still find clean water, you have 2 stops. By the time you get to 1/32nd you have 5 stops. If going to 1/64th of the height of the bucket means you end up in the sludge, your system has 5 stops of dynamic range. Reducing the sludge so there’s clear water at 1/64th of the height, which in cameras means reducing the noise in the blacks, is one way of increasing the dynamic range of your acquisition.
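To put a number on the analogy: each stop doubles the usable range, so the number of stops is simply the base-2 logarithm of the contrast ratio. Here’s a minimal Python sketch of that conversion, using the illustrative bucket figures from the talk:

```python
import math

def stops_from_contrast(contrast_ratio: float) -> float:
    """Dynamic range in stops is log2 of the contrast ratio,
    i.e. how many times the range can be halved before you
    hit the noise floor (the 'sludge')."""
    return math.log2(contrast_ratio)

# Clear water down to 1/32nd of the bucket -> 32:1 contrast -> 5 stops
print(stops_from_contrast(32))   # 5.0

# Halving the noise floor (clear water at 1/64th) adds one stop
print(stops_from_contrast(64))   # 6.0
```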


If you would like to know how lenses fit into the equation of gathering light, check out this talk from Canon’s Larry Thorpe.

Neil looks next at the range of light we see in real life, from bright sunlight to looking at the stars at night. Our eye has 14 stops of range, though thanks to the iris we can see the equivalent of 24 stops. Similarly, cameras use an iris to regulate the incoming light, which helps move the restricted dynamic range of the camera into the right range of brightness for our shot.

Of course, once you have gathered the light, you need to display it again. Displays’ ability to produce light is measured in ‘nits’: one nit is one candela per square metre. Knowing how many nits a display can produce helps you understand the brightness it can show, with 1,000 nits currently being typical for an HDR display. Of course, dynamic range is as much about the blacks as the brightness. OLED screens produce fantastically deep blacks, though their peak brightness can be quite low. LED displays, conversely, Neil explains, can go very bright but their blacks suffer. You also have to take into account where a display will be located to understand what range it needs. In a dim gallery you can spend longer caring about the blacks, but many places are so bright that the top end is much more important than the blacks.
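The same stops arithmetic applies to displays: divide peak brightness by black level and take log2. A quick sketch, using illustrative nit figures rather than measurements of any particular panel:

```python
import math

def display_stops(peak_nits: float, black_nits: float) -> float:
    """Display dynamic range in stops from peak and black luminance."""
    return math.log2(peak_nits / black_nits)

# Illustrative numbers only: an OLED with very deep blacks but modest
# peak brightness vs a brighter LED panel with raised blacks.
print(round(display_stops(700, 0.0005), 1))   # ~20.4 stops
print(round(display_stops(1500, 0.05), 1))    # ~14.9 stops
```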

With the acquisition side explained, Neil moves on to the transmission of HDR – and this is where getting a giraffe on a tube train comes in. Neil relates this to the already familiar ‘log profiles’. There are two HDR curves, known as transfer functions: PQ from Dolby and HLG (Hybrid Log Gamma). Neil looks at which profiles are best for each part of the production workflow and then explains how PQ differs from HLG in terms of expressing brightness levels. In HLG, the brightest part of the signal tells the display device to output as brightly as it can. A PQ signal, however, reserves the brightest signal for 10,000 nits – far higher than displays available today. This means we need to do some work to deal with the situation where your display isn’t as bright as the one used to master the signal. Neil discusses how we do that with metadata.
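To make the difference concrete, below is a minimal sketch of the PQ EOTF as defined in SMPTE ST 2084, which maps a code value in the 0–1 range to an absolute luminance of up to 10,000 nits. This absolute mapping is exactly why a display that can’t reach the mastering brightness needs the metadata-driven adjustment Neil describes:

```python
def pq_eotf(code: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: non-linear code value in [0, 1]
    to absolute luminance in nits (cd/m^2)."""
    m1 = 1305 / 8192   # 0.1593017578125
    m2 = 2523 / 32     # 78.84375
    c1 = 107 / 128     # 0.8359375
    c2 = 2413 / 128    # 18.8515625
    c3 = 2392 / 128    # 18.6875

    e = code ** (1 / m2)
    y = max(e - c1, 0.0) / (c2 - c3 * e)
    return 10000.0 * y ** (1 / m1)

print(pq_eotf(1.0))             # 10000.0 - full code is reserved for 10k nits
print(round(pq_eotf(0.58), 1))  # ~202 nits, roughly the level often
                                # used as HDR reference white
```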

Finishing off the talk, Neil takes questions from the audience, but also walks through a long list of questions he brought along, including ‘how bright is too bright?’, what to look for in an engineering monitor, lighting for HDR, and costs.

Watch now!
Speakers

Neil Thompson
Freelance Engineer & Trainer

Video: Using AMWA IS-06 for Flow Control on Professional Media Networks

In IP networks, multicast flow subscription is usually based on a combination of the IGMP (Internet Group Management Protocol) and PIM (Protocol Independent Multicast) protocols. While PIM allows for very efficient delivery of IP multicast data, it doesn’t provide bandwidth control or device authorisation.

To solve these issues on SMPTE ST 2110 professional media networks, the NMOS IS-06 specification has been developed. It relies on Software-Defined Networking, where the traffic management application embedded in each individual switch or router is replaced by a centralised Network Controller. This controller manages and monitors the whole network environment, making it bandwidth-aware.

The NMOS IS-06 specification provides a vendor-agnostic northbound interface from the Network Controller to the Broadcast Controller. IS-06, in conjunction with IS-04 (Discovery and Registration) and IS-05 (Device Connection Management), allows the Broadcast Controller to automatically set up media flows between endpoints on the network, reserve bandwidth for flows and enforce network security. The Broadcast Controller is also able to request network topology information from the Network Controller, which can be used to create a user-friendly graphical representation of the flows in the network.
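As with the other NMOS specifications, IS-06 is an HTTP/JSON API. The sketch below shows roughly what a Broadcast Controller’s northbound calls might look like; the endpoint paths and payload fields here are illustrative assumptions, not quoted from the specification, so check the AMWA documents for the real resource names:

```python
import requests

# Hypothetical Network Controller address and API paths - the actual
# resource names are defined in the AMWA IS-06 specification.
BASE = "http://network-controller.example/x-nmos/netctrl/v1.0"

# Ask the Network Controller for the network topology, which a
# Broadcast Controller could render as a graphical flow map.
topology = requests.get(f"{BASE}/topology").json()

# Request a bandwidth-reserved path for one media flow between two
# endpoints discovered via IS-04 and connected via IS-05.
flow_request = {
    "sender": "sender-endpoint-id",      # illustrative IDs
    "receiver": "receiver-endpoint-id",
    "bandwidth_bps": 1_500_000_000,      # e.g. an ST 2110-20 HD flow
}
resp = requests.post(f"{BASE}/flows", json=flow_request)
resp.raise_for_status()
```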

In this presentation, Rob Porter from Sony Europe explains the basics of NMOS IS-06, showing in detail how setting up media flows with this specification fits into the IS-04 / IS-05 workflow. Rob emphasises that all AMWA NMOS specifications are completely open and available to anyone, allowing for interoperability between broadcast and network devices from different manufacturers.

The next speaker, Sachin Vishwarupe from Cisco Systems, focuses on future work on IS-06, including provisioning feedback (such as insufficient bandwidth, no route available from sender to receiver, or no management connectivity), flow statistics, security and grouping (similar to a ‘salvo’ in the SDI world).

There is also a discussion on extending the IS-06 specification to cover Network Address Translation (NAT), which would help resolve problems caused by address conflicts, e.g. when sharing resources between facilities.

You can find the slides here.

Watch now!

Speakers

Rob Porter
Project Manager – Advanced Technology Team
Sony Europe
Sachin Vishwarupe
Principal Engineer
Cisco Systems

Webinar: ATSC 3.0 Physical Layer and Data Link Layer Overview

ATSC 3.0 brings IP delivery to over-the-air TV, marking a major change in delivery to the home. For the first time, video, audio and other data are all delivered as network streams, allowing the services available to TV viewers at home to modernise and merge with online streaming services, better matching today’s viewing habits. ATSC 3.0 deployments are starting in the USA and it has already been rolled out in South Korea for the XXIII Olympic Winter Games in 2018.

Whilst the move to IP is transformational, ATSC 3.0 delivers a whole slew of improvements to the ATSC standard covering RF, bandwidth, codecs and more. In this, the first of three webinars from the IEEE BTS focussing on ATSC 3.0, we look at the physical layer with Luke Fay, Chair of the ATSC 3.0 group and also a Senior Manager of Technical Standards at Sony.

Click to register: Wednesday, 15th January, 2020. 11am ET / 16:00 GMT

What is the Physical Layer?
The physical layer refers to the method by which data gets from one place to another. In this case, we’re talking about transmission over the air: RF. Whilst this isn’t, in some ways, as physical as a copper cable, we have to remember that, at a basic level, communication is about making a voltage change at place A produce a voltage change at place B. The message physically moves from A to B, and the medium it uses and the way it manipulates that medium are what we refer to as the physical layer.

In this webinar, Luke will talk about System Discovery and Signalling, defined by document A/321, and the Physical Layer Protocol, defined by A/322, both freely available from the ATSC website. The webinar will finish with a Q&A. Let’s take a deeper look at some of the topics which will be covered.

Choice of modulation

ATSC 3.0 has chosen the COFDM modulation scheme over 8VSB, which is currently used for first-generation ATSC broadcasts, to deliver data over the air from the transmitter. COFDM stands for Coded Orthogonal Frequency Division Multiplexing and has become the go-to modulation method for digital terrestrial transmission, including DAB, DAB+ and the DVB terrestrial standards.

One of the reasons for its wide adoption is that COFDM has guard intervals: times when the transmitter is guaranteed not to send any new data. This gives the receiver time to collect any energy which arrives late due to multi-path reflections or any other reason. It also means that COFDM performs better when you run a network of nearby transmitters on the same frequency – known as a Single Frequency Network (SFN). A signal from a transmitter further away arrives later and, if it lands within the guard interval, reinforces the directly received signal. Counter-intuitively for those used to analogue, running an SFN actually helps improve reception.
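As a rough worked example of why the guard interval enables SFNs: a co-channel signal only reinforces reception if it arrives within the guard interval, so the guard length puts an approximate ceiling on the useful path difference between transmitters. The figures below are illustrative, not taken from the ATSC 3.0 parameter tables:

```python
SPEED_OF_LIGHT = 299_792_458  # metres per second

def max_sfn_path_difference_km(guard_interval_us: float) -> float:
    """Approximate maximum extra path length (km) a delayed signal can
    travel and still land inside the guard interval."""
    return SPEED_OF_LIGHT * guard_interval_us * 1e-6 / 1000

# e.g. a 300 microsecond guard interval tolerates echoes with roughly
# 90 km of extra path - the scale on which an SFN can be planned.
print(round(max_sfn_path_difference_km(300)))  # ~90
```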

Multiple operating points to match the business case
Another important feature of ATSC 3.0 at the physical layer is the ability to choose the robustness of the signal and to run multiple transmissions simultaneously at different levels of robustness. These multiple transmissions are called pipes (Physical Layer Pipes, or PLPs). As many of us will be familiar with, a high-bandwidth signal can be fragile and easily corrupted by interference. Building resilience into the signal uses up bandwidth, either by spending some of the capacity on error checking and error recovery data, or by slowing down the rate at which the signal is sent, which of course means fewer bits can be sent in the same time window.

Because bandwidth and resilience are a balancing act, with each fighting against the other, it’s important for stations to be able to choose what’s right for them and their business case. A highly robust signal can be very useful for penetrating indoors and targeting reception on mobile devices, and ATSC 3.0 can actually achieve reception when the signal is below the noise, i.e. at a negative signal-to-noise ratio. A higher-bandwidth service delivering UHD at around 20Mbps can be achieved, however, by using 64-QAM instead of 16-QAM.
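The 16-QAM vs 64-QAM trade-off is easy to quantify: a constellation of M points carries log2(M) bits per symbol, so moving from 16-QAM to 64-QAM raises raw capacity by 50% at the cost of constellation points sitting closer together, and therefore needing a cleaner signal. A quick sketch:

```python
import math

def bits_per_symbol(constellation_size: int) -> int:
    """Each symbol of an M-point QAM constellation carries log2(M) bits."""
    return int(math.log2(constellation_size))

print(bits_per_symbol(16))   # 4 bits/symbol
print(bits_per_symbol(64))   # 6 bits/symbol - 50% more raw capacity,
                             # but a denser, less robust constellation
```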

Register now!
Speaker

Luke Fay
Chairman, ATSC Technology Group 3,
Senior Manager Technical Standards, Sony Home Entertainment & Sound Products – America

Video: Talk 2110

Is the industry successfully delivering what we need with SMPTE’s ST 2110 suite of standards? What are the benefits of IP and how can we tackle the difficulties?

In this panel from Broadcast Solutions’ Innovation Day, we hear from five vendors to understand their perspectives and plans for the future. Claus Pfeifer from Sony says they now have 60 sites up and running on IP. Lawo’s Phil Myers follows up saying “People know they have to go IP, it’s a matter of when they go IP.”

Whilst this is a positive start, the panel moves on to talk briefly about the difficulties of implementing SMPTE ST 2110. Jan Eveleens from Riedel points out that many of the issues will go away as technology catches up in terms of CPUs and bandwidth. We no longer have the processing issues we once did for audio; similarly with video, technology will improve and remove many of the challenges. Phil Myers feels that cloud implementation issues are not a large problem at the moment, as he sees a move to bring equipment into private clouds rather than public ones. This way, they are doing ‘remote production for buildings’.

After each vendor outlined their future plans for IP, Zoltan highlighted that IP allows NDI to co-exist with ST 2110. Many may want to use 2110 for high-end sports; for others, NDI fits well. The panel felt that one concerning area of IP is the worry of how to fix problems. The knowledge level differs from country to country, so vendors not only need to work on education about IP, both for NDI and 2110, but they need to do this in a way focussed on the different markets.

As the panel comes towards the end, Claus feels that the industry started to talk too early about pure technology. “Did not discuss enough about the business benefits,” he explains, such as remote production and more efficient use of equipment – avoiding ‘sleeping Capex’. Installing IP makes a lot of sense for large-scale systems. Recently broadcasters have been working at a scale requiring much more than 1024×1024 routing, which is roughly where SDI routers top out. These large systems also tend to have a life of over 10 years. With SDI development, particularly in routers, slowing down or stopping, it makes much more sense to use IP for these long-lived systems.

Watch now!
Speakers

Jan Eveleens
Director Business Development Video Solutions,
Riedel
Joachim Kuhnen
Strategic Solution Manager EMEA
Imagine Communications
Zoltan Matula
Regional Sales Manager Central Europe,
Newtek
Phil Myers
Chief Technology Officer, Chair of the Advisory Board,
Lawo
Claus Pfeifer
Head of Connected Content Acquisition – Media Solutions,
Sony
David Davies
Moderator