Now Available On Demand
UHD transmissions have been available for many years now and form a growing, albeit slowly growing, percentage of the channels available. The fact that major players such as Sky and BT Sport in the UK, and NBCUniversal and the ailing DirecTV in the US, see fit to broadcast sports in UHD shows that the technology is trusted and mature. But given the prevalence of 4K films on Netflix and Apple TV+, streaming is actually the largest delivery mechanism for 4K/UHD video into the home.
Following on from last week’s DVB webinar, now available on demand, this webinar from the DVB Project replaces what would have been part of the DVB World 2020 conference. It looks at the work that’s gone into getting UHD to where it is now: developing HEVC (also known as H.265), integrating it into broadcast standards and securing manufacturer support. It then finishes by looking at the successor to HEVC, VVC (Versatile Video Coding).
The host, Ben Schwarz of the Ultra HD Forum, first introduces Ralf Schaefer, who explores the work that was done to make UHD distribution a reality. He does this by looking at the specifications and standards that were created to get us where we are today, before looking ahead to see what may come next.
Yvonne Thomas from the UK’s Digital TV Group is next and follows on from Ralf by looking at codecs for video and audio. HEVC is seen as the go-to codec for UHD distribution. As the uncompressed bitrate for UHD is often around 12Gbps, HEVC’s higher compression ratio compared to AVC, together with its relatively wide adoption, makes it a good choice for wide dissemination of a signal. But UHD is more than just video. With UHD and 4K services usually carrying sports or films, ‘next generation audio’ is really important. Yvonne looks at the video and audio aspects of delivering HEVC and at the devices that need to receive it.
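As a back-of-the-envelope check on that figure, the uncompressed bitrate can be derived from the frame dimensions, frame rate and bits per pixel. The parameters below are assumptions for illustration (3840×2160 at 60fps in 10-bit 4:2:2), which lands close to the 12Gbps of a 12G-SDI link once blanking and overheads are added:

```python
# Back-of-the-envelope uncompressed UHD bitrate (assumed parameters)
width, height = 3840, 2160   # UHD-1 resolution
fps = 60                     # frames per second
bits_per_pixel = 20          # 10-bit 4:2:2: 10 luma + 10 chroma per pixel

bitrate_bps = width * height * fps * bits_per_pixel
print(f"{bitrate_bps / 1e9:.2f} Gbps")  # ~9.95 Gbps of active video
```

Different chroma subsampling or bit depths move this number around, which is why quoted uncompressed UHD rates vary.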
Finally we look at VVC, also known as H.266, the successor to HEVC. ATEME’s Sassan Pejhan gives us a look into why VVC was created, where it currently sits within MPEG standardisation and what it aims to achieve in terms of compression. VVC has been covered previously on The Broadcast Knowledge in dedicated talks such as ‘VVC, EVC, LCEVC, WTF?’, ‘VVC Standard on the Final Stretch’ and ‘AV1/VVC Update’.
Watch now!
Speakers
Ben Schwarz, Communication Working Group Chair, Ultra HD Forum

Ralf Schaefer, VP Standards R&I, InterDigital Inc.

Yvonne Thomas, Strategic Technologist, DTG (Digital TV Group)

Sassan Pejhan, VP Technology, ATEME
Video: Buffer Sizing and Video QoE Measurements at Netflix
At a time when Netflix is cutting streaming quality to reduce bandwidth, we take a look at the work that’s gone into investigating latency within the switches at ISPs, which turned out to be surprisingly high.
Bruce Spang interned at Netflix and studied the phenomenon of unexpected latency variation within the Netflix caches deployed at ISPs to reduce latency and bandwidth usage. He starts by introducing the TCP buffering models, looking at how they work, what they are trying to achieve and how big a buffer is supposed to be. This matters because if the buffer is big, data can take a long time to leave it once it fills, adding latency to the packets travelling through. Too small, of course, and packets have to be dropped. This creates more rebuffering, which impacts the ABR choice and leads to lower quality.
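The textbook starting point for these buffer-sizing models is the bandwidth-delay product, optionally scaled down when many long-lived TCP flows share the link. A minimal sketch follows; the link speed, RTT and flow count are illustrative values, not Netflix’s numbers:

```python
import math

def bdp_buffer_bytes(link_gbps, rtt_ms, n_flows=1):
    """Rule-of-thumb buffer size: the bandwidth-delay product,
    scaled by 1/sqrt(n) for n long-lived TCP flows (Appenzeller et al.)."""
    bdp_bits = link_gbps * 1e9 * (rtt_ms / 1e3)
    return bdp_bits / 8 / math.sqrt(n_flows)

# e.g. a 10 Gbps link, 50 ms RTT, 1000 concurrent flows
print(f"{bdp_buffer_bytes(10, 50, 1000) / 1e6:.1f} MB")  # ~2.0 MB
```

The talk’s point is that the buffer actually configured in a switch can sit far from what either rule suggests, with measurable effects on latency.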
Bruce was part of an experiment that studied whether the buffer model in use behaved as expected, and whilst he found that it did most of the time, he also found that video performance varied, which was undesirable. To explain this, he details the testing they did and the finding that, as you would expect, latency increases during congested periods. Moreover, he showed that a 500MB buffer produced more latency than a 50MB one.
To explain the unexpected behaviour, such as long-tail content having lower latency than popular content, Bruce looks under the hood of the router to see how VOQs are used to create queues of traffic and how they work. Having seen the relatively simple logic behind the system, Bruce talks about the results they’ve achieved working with the vendor to improve the buffering logic.
Watch now!
Speakers
Bruce Spang, PhD Student, Stanford
Video: Harness SSAI’s Superpowers
Server-side Ad Insertion (SSAI) is a great option for streaming services delivering video to a wide variety of devices and for those who need to avoid ad blockers. Whilst ad insertion can happen in the player, that mechanism can be interfered with, allowing users to avoid ads. And although client-side ad insertion has traditionally made it much easier to create a unique stream for each client, dynamic SSAI can now do the same with a better user experience.
This panel from the OTT Leadership Summit at Streaming Media West 2019 brings together Disney, WarnerMedia and Crunchyroll to share their experiences with SSAI. They discuss beaconing, ad standards, scaling, SCTE and more.
Beaconing goes hand in hand with ad playback, providing metrics on what happened. When certain actions occur, the player reaches out to a URL. This can be used to indicate events such as a user skipping or pausing a video. The beacon information can then be used to verify how much of which ads were seen by whom, and to charge advertisers accordingly.
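As a rough illustration of the mechanism, a beacon is typically just an HTTP GET with the event details encoded in the query string. The endpoint and event names below are hypothetical, chosen only to show the shape of the call:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# Hypothetical beacon endpoint, for illustration only
BEACON_URL = "https://ads.example.com/beacon"

def beacon_url(event, ad_id, position_s):
    """Encode a playback event ('start', 'firstQuartile', 'skip', ...)."""
    query = urlencode({"event": event, "ad": ad_id, "pos": position_s})
    return f"{BEACON_URL}?{query}"

def fire_beacon(event, ad_id, position_s):
    """GET the beacon URL; the ad server logs the impression server-side."""
    with urlopen(beacon_url(event, ad_id, position_s)) as resp:
        return resp.status

# fire_beacon("firstQuartile", "ad-123", 7.5)
```

With SSAI, the stitcher can fire these beacons on the client’s behalf, which is one reason reconciling beacon counts at scale becomes its own engineering problem.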
The panel moves on to discuss scaling, using live sports as an example, and covers questions to ask vendors to ensure both you and they are ready for maximum scale. Bandwidth is declared the biggest challenge, but a less obvious problem is that your upstream ad providers can’t always scale well. If you rely on calls from your server to others, it’s vital to understand their scaling capacity and strategy. They discuss issues with losing beacons when operating at scale and the need for detailed logging and debugging in order to spot errors and reconcile the results.
Some time is next spent on VPAID and VAST 4, which are both messaging specifications that allow ad servers to tell applications which ads to play. The panel discusses the pros and cons of their use for SSAI, where the stitcher needs to reach out to an ad server in real time to find out which ads to play.
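For a flavour of what the stitcher receives, a VAST response is an XML document from which the server extracts the creative’s media file URL before transcoding and splicing it in. The document below is a trimmed, illustrative sketch, not a complete or validated VAST 4 response:

```python
import xml.etree.ElementTree as ET

# Trimmed, illustrative VAST-style response (structure loosely per the IAB schema)
vast_xml = """<VAST version="4.0">
  <Ad id="1234">
    <InLine>
      <Creatives>
        <Creative>
          <Linear>
            <Duration>00:00:15</Duration>
            <MediaFiles>
              <MediaFile delivery="progressive" type="video/mp4">
                https://cdn.example.com/ads/spot.mp4
              </MediaFile>
            </MediaFiles>
          </Linear>
        </Creative>
      </Creatives>
    </InLine>
  </Ad>
</VAST>"""

root = ET.fromstring(vast_xml)
media = root.find(".//MediaFile")
print(media.text.strip())  # the URL the stitcher would fetch and splice in
```

VPAID, by contrast, delivers executable ad code intended to run in a client, which is part of why its fit with server-side stitching is debated on the panel.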
At the end of the discussion, the panel takes questions from the floor, but not before discussing SCTE markers and ‘content conditioning’, which involves preparing your source videos and encoder so that the two assets fit together properly at I-frame boundaries.
Watch now!
Speakers
Robert Jameson, Technical Director, Media Enablement, Turner | WarnerMedia

Stephen Gray, Director, Ad Tech Systems, Walt Disney Direct-to-Consumer & International

Michael Dale, VP Engineering, Crunchyroll

Nadine Krefetz, Consultant, Reality Software; Contributing Editor, Streaming Media
Video: Reinventing Intercom with SMPTE ST 2110-30
Intercom systems form the backbone of any broadcast production environment. There have been great strides made in the advancement of these systems, and matrix intercoms are a very mature solution now, with partylines, IFBs and groups, a wide range of connectivity options and easy signal monitoring. However, they have flaws as well. Initial cost is high and there is a lack of flexibility, as system size is limited by the matrix port count. It is possible to trunk multiple frames, but doing so is difficult, expensive and takes up rack space. Moreover, everything cables back to a central matrix, which can be a single point of failure.
In this presentation, Martin Dyster from The Telos Alliance looks at the parallels between the emergence of Audio over IP (AoIP) standards and the development of products in the intercom market. First a short history of Audio over IP protocols is given, including Telos Livewire (2003), Audinate Dante (2006), Wheatstone WheatNet (2008) and ALC NetworX Ravenna (2010). With all these protocols available, a question of interoperability arose: if you try to connect equipment using two different AoIP protocols, it simply won’t work.
In 2010 the Audio Engineering Society formed the X192 Working Group, which was the driving force behind AES67. This standard was ratified in 2013 and allows the interconnection of audio equipment from different vendors. In 2017 SMPTE adopted AES67 as the audio format for the ST 2110 suite of standards.
Audio over IP replaces the idea of connecting all devices point-to-point with multicast IP flows: all devices are connected via a common fabric and all audio routes are simply messages that go from one device to another. Martin explains how Telos were inspired by this approach to move away from matrix-based intercoms and create a distributed system in which there is no central core and DSP processing is built into the intercom panels. Each panel contains audio mix engines and a set of AES67 receivers and transmitters which use multicast IP flows. Any ST 2110-30 / AES67 compatible device present on the network can connect with intercom panels without an external interface. Analogue and other baseband audio needs to be converted to ST 2110-30 / AES67.
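To make the “audio routes are just multicast messages” idea concrete, the sketch below sends UDP packets with a minimal RTP-style header to a multicast group, which is the transport AES67 builds on. It is illustrative only: the group address and port are assumptions, and it omits the PTP clocking, SDP signalling and packet-timing constraints a compliant ST 2110-30 stream requires:

```python
import socket
import struct

# Hypothetical multicast group/port in the style often used for AES67 streams
MCAST_GRP, MCAST_PORT = "239.69.0.1", 5004

def rtp_header(seq, samples_per_packet=48, payload_type=96):
    """Minimal RTP header: version 2, no padding/extension/CSRC, SSRC of 0."""
    timestamp = (seq * samples_per_packet) & 0xFFFFFFFF  # 48kHz sample clock
    return struct.pack("!BBHII", 0x80, payload_type, seq & 0xFFFF, timestamp, 0)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 32)

def send_audio_packet(seq, pcm_payload):
    """One audio packet onto the fabric; any subscribed receiver can take it."""
    sock.sendto(rtp_header(seq) + pcm_payload, (MCAST_GRP, MCAST_PORT))

# send_audio_packet(0, b"\x00" * 288)  # 1ms of 48kHz 24-bit stereo PCM
```

The point of the architecture is on the receive side: any panel that joins the multicast group gets the flow, so “routing” is group membership rather than a crosspoint in a central matrix.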
Martin finishes his presentation by highlighting the advantages of AoIP intercom systems, including lower entry and maintenance costs, easy expansion (multi-studio or even multi-site) and resilient operation (no single point of failure). Moreover, the adoption of multicast IP audio flows removes the need for DAs, patch bays and centralised routers, which reduces cabling and saves rack space.
If you want to refresh your knowledge of AES67 and ST 2110-30, we recommend the Video: Deep Dive into SMPTE ST 2110-30, 31 & AES67 Audio presentation by Leigh Whitcomb.
Speaker
Martin Dyster, VP Business Development, The Telos Alliance