Video: Low Latency Streaming

There are two phases to reducing streaming latency. One is to optimise the system you already have; the other is to move to a new protocol. This video looks at both approaches: achieving parity with traditional broadcast media through optimisation, and going ‘better than broadcast’ by using CMAF.

In this video from the Northern Waves 2019 conference, Koen van Benschop from Deutsche Telekom examines the large, low-cost latency savings you can achieve by optimising your current HLS delivery. With the segment duration originally recommended by Apple being 10 seconds, there are still many services out there starting from a very high latency, so there are savings to be had.

Koen breaks down where the total latency comes from, looking at the decode, encode, packaging and other contributions. We quickly see that the player buffer is one of the largest, with encode latency in second place. We explore the pros and cons of reducing these and see that the overall latency can fall to, or even below, traditional broadcast latency depending, of course, on which type of broadcast (and which country’s) you are comparing it to.
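To make the budget concrete, here is a minimal sketch of how those contributions add up. The figures below are illustrative assumptions for a legacy HLS service with long segments, not numbers taken from Koen's talk:

```python
# Illustrative HLS latency budget in seconds (assumed values, not from the talk).
latency_budget = {
    "encode": 3.0,               # encoder look-ahead and GOP buffering
    "package_and_origin": 1.0,   # segmenting and publishing to the origin
    "cdn_delivery": 1.0,         # edge caching and network transfer
    "player_buffer": 18.0,       # many players buffer ~3 segments of 6 s each
    "decode_and_render": 0.5,
}

total = sum(latency_budget.values())
print(f"End-to-end latency: {total:.1f} s")  # ~23.5 s for these assumptions
```

Because the player buffer scales with segment duration, shortening segments (and tuning how many of them the player holds) is where most of the cheap savings come from.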

While optimising HLS/DASH gets you down to a few seconds, there’s a strong desire for some services to beat that. Whilst broadcasters themselves may be reluctant to do this, not wanting to deliver online services quicker than their over-the-air offerings, online sports services such as DAZN can make latency a USP and deliver better value to fans. After all, DAZN and similar services benefit from latency of just a few seconds as it helps bring them in line with social media, which can be very quick to report key events such as goals and points being scored in live matches.

Stefan Arbanowski from Fraunhofer leads us through CMAF, covering what it is, the upcoming second edition and how it works. He covers its ability to be referenced from both .m3u8 (HLS) and .mpd (DASH) playlist/manifest files, with the media itself carried as fragmented MP4 built on the ISO BMFF. One benefit inherited from DASH is its Common Encryption standard, which allows it to work with PlayReady, FairPlay and other DRM systems.

Stefan then takes a moment to consider WebRTC. Given it proposes latency of less than one second, it can sound like a much better idea. Stefan outlines concerns he has about the ability to scale above 200,000 users. He then turns his attention back to CMAF and outlines how the stream is composed and how the player logic works in order to successfully play at low latency.
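One common element of that player logic is sitting a small, fixed distance behind the live edge and nudging the playback rate to stay there. The sketch below is purely illustrative; the constants and the simple proportional approach are assumptions, not details from Stefan's talk:

```python
# Sketch of a low-latency player's catch-up logic (assumed values, not from the talk).
TARGET_LATENCY = 3.0     # seconds behind the live edge we aim to sit
MAX_RATE_ADJUST = 0.05   # allow up to 5% faster or slower playback
GAIN = 0.05              # how aggressively we correct per second of drift

def playback_rate(current_latency: float) -> float:
    """Speed up slightly when behind the target latency, slow down when ahead."""
    error = current_latency - TARGET_LATENCY
    adjustment = max(-MAX_RATE_ADJUST, min(MAX_RATE_ADJUST, error * GAIN))
    return 1.0 + adjustment

print(playback_rate(4.0))  # drifted behind -> 1.05, plays slightly fast
print(playback_rate(3.0))  # on target      -> 1.0, normal speed
```

The small buffer this implies is exactly why CMAF chunks matter: the player needs decodable media well before a full segment has finished being produced.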

Watch now!

Speakers

Koen van Benschop
Senior Manager TV Headend and DRM,
Deutsche Telekom
Stefan Arbanowski
Director Future Applications and Media,
Fraunhofer FOKUS

Video: An Overview of the ISO Base Media File Format

ISO BMFF is a standardised MPEG media container developed from Apple’s QuickTime format and is the basis for cutting-edge low-latency streaming as much as it is for tried-and-trusted MP4 video files. Here we look into why we have it, what it’s used for and how it works.

ISO BMFF provides a structure to place around timed media streams whilst accommodating the metadata we need for professional workflows. Key to its continued utility is its extensible nature, allowing new abilities, such as new codecs and metadata types, to be added as they are developed.
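That structure is a simple hierarchy of ‘boxes’, each starting with a 32-bit size and a four-character type. A minimal sketch of walking the top-level boxes of a file (assuming a local file called example.mp4) might look like this; a real parser would also handle the 64-bit ‘largesize’ form and recurse into container boxes:

```python
# Minimal ISO BMFF box walker (sketch only; assumes a local file "example.mp4").
import struct

def walk_top_level_boxes(path: str) -> None:
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            size, box_type = struct.unpack(">I4s", header)  # big-endian size + 4CC
            print(f"{box_type.decode('ascii')}: {size} bytes")
            f.seek(size - 8, 1)  # skip the payload; any child boxes live inside it

walk_top_level_boxes("example.mp4")
# A plain MP4 typically shows: ftyp, moov, mdat
# A fragmented (CMAF-style) file shows: ftyp, moov, then repeated moof/mdat pairs
```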

Both MMT, ATSC 3.0’s streaming mechanism, and the low-latency streaming format CMAF are based on ISO BMFF, which shows that, despite being over 18 years old, the ISO BMFF container is still highly relevant.

Thomas Stockhammer is the Director of Technical Standards at Qualcomm. He explains the container format’s structure and origins before explaining why it’s ideal for CMAF’s low-latency streaming use case, finishing off with a look at immersive media in ISO BMFF.

Watch now!

Speaker

Thomas Stockhammer
Director Technical Standards,
Qualcomm

Video: Next Generation Broadcast Platform – ATSC 3.0

Continuing our look at ATSC 3.0, our fifth talk straddles technical detail and basic business cases. We’ve seen talks on implementation experience, such as in Chicago and Phoenix, and now we look at receiving the data with open-source software.

We’ve covered before the importance of ATSC 3.0 in the North American markets and the others that are adopting it. Jason Justman from Sinclair Digital states the business cases and reasons to push for it despite it being incompatible with previous generations. He then discusses what Software Defined Radio is and how it fits into the puzzle, covering the early state of this technology.

After a brief overview of the RF side of ATSC 3.0, which is itself a leap forward, Jason explains how the video layer benefits. As ATSC 3.0 relies on ISO BMFF, Jason introduces MMT (MPEG Media Transport), explaining what it is and why it’s used for ATSC 3.0.

The next section of the talk showcases libatsc3, an open-source library whose goal is to open up ATSC 3.0 to talented software engineers, which Jason demos. The library allows for live decoding of ATSC 3.0 streams, including MMT material.

Jason finishes his talk with a Q&A covering SCTE 34 and an interesting comparison between DVB-T2 and ATSC 3.0, making this a very useful talk for filling in technical gaps that no other ATSC 3.0 talk covers.

Complete slide pack

Watch now!

Speaker

Jason Justman
Senior Principal Architect,
Sinclair Digital