Webinar: Networking Fundamentals


Date: Thursday 12th December, 1pm EST / 18:00 GMT

Networking is increasingly important throughout the broadcast chain. This webcast picks out the fundamentals that underpin SMPTE ST 2110 and that help deliver video streaming services. We’ll piece them together and explain how they work, leaving you with more confidence in talking about and working with technologies such as multicast video and HTTP Live Streaming (HLS).

Register now!
Speaker

Russell Trafford-Jones
Editor, https://TheBroadcastKnowledge.com
Manager, Support & Services, Techex

Video: Introducing Low-Latency HLS

HLS has taken the world by storm since its first release 10 years ago. Capitalising on the widely understood and deployed technologies already underpinning websites at the time, it brought with it great scalability and the ability to move seamlessly between different bitrate streams to help deal with varying network performance (and computer performance!).

HLS has continued to evolve over the years, with new versions documented as RFC drafts under the IETF. Its biggest problem for today’s market is its latency. As originally specified, you were guaranteed at least 30 seconds of latency, and many viewers would see a minute. This has improved over the years, but only so far.
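To see roughly where that 30-second figure comes from, here’s a quick back-of-the-envelope sketch in Python. It assumes the traditionally recommended 10-second segments and a player that buffers three of them before starting playback; the numbers are illustrative, not taken from the talk.

```python
# Rough HLS latency estimate (illustrative numbers only).
segment_duration_s = 10      # classic recommendation for HLS segment length
client_buffer_segments = 3   # players traditionally hold back ~3 full segments before playing

buffer_latency_s = client_buffer_segments * segment_duration_s
print(f"Client-side buffer alone: {buffer_latency_s} s")  # 30 s before encode, packaging and CDN delays are added
```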

Low-Latency HLS (LL-HLS) is Apple’s answer to the latency problem: a way of bringing latency down to be comparable with broadcast television for those live broadcasts where immediacy really matters.

This talk from Apple’s HLS Technical Lead, Roger Pantos, given at Apple’s WWDC conference this year, goes through the problems and the solution, clearly describing LL-HLS. Over the following weeks here on The Broadcast Knowledge we will follow up with more talks discussing real-world implementations of LL-HLS, but to understand them, we really need to understand the fundamental proposition.

Apple has always been the gatekeeper to HLS, and this is one reason MPEG DASH exists: a streaming standard that is separate from any one corporation and has the benefit of being ratified by a standards body (MPEG). So who better to give the initial introduction?

HLS is a chunk-based streaming protocol, meaning the illusion of a continuous stream is created by downloading many separate files in quick succession. It’s the need to maintain a pipeline of these files, both in creating them and in stacking them up for playback, that causes much of the delay. LL-HLS uses techniques such as shortening the chunks and transferring only parts of them in order to drastically reduce this intrinsic latency.
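To make that concrete, here’s a toy Python sketch that builds the kind of media playlist LL-HLS uses, advertising sub-segment “parts” with the EXT-X-PART and EXT-X-PRELOAD-HINT tags from Apple’s LL-HLS extensions. The durations and filenames are made up for illustration.

```python
# Toy LL-HLS media playlist: the in-progress segment is advertised piece by piece as "parts".
# Tag names follow Apple's LL-HLS extensions; durations and filenames are invented.
segment_duration = 4.0   # seconds per full segment
part_duration = 0.5      # seconds per part, available long before the full segment completes
parts_published = 5      # how many parts of the current segment exist so far

lines = [
    "#EXTM3U",
    "#EXT-X-VERSION:9",
    f"#EXT-X-TARGETDURATION:{int(segment_duration)}",
    f"#EXT-X-PART-INF:PART-TARGET={part_duration}",
]

# Parts of the in-progress segment are listed as soon as they exist...
for i in range(parts_published):
    lines.append(f'#EXT-X-PART:DURATION={part_duration},URI="seg100.part{i}.mp4"')

# ...and the player is told where the next part will appear so it can ask for it early.
lines.append(f'#EXT-X-PRELOAD-HINT:TYPE=PART,URI="seg100.part{parts_published}.mp4"')

print("\n".join(lines))
```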

Another requirement of LL-HLS is HTTP/2, an advance on HTTP that brings benefits such as multiplexing many requests over a single HTTP connection, thereby reducing overheads, and request pipelining.
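As a small illustration of that connection reuse, the sketch below fetches a playlist and a handful of parts over a single HTTP/2 connection using the httpx library (installed with pip install "httpx[http2]"). The URLs are placeholders, not a real service.

```python
# Fetch a playlist and several parts over one HTTP/2 connection with httpx.
import httpx

BASE = "https://example.com/live"

with httpx.Client(http2=True) as client:
    playlist = client.get(f"{BASE}/stream.m3u8")
    print("negotiated:", playlist.http_version)  # "HTTP/2" when the server supports it

    # These requests all reuse the same connection, avoiding a TCP/TLS handshake per part.
    for i in range(4):
        part = client.get(f"{BASE}/seg100.part{i}.mp4")
        print(part.url, len(part.content), "bytes")
```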

Roger carefully paints the whole picture and shows how this is intended to work. So while the industry is still in the midst of implementing this protocol, take some time to understand it from the source – from Apple.

Watch now!
Download the presentation

Speaker

Roger Pantos
HLS Technical Lead,
Apple

Video: Specification of Live Media Ingest

“Standardisation is more than just a player format”. There’s so much more to a streaming service than the video; a whole ecosystem needs to work together. In this talk from Comcast’s Mile High Video 2019, we see how different parts of the ecosystem are being standardised for live ingest.

RTMP and Smooth Streaming are being phased out. Without proper support for HEVC, VVC, HDR and so on, they are losing relevance as well as, in the case of RTMP, support for the format itself. Indeed, it’s clear that fragmented MP4 (fMP4) and CMAF are taking hold in their place, so it makes sense for a new ingest standard to coalesce around these formats.

Rufael Mekuria from Unified Streaming explains this effort to create a specification for live media ingest, which is happening as part of the DASH Industry Forum (DASH-IF). The work itself started at the end of 2017 with the aim of publishing in summer 2019, supporting both CMAF and DASH/HLS interfaces.

Rufael explains that CMAF ingest uses HTTP POST to move each media stream to the origin packager. The tracks are separated into video, audio, timed text, subtitle and timed metadata tracks, each transferred independently, an approach that remains compatible with future codecs. He also covers security and timed text before turning to DASH/HLS ingest, which can also carry CMAF since HLS itself is able to contain it.
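The sketch below shows the general shape of that per-track HTTP POST pattern in Python, using the requests library. The origin URL, track names and file names are hypothetical, and this is a loose illustration of the idea rather than the exact wire format the specification defines.

```python
# Loose sketch of per-track CMAF ingest over HTTP POST; endpoint, track and file
# names are hypothetical and not taken from the specification itself.
import requests

ORIGIN = "https://packager.example.com/ingest/channel1"
TRACKS = ["video_1080p", "audio_en", "subtitles_en"]

session = requests.Session()

# Each track gets its own stream of POSTs: the CMAF header (init segment) first,
# so the packager learns the codec setup, then media fragments as they are produced.
for track in TRACKS:
    with open(f"{track}_init.mp4", "rb") as f:
        session.post(f"{ORIGIN}/{track}", data=f.read(),
                     headers={"Content-Type": "application/mp4"})

with open("video_1080p_0001.m4s", "rb") as f:
    session.post(f"{ORIGIN}/video_1080p", data=f.read(),
                 headers={"Content-Type": "application/mp4"})
```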

Reference software is available along with the specification: https://dashif-documents.azurewebsites.net/Ingest/master/DASH-IF-Ingest.pdf

Watch now!
Speaker

Rufael Mekuria
Head of Research & Standardisation,
Unified Streaming

Video: WAVE (Web Application Video Ecosystem) Update

With a wide membership including Apple, Comcast, Google, Disney, Bitmovin, Akamai and many others, the WAVE interoperability effort is tackling the difficulties of web media encoding, playback and platform issues using global standards.

John Simmons from Microsoft takes us through the history of WAVE, looking at the changes in the industry since 2008 and WAVE’s involvement in them. CMAF represents an important recent technology milestone, one that is entwined with WAVE’s activity and backed by over 60 major companies.

The WAVE Content Specification is derived from the ISO/IEC standard, “Common media application format (CMAF) for segmented media”. CMAF is the container for the audio, video and other content. It’s not a protocol like DASH, HLS or RTMP; rather, it’s more like an MPEG-2 transport stream. CMAF attracts a lot of interest nowadays thanks to its ability to deliver very-low-latency streaming of less than 4 seconds, but it’s also important because it represents a standardisation of fMP4 (fragmented MP4) practices.
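Since CMAF is essentially standardised fMP4, a quick way to get a feel for it is to walk the box structure of a fragment. The Python sketch below lists the top-level boxes of an fMP4/CMAF file using only the standard library; the filename is a placeholder.

```python
# Walk the top-level boxes of an fMP4/CMAF file: typically ftyp/styp followed by
# repeating moof/mdat pairs. Standard library only; the filename is a placeholder.
import struct

def top_level_boxes(path):
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            size, box_type = struct.unpack(">I4s", header)
            name = box_type.decode("ascii", errors="replace")
            if size == 0:          # box runs to the end of the file
                yield name, size
                break
            if size == 1:          # 64-bit "largesize" follows the 8-byte header
                size = struct.unpack(">Q", f.read(8))[0]
                f.seek(size - 16, 1)
            else:
                f.seek(size - 8, 1)
            yield name, size

for box, size in top_level_boxes("fragment.cmfv"):
    print(box, size)
```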

The idea of standardising on CMAF allows media profiles to be defined which specify how to encapsulate certain codecs (AV1, HEVC etc.) into the stream. Given it’s a published specification, other vendors will be able to interoperate. Proof of the value of the WAVE project lies in the three amendments to the CMAF standard, issued by MPEG, which John mentions have come directly from WAVE’s work in validating user requirements.

Whilst defining streaming is important in terms of helping in-cloud vendors work together and allowing broadcasters to build systems more easily, it’s vital that decoder devices are on board too, and much work goes into the decoder-device side of things.

On top of dealing with encoding and distribution, WAVE also specifies HTML5 API interoperability, with the aim of defining baseline web APIs to support media web apps and creating guidelines for media web app developers.

This talk was given at the Seattle Video Tech meetup.

Watch now!
Slides from the presentation
Check out the free CTA specs

Speaker

John Simmons
Media Platform Architect,
Microsoft