Video: DASH Updates

MPEG DASH is a standardised method for encapsulating media for streaming, similar to Apple's HLS. Delivered over HTTP, and therefore TCP, MPEG DASH is a widely compatible way of streaming video and other media over the internet.

MPEG DASH is now in its 3rd edition, the first having been standardised in 2011, so this talk starts by explaining what's new in this edition as of July 2019. Furthermore, there are amendments already being worked on which will soon add more features.

Iraj Sodagar explains the upcoming Service Descriptors, which allow the server to send the player metadata describing how the publisher intended the media to be shown. Maximum and minimum latency and quality, for instance, can be specified. The talk explains how these are used and why they are useful.
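
To make that concrete, here is a minimal sketch of how a player might read the latency part of a ServiceDescription from an MPD. The element and attribute names follow the spec drafts as I understand them, and the assumption that values are in milliseconds, plus the surrounding player logic, are illustrative rather than definitive.

```typescript
interface LatencyHints {
  targetMs?: number;
  minMs?: number;
  maxMs?: number;
}

function readServiceDescription(mpdXml: string): LatencyHints {
  const doc = new DOMParser().parseFromString(mpdXml, "application/xml");
  const latency = doc.querySelector("ServiceDescription > Latency");
  if (!latency) return {}; // the publisher expressed no preference
  const num = (name: string) => {
    const v = latency.getAttribute(name);
    return v === null ? undefined : Number(v);
  };
  // Assumption: latency values are carried in milliseconds.
  return { targetMs: num("target"), minMs: num("min"), maxMs: num("max") };
}

// Usage: steer the player's buffering toward the publisher's intent.
const hints = readServiceDescription(`
  <MPD>
    <ServiceDescription id="0">
      <Latency target="3500" min="2000" max="8000"/>
    </ServiceDescription>
  </MPD>`);
console.log(hints); // { targetMs: 3500, minMs: 2000, maxMs: 8000 }
```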

Another powerful metadata feature is the Initialization Set, Group and Presentation, which gives the decoder a ‘heads up’ on what the upcoming media will require for playback. This allows the player to politely decline to play media it can’t display. For instance, if a decoder doesn’t support AV1, this can be identified before attempting a decode or even downloading a chunk.
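
As a sketch of what that ‘heads up’ enables, a web player could check codec support up front with the standard Media Source Extensions API; the Initialization Set parsing that would supply the MIME and codec strings is assumed here.

```typescript
// Check whether this device can decode the advertised format before
// requesting any media segments.
function canPlay(mimeType: string, codecs: string): boolean {
  return MediaSource.isTypeSupported(`${mimeType}; codecs="${codecs}"`);
}

// e.g. an upcoming presentation advertises AV1 video in MP4:
if (!canPlay("video/mp4", "av01.0.08M.08")) {
  // Politely decline: choose an alternative presentation (say, H.264)
  // rather than downloading a chunk only for the decoder to fail.
  console.warn("AV1 not supported here; selecting a fallback rendition");
}
```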

Iraj then explains what will be in the 4th edition, including the above, the signalling of leap seconds and much more. This should be published over the next few months.

Amendment 1 is working towards a more accurate timing model for events and a specific DASH profile for CMAF (the Common Media Application Format, a container often used for low-latency streaming), which Iraj explains in detail.
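
For context on what a ‘timing model of events’ involves, here is a small sketch of the arithmetic behind today’s MPD events, whose presentation time is expressed in ticks of the EventStream’s timescale relative to the Period start; Amendment 1 aims to pin this model down more precisely. The interfaces are illustrative, not from the spec text.

```typescript
interface MpdEvent {
  presentationTime: number; // in timescale ticks, relative to Period start
  duration: number;         // in timescale ticks
}

function eventWindowSeconds(
  ev: MpdEvent,
  timescale: number,       // EventStream@timescale
  periodStartSec: number   // Period@start resolved to seconds
): { start: number; end: number } {
  const start = periodStartSec + ev.presentationTime / timescale;
  return { start, end: start + ev.duration / timescale };
}

// An event 90000 ticks into a 90 kHz EventStream, in a Period starting
// at t=120 s, lands at t=121 s on the presentation timeline.
console.log(eventWindowSeconds({ presentationTime: 90000, duration: 180000 }, 90000, 120));
// -> { start: 121, end: 123 }
```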

Finishing off with session-based DASH operations, a look over the DASH workplan/roadmap, ad insertion, and event and timed metadata processing, this is a great, detailed look at the DASH of today and of 2020.

Watch now!
Speaker

Iraj Sodagar
Independent Consultant

Video: Towards a healthy AV1 ecosystem for UGC platforms


Twitch is an ambassador for new codecs and puts its money where its mouth is; it is one of the few live streaming platforms which streams with VP9, and not only that, it does so with cloud FPGA acceleration thanks to Xilinx’s acquisition of NGCODEC.

As such, they have a strong position on AV1. With such a tech-savvy crowd, they stream most of their videos at the highest bitrate (circa 6Mbps). With millions of concurrent viewers, they are highly motivated to reduce bandwidth where they can, and adopting new codecs is one way to do that.

Principal Research Engineer Yueshi Shen discusses Twitch’s stance on AV1 and the work they are doing to contribute to it, in order to get the best codec at the end of the process, which will help not only them but the worldwide community. He starts by giving an overview of Twitch; while many of us are familiar with the site, its scale and needs may be new information and inform the understanding of the rest of the talk.

Reduction in bitrate is a strong motivator, but so is the fact that supporting many codecs is a burden; AV1 promises the possibility of reducing the number of supported codecs/formats. Their active contribution to AV1 is also driven by ‘hand wave’ latency, a simple method of estimating the approximate latency of a link by waving at the camera and timing how long the wave takes to appear in the player, which is naturally very important to a live streaming platform. This led to Twitch submitting a proposal for SWITCH_FRAME, a technique accepted into AV1 which allows the player to change more frequently between the different quality/bitrate streams available. This results in a better experience for the user as well as reduced bitrate and buffering.
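
To illustrate why switch frames matter, here is a conceptual sketch, not Twitch’s or AV1’s actual logic: without them a player can generally only change rendition at a keyframe, so S-frames multiply the safe switching points.

```typescript
interface Frame {
  isKeyframe: boolean;
  isSwitchFrame: boolean; // an AV1 S-frame
}

// A switch to another rendition is safe at keyframes as before, and now
// also at S-frames, which are encoded so that decoding can continue
// correctly when the player has been decoding a different rendition.
function canSwitchAt(frame: Frame): boolean {
  return frame.isKeyframe || frame.isSwitchFrame;
}

function nextSwitchPoint(frames: Frame[], fromIndex: number): number {
  for (let i = fromIndex; i < frames.length; i++) {
    if (canSwitchAt(frames[i])) return i; // often sooner than the next keyframe
  }
  return -1; // no safe point in this window; keep the current rendition
}
```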

Yueshi then looks at the projected AV1 deployment roadmap and discusses when GPU/hardware support will be available. The legal aspect of AV1, which promises to be a free-to-use codec, is also discussed, along with the news that a patent pool has formed around AV1.

The talk finishes with a Q&A.

Watch now!

Speaker

Yueshi Shen
Principal (Level 7) Research Engineer & Engineering Manager,
Twitch

Video: WAVE (Web Application Video Ecosystem) Update

With a wide membership including Apple, Comcast, Google, Disney, Bitmovin, Akamai and many others, the WAVE interoperability effort is tackling web media encoding, playback and platform issues using global standards.

John Simmons from Microsoft takes us through the history of WAVE, looking at the changes in the industry since 2008 and WAVE’s involvement. CMAF represents an important recent technology milestone which is entwined with WAVE’s activity, backed by over 60 major companies.

The WAVE Content Specification is derived from the ISO/IEC standard, “Common media application format (CMAF) for segmented media”. CMAF is the container for the audio, video and other content. It’s not a protocol like DASH, HLS or RTMP; rather, it’s more like an MPEG-2 transport stream. CMAF attracts a lot of interest nowadays due to its ability to deliver very low latency streaming of less than 4 seconds, but it’s also important because it represents a standardisation of fMP4 (fragmented MP4) practices.
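
To make the fMP4 point concrete, here is a minimal sketch that walks the top-level ISO BMFF boxes of a CMAF segment; each chunk shows up as a moof/mdat pair, which is what lets media be pushed to the player before the full segment is complete. Box sizes of 0 and 1 (the “to end of file” and 64-bit cases) are left out of the sketch.

```typescript
// Walk the top-level ISO BMFF boxes of a (CMAF) fMP4 segment buffer.
function listTopLevelBoxes(buf: ArrayBuffer): string[] {
  const view = new DataView(buf);
  const boxes: string[] = [];
  let offset = 0;
  while (offset + 8 <= view.byteLength) {
    const size = view.getUint32(offset); // 32-bit box size, big-endian
    const type = String.fromCharCode(   // 4-character box type code
      view.getUint8(offset + 4), view.getUint8(offset + 5),
      view.getUint8(offset + 6), view.getUint8(offset + 7)
    );
    boxes.push(type);
    if (size < 8) break; // size 0 or 1 not handled in this sketch
    offset += size;
  }
  return boxes;
}

// For a low-latency CMAF segment, expect something like:
// ["styp", "moof", "mdat", "moof", "mdat", ...] where each moof/mdat
// pair is a chunk that can be sent as soon as it's encoded.
```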

The idea of standardising on CMAF allows media profiles to be defined which specify how to encapsulate certain codecs (AV1, HEVC etc.) in the stream. Given it’s a published specification, other vendors will be able to interoperate. Proof of the value of the WAVE project is the three amendments to the CMAF standard, mentioned by John, which MPEG has issued as a direct result of WAVE’s work in validating user requirements.

Whilst defining streaming is important in terms of helping in-cloud vendors work together and allowing broadcasters to more easily build systems, it’s vital that decoder devices are on board too, and much work goes into the decoder-device side of things.

On top of dealing with encoding and distribution, WAVE also specifies HTML5 API interoperability, with the aim of defining baseline web APIs to support media web apps and creating guidelines for media web app developers.
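
As a sketch of the kind of baseline check those guidelines target, a media web app might probe the standard MSE and EME APIs before choosing a playback path; the specific key-system configuration shown is hypothetical and purely for illustration.

```typescript
async function probePlatform(): Promise<void> {
  const hasMse = "MediaSource" in window;                    // Media Source Extensions
  const hasEme = "requestMediaKeySystemAccess" in navigator; // Encrypted Media Extensions
  console.log("MSE:", hasMse, "EME:", hasEme);

  if (hasEme) {
    try {
      // A hypothetical minimal configuration; a real app lists the key
      // systems, codecs and robustness levels it actually needs.
      await navigator.requestMediaKeySystemAccess("org.w3.clearkey", [{
        initDataTypes: ["cenc"],
        videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }],
      }]);
      console.log("Clear Key EME is available");
    } catch {
      console.log("Clear Key EME is not available");
    }
  }
}
```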

This talk was given at the Seattle Video Tech meetup.

Watch now!
Slides from the presentation
Check out the free CTA specs

Speaker

John Simmons
Media Platform Architect,
Microsoft

Video: Simplifying OTT Video Delivery With SCTE 224

Life used to be simple; you’d fire up your camera, point it at a presenter and the feed would go to the transmitter network. As the need for ongoing funding came into play, we wanted each transmitter to be able to show local ads, and so, after many years, SCTE-35 was born to do exactly that. In today’s world, however, simply telling a transmitter when to switch doesn’t cut it. To deliver the complex workflows that both linear and OTT delivery demand, SCTE 224 has arrived on the scene, providing very comprehensive scheduling and switching.

Jean Macher from Harmonic explains the need for SCTE 224 and what it delivers. For instance, a lot of SCTE 224 is devoted to controlling US-style blackouts, where viewers close to a sports game can’t watch the game live. Whilst this is relatively easy to deal with in the US for local terrestrial transmitters, in OTT this is a new ability. Traditionally, this relies on geo-location of IP addresses, where each IP address is registered against a provider; if that provider is Chinese, then at the very least you should be able to say the address is in China. However, ISPs who have an interest in the programming can bring their own data into play to provide very accurate geo-location.
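
Here is a conceptual sketch of that layered approach, with illustrative names rather than anything taken from SCTE 224: a coarse IP database as the fallback, overridden by the more accurate data an interested ISP can supply for its own address space.

```typescript
type Region = string; // e.g. a country, market area or postcode

interface GeoSource {
  lookup(ip: string): Region | undefined;
}

function resolveRegion(ip: string, ispData: GeoSource, ipDatabase: GeoSource): Region {
  // Prefer the ISP's own, fine-grained data when it covers this address...
  const precise = ispData.lookup(ip);
  if (precise !== undefined) return precise;
  // ...otherwise fall back to registration data, which may only be
  // accurate to the provider's country.
  return ipDatabase.lookup(ip) ?? "unknown";
}

function isBlackedOut(viewerRegion: Region, blackoutRegions: Set<Region>): boolean {
  return blackoutRegions.has(viewerRegion);
}
```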

SCTE 224, however, isn’t just about blackouts. It also carries accurate, multi-level schedule information which helps with scheduling complex ad breaks, providing detailed, frame-accurate local ad insertion.

It shouldn’t be thought that SCTE 35 and SCTE 224 are mutually exclusive. SCTE 35 can provide very accurate updates for unscheduled programmes and delays, while the SCTE 224 information still carries the rich metadata.

To finish up the talk, Jean looks at a specific example of the implementation and how SCTE 224 has been updated in recent years.

Watch now!

Speaker

Jean Macher
Director, Market Development – Broadcast,
Harmonic Inc.