Video: WAVE (Web Application Video Ecosystem) Update

With wide membership including Apple, Comcast, Google, Disney, Bitmovin, Akamai and many others, the WAVE interoperability effort is tackling web media encoding, playback and platform interoperability issues utilising global standards.

John Simmons from Microsoft takes us through the history of WAVE, looking at the changes in the industry since 2008 and WAVE’s involvement in them. CMAF, an important recent technology milestone, is entwined with the activity of WAVE, which is backed by over 60 major companies.

The WAVE Content Specification is derived from the ISO/IEC standard, “Common media application format (CMAF) for segmented media”. CMAF is the container for the audio, video and other content. It’s not a protocol like DASH, HLS or RTMP; rather, it’s more like an MPEG-2 transport stream. CMAF is attracting a lot of interest nowadays thanks to its ability to deliver very low-latency streaming of less than 4 seconds, but it’s also important because it represents a standardisation of fMP4 (fragmented MP4) practices.
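
As a rough illustration of what “a container, not a protocol” means in practice, the sketch below walks the top-level ISO BMFF boxes of a fragmented MP4/CMAF segment, typically a styp box followed by moof/mdat pairs. This is a deliberately minimal, illustrative parser; the file name is hypothetical.

```python
import struct

def iter_boxes(data: bytes, offset: int = 0):
    """Yield (box_type, start, size) for each top-level ISO BMFF box."""
    while offset + 8 <= len(data):
        size, box_type = struct.unpack_from(">I4s", data, offset)
        if size == 1:  # 64-bit "largesize" stored after the type field
            size = struct.unpack_from(">Q", data, offset + 8)[0]
        elif size == 0:  # box runs to the end of the file
            size = len(data) - offset
        yield box_type.decode("ascii"), offset, size
        offset += size

# A CMAF segment is fragmented MP4: typically a 'styp' box followed by
# one or more 'moof'/'mdat' pairs carrying the actual media samples.
with open("segment.cmfv", "rb") as f:  # file name is hypothetical
    for box_type, start, size in iter_boxes(f.read()):
        print(f"{box_type} @ {start} ({size} bytes)")
```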

Standardising on CMAF allows media profiles to be defined which specify how to encapsulate certain codecs (AV1, HEVC etc.) into the stream. Given it’s a published specification, vendors are able to interoperate. Proof of the WAVE project’s value lies in the three amendments to the CMAF standard, mentioned by John, which MPEG issued as a direct result of WAVE’s work in validating user requirements.

Whilst defining streaming is important in terms of helping in-cloud vendors work together and allowing broadcasters to build systems more easily, it’s vital that decoder devices are on board too, and much work goes into the decoder-device side of things.

On top of dealing with encoding and distribution, WAVE also addresses HTML5 API interoperability, with the aim of defining baseline web APIs to support media web apps and creating guidelines for media web app developers.

This talk was given at the Seattle Video Tech meetup.

Watch now!
Slides from the presentation
Check out the free CTA specs

Speaker

John Simmons
Media Platform Architect,
Microsoft

Video: Streaming Live Events: When it must be alright on the night

Live streaming is an important part not only of online viewing but, increasingly, of broadcast in general. It’s well documented that live programming is key to keeping linear broadcast’s tradition of ‘everyone watching at once’, which has been diluted, for better and worse, by non-linear viewing in recent years.

This panel, part of IBC’s Content Everywhere, looks at the drivers behind live streaming, how it’s evolving and its future. Bringing together ultra-low-latency platform nanocosmos with managed service provider M2A Media and video player specialists VisualOn, Editor of The Broadcast Knowledge, Russell Trafford-Jones, starts the conversation by asking what gamification is and how it plays into live streaming.

nanocosmos’s Oliver Lietz explains how gamification is an increasing trend, not only as a way of monetising existing content but as a genre in and of itself, providing content which is either entirely a game or has a significant interactive element. With such services, it’s clear that latency needs to be almost zero, so his company’s ability to deliver one-second latency is why it has experience in these projects.

We also hear from VisualOn’s Michael Jones, who explains the low-latency service they were involved in delivering. Here, low-latency CMAF was used in conjunction with local synced-screen technology to ensure not only that latency was low, but that second-screen devices were not showing video any earlier or later than the main screen. The panel then discusses the importance of latency compared to synchronised viewing, and where ultra-low latency is unnecessary.

Valentijn Siebrands from M2A talks about using live streaming and production in the cloud to deliver lower-cost sports events and also new types of programming. Valentijn then takes us into the topic of analytics, underlining that streaming analytics which reveal the health of your platform and infrastructure matter as much as the analytics most usually talked about: those which tell you the quality of experience your viewers are having and their activities on your app.

The talk concludes with a look to the future, talking about the key evolving technologies of the moment and how they will help us move forward between now and IBC’s Content Everywhere Hub in 2021.

Watch now!

Speakers

Oliver Lietz
CEO & Founder,
nanocosmos
Michael Jones
Former SVP and Head of Business Development,
VisualOn Inc
Valentijn Siebrands
Solutions Architect,
M2A Media
Russell Trafford-Jones – Moderator
Manager, Support & Services – Techex
Executive Member – IET Media Technical Network

Video: From WebRTC to RTMP

With the demise of RTMP, what can WebRTC, its closest equivalent, learn from it? RTC stands for Real-Time Communications and hails from the video/voice teleconferencing world. RTC traditionally has ultra-low latency (think sub-second; real-time), so as broadcasters and streaming companies look to reduce latency it’s the obvious technology to consider. However, RTC comes from a background of small meetings, mixed resolutions and mixed bandwidths, so the protocols underpinning it can lack what broadcast-style streamers need.

Nick Chadwick from Mux looks at the pros and cons of the venerable RTMP (Real Time Messaging Protocol). Which parts of it were used and which went unused? What did we need that it didn’t have? What gap is being left by its phasing out?

Filling these growing gaps is the focus of the streaming community; whether the answer comes through WebRTC, fragmented MP4 delivered over WebSockets, Low-Latency HLS, Apple’s Low-Latency HLS, SASH, CMAF or something else, the need still has to be fulfilled.

Nick finishes with two demos showing a capability of WebRTC which outstrips RTMP: live mixing in a browser. WebRTC clearly has a future for more adventurous services which don’t simply want to deliver a linear channel to sofa-dwelling humans. But surely Nick’s message is that WebRTC needs to step up to the plate for broadcasters in general, enabling them to achieve sub-second end-to-end latency in a way which is compatible with broadcast workflows.

Watch now!

Speaker

Nick Chadwick
Software Engineer,
Mux

Video: Bandwidth Prediction in Low-Latency Chunked Streaming

How can we overcome one of the last big problems in making CMAF generally available: making ABR work properly?

ABR, Adaptive Bitrate, is a technique which allows a video player to choose which bitrate of video to download from a menu of several options. Typically, the highest bitrate will have the highest quality and/or resolution, with the smallest files being low resolution.

The reason a player needs the flexibility to choose the bitrate of the video is mainly changing network conditions. If someone else on your network starts watching video, you may no longer be able to download video quickly enough to keep watching in full-quality HD and may need to switch down. If they stop, you want your player to switch up again to make the most of the bitrate available.
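
To make that concrete, here is a minimal, hypothetical rate-selection rule: pick the highest rendition that fits within a safety margin of the estimated bandwidth. Real players layer buffer-occupancy logic and switch damping on top of this; the function name, margin and bitrate ladder below are illustrative only.

```python
def select_rendition(ladder_bps, estimated_bw_bps, safety=0.8):
    """Pick the highest-bitrate rendition that fits within a safety
    margin of the estimated bandwidth; fall back to the lowest."""
    usable = estimated_bw_bps * safety
    fitting = [r for r in sorted(ladder_bps) if r <= usable]
    return fitting[-1] if fitting else min(ladder_bps)

ladder = [400_000, 1_200_000, 2_500_000, 5_000_000]  # illustrative ladder
print(select_rendition(ladder, estimated_bw_bps=3_000_000))  # -> 1200000
```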

Traditionally this is done fairly simply by measuring how long each chunk of the video takes to download. Simply put, when you download a file, it comes to you as quickly as the network can carry it, so measuring how long each video chunk takes to arrive gives you an idea of how much bandwidth is available; if it arrives very slowly, you know you are close to running out of bandwidth. But in low-latency streaming, you are receiving video as quickly as it is produced, so it’s very hard to see any difference in download times, and this breaks the ABR estimation.
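
A small sketch of why this happens, using made-up but representative numbers: with whole-segment delivery, the size-over-time estimate tracks the link speed, while with chunked low-latency delivery the segment trickles in at the encoder’s pace and the estimate collapses to the encoding bitrate.

```python
def throughput_bps(segment_bytes: int, download_seconds: float) -> float:
    """Classic ABR bandwidth estimate: bits received over time taken."""
    return segment_bytes * 8 / download_seconds

# Regular streaming: a complete 4 s, ~2 Mbps segment (1 MB) already sits on
# the server and arrives as fast as the network allows, e.g. 1 s on 8 Mbps.
print(throughput_bps(1_000_000, 1.0))  # 8000000.0 -> the real link speed

# Low-latency streaming: chunks are fetched while the segment is still being
# encoded, so the same bytes trickle in over ~4 s however fast the link is.
print(throughput_bps(1_000_000, 4.0))  # 2000000.0 -> just the encoding rate
```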

Making ABR work for low-latency is the topic covered by Ali in this talk at Mile High Video 2019, where he presents some of the findings from his recently published paper, co-authored with, among others, Bitmovin’s Christian Timmerer, which won the DASH-IF Excellence in DASH award.

He starts by explaining how players currently behave with low-latency ABR, showing how they miss out on changing to higher or lower renditions. Then he looks at the differences, on the server and in the player, between non-low-latency and low-latency streams. This lays the foundation for discussing ACTE (ABR for Chunked Transfer Encoding).

ACTE is a method of analysing bandwidth on the assumption that some chunks will be delivered as fast as the network allows and some won’t be. The trick is detecting which chunks actually reflect the network speed; Ali explains how this is done and shows the results of their evaluation.
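
The filtering idea can be sketched as follows. This is my own simplified illustration, not the paper’s exact algorithm (ACTE also builds a prediction step on top of the filtered measurements): a chunk that downloads in much less time than its media duration was plausibly network-limited, so it carries a usable bandwidth sample, while one that trickles in at roughly real time was source-limited and is discarded. The threshold and names are hypothetical.

```python
def usable_samples(chunks, duration_ratio=0.8):
    """Keep bandwidth samples only from plausibly network-limited chunks.

    A chunk arriving in much less time than its media duration was
    delivered faster than real time, so size/time reflects the network;
    one whose download time ~= its duration was throttled by the encoder
    and says nothing about the available bandwidth.
    """
    estimates = []
    for size_bytes, download_s, media_duration_s in chunks:
        if download_s < duration_ratio * media_duration_s:
            estimates.append(size_bytes * 8 / download_s)
    return estimates

chunks = [
    (250_000, 0.21, 1.0),  # arrived fast: network-limited, usable
    (250_000, 0.98, 1.0),  # trickled in at real time: source-limited, dropped
]
print(usable_samples(chunks))  # one usable sample of ~9.5 Mbps
```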

Watch now!

Speaker

Ali C. Begen
Technical Consultant and
Computer Science Professor