Video: ABR Streaming and CDN Performance

Hot on the heels of yesterday’s video about Adaptive Bitrate (ABR) streaming, we have research engineer Yuriy Reznik from Brightcove looking at the subject in detail. Yesterday’s post outlined the use of ABR, showing how it is fundamental to online streaming.

Brightcove, an online video hosting platform with its own video player, has a lot of experience of delivering video over CDNs. Yesterday we saw the principles that the player, and to an extent the server, can use to deal with changing network conditions (and, to an extent, changing client CPU load) by moving up and down the ABR ladder. This talk, however, focuses on how the CDN in the middle complicates matters as it tries its best to get the right chunks to the right place at the right time.

How often are there ‘cache misses’ where the right file isn’t already in place? And how can you predict what’s necessary?
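To get a feel for why this matters, consider a toy model (not from the talk): an edge cache with limited space serving segment requests across an ABR ladder. The more renditions you publish, the more thinly the requests are spread and the lower the hit ratio. The sketch below, with a made-up cache size and a uniform popularity model, is purely illustrative of that effect.

```python
import random
from collections import OrderedDict

# Illustrative only: a toy LRU edge cache serving segment requests.
# Cache size, catalogue size and the popularity model are assumptions,
# not figures from the talk.

def hit_ratio(num_renditions, cache_capacity=200, num_segments=500, requests=20000):
    cache = OrderedDict()               # segment key -> True, ordered by recency
    hits = 0
    for _ in range(requests):
        # Each request picks a segment and one rendition of it at random.
        seg = random.randint(0, num_segments - 1)
        rung = random.randint(0, num_renditions - 1)
        key = (seg, rung)
        if key in cache:
            hits += 1
            cache.move_to_end(key)      # refresh recency
        else:
            cache[key] = True
            if len(cache) > cache_capacity:
                cache.popitem(last=False)   # evict least recently used
    return hits / requests

for n in (1, 3, 6):
    print(f"{n} rendition(s): hit ratio ~ {hit_ratio(n):.2%}")
```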

Yuriy even goes into detail about how to work out when HEVC deployment makes sense for you. After all, even if you do deploy HEVC, do you need to do it for all assets? And if you only deploy it for some assets, how do you know which ones? Also, when does it make sense to deploy CMAF? In this talk, we hear the answers.
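Yuriy presents his own analysis in the talk; purely as a flavour of the kind of back-of-envelope arithmetic involved, here is a hypothetical sketch weighing CDN savings from an HEVC rendition against the extra encoding and storage cost for one asset. Every figure in it (bitrate saving, costs, share of HEVC-capable devices) is an assumption for illustration, not a number from the talk.

```python
# Hypothetical back-of-envelope calculation, not the model from the talk:
# does adding an HEVC rendition pay for itself in CDN savings for this asset?
# All figures below are made-up assumptions.

def hevc_worth_it(views, avg_minutes_watched, avc_mbps=4.0,
                  hevc_saving=0.35, hevc_device_share=0.6,
                  cdn_cost_per_gb=0.02, extra_encode_storage_cost=40.0):
    # GB that would be delivered to HEVC-capable devices if we stayed on AVC
    gb_avc = views * hevc_device_share * avg_minutes_watched * 60 * avc_mbps / 8 / 1000
    cdn_saving = gb_avc * hevc_saving * cdn_cost_per_gb
    return cdn_saving, cdn_saving > extra_encode_storage_cost

saving, worth_it = hevc_worth_it(views=50_000, avg_minutes_watched=10)
print(f"Estimated CDN saving: ${saving:.2f}  ->  deploy HEVC? {worth_it}")
```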

The slides for this talk

Watch the video now!

Speaker

Yuriy Reznik
VP, Research
Brightcove

Video: Adaptive Bitrate Algorithms: How They Work

Streaming on the net relies on delivering video at a bitrate your connection can handle. The technique that makes this possible, ‘Adaptive Bitrate’ or ABR, is so fundamental that it’s hardly possible to think of streaming without it. While the idea might seem simple initially – just send several versions of your video – it quickly gets nuanced.

Streaming experts Streamroot take us through how ABR works in this talk from Streaming Media East 2016. While the talk is a few years old, the fundamentals remain the same, so this is still a useful talk which not only introduces the topic but also goes into detail on how to implement ABR.

The most common streaming format is HLS, which relies on the player downloading the video in sections – small files – each representing around 3 to 10 seconds of video. For HLS and similar technologies, the idea is simply to allow the player, when it’s time to download the next part of the video, to choose from a selection of files each containing the same video content but at a different bitrate.
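At its simplest, the player’s decision at each segment boundary looks something like the sketch below: a throughput-based rule over a made-up bitrate ladder. This is the naive starting point rather than Streamroot’s production algorithm, but it shows the shape of the problem the talk builds on.

```python
# A minimal throughput-based ABR rule over a made-up bitrate ladder.
# This is the naive starting point, not Streamroot's production algorithm.

LADDER_KBPS = [400, 800, 1500, 3000, 6000]   # hypothetical renditions

def choose_rendition(measured_throughput_kbps, safety_factor=0.8):
    """Pick the highest rendition that fits comfortably within measured bandwidth."""
    budget = measured_throughput_kbps * safety_factor
    candidates = [b for b in LADDER_KBPS if b <= budget]
    return max(candidates) if candidates else min(LADDER_KBPS)

# e.g. the last segment downloaded at ~2.2 Mbit/s:
print(choose_rendition(2200))   # -> 1500
```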

Allowing a player to choose which chunk it downloads means it can adapt to changing network conditions, but it does imply that each file has to contain exactly the same frames of video, otherwise there would be a jump when the next file is played. So we have met our first complication. Furthermore, each encoded stream needs to be segmented in the same way, and in MPEG, where you can only cut files on I-frame boundaries, that means the encoders need to synchronise their GOP structures, giving us our second complication.
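As a quick illustration of that alignment requirement, here is a small sketch (with hypothetical playlist contents) that checks whether two renditions’ HLS media playlists declare the same segment boundaries by comparing their cumulative #EXTINF durations.

```python
# Sketch: verify two renditions' HLS media playlists declare identical segment
# boundaries. A real check would fetch the playlists over HTTP and tolerate
# small rounding differences; the inputs here are assumed to be playlist text.

def segment_boundaries(playlist_text):
    """Return cumulative segment start times parsed from #EXTINF lines."""
    t, starts = 0.0, []
    for line in playlist_text.splitlines():
        if line.startswith("#EXTINF:"):
            starts.append(t)
            t += float(line.split(":", 1)[1].split(",")[0])
    return starts

def renditions_aligned(playlist_a, playlist_b, tolerance=0.001):
    a, b = segment_boundaries(playlist_a), segment_boundaries(playlist_b)
    return len(a) == len(b) and all(abs(x - y) <= tolerance for x, y in zip(a, b))
```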

These difficulties, many more besides, and Streamroot’s solutions are presented by Erica Beavers and Nikolay Rodionov, including experiments and proofs of concept they have carried out to demonstrate their efficacy.

Watch now!

Speakers

Erica Beavers
Head of Marketing & Partnerships,
Streamroot
Nikolay Rodionov
Co-Founder, CPO
Streamroot

Video: Multicast ABR

Multicast ABR is a mix of two very beneficial technologies which are seldom seen together. ABR – Adaptive Bitrate – allows a player to change the bitrate of the video and audio that it’s playing to adapt to changing network conditions. Multicast is a network technology which delivers a single copy of a video stream to many receivers at once rather than duplicating it for each viewer.

ABR has traditionally been deployed for chunk-based video like HLS, where each client downloads its own copy of the video in blocks several seconds in length. This means that the bandwidth you use to distribute your video increases a thousandfold if 1,000 people play it.

Multicast works with live streams, not chunks, but allows the bandwidth use for 1000 players to increase – in the best case – by 0%.
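As a rough illustration with made-up numbers, compare the distribution bandwidth needed for 1,000 viewers of a 5 Mbit/s stream under unicast ABR and under multicast carrying a four-rung ladder:

```python
# Illustrative arithmetic only: distribution bandwidth for 1,000 viewers of a
# 5 Mbit/s stream, unicast vs multicast. All figures are made up for the example.

viewers = 1000
bitrate_mbps = 5
renditions = 4    # a multicast ABR ladder still carries each rendition once

unicast = viewers * bitrate_mbps           # every viewer receives their own copy
multicast = renditions * bitrate_mbps      # one copy per rendition, however many watch

print(f"Unicast:   {unicast:>6} Mbit/s")   # 5000 Mbit/s
print(f"Multicast: {multicast:>6} Mbit/s") # 20 Mbit/s
```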

Here, the panelists look at the benefits of combining multicast distribution of live video with techniques to allow it to change bitrate between different quality streams.

This type of live streaming is actually backwards compatible with old-style STBs. Since the video sent is a live transport stream, it’s possible to deliver it to a legacy STB using a converter in the home at the same time as delivering a better, more modern stream to other TVs and devices.

It thus allows pure-streaming providers to compete with conventional broadcast cable providers and can result in cost savings both in the equipment provided and in the bandwidth used.

There’s lots to unpack here, which is why the Streaming Video Alliance have put together this panel of experts.

Watch now and find out more!

Speakers

Phillipe Carol
Senior Product Manager,
Anevia
Neil Geary
Technical Strategy Consultant,
Liberty Global
Brian Stevenson
VP of Ecosystem Strategy & Partnerships,
Ericsson
Mark Fisher
VP of Marketing & Business Development,
Qwilt
Jason Thibeault
Executive Director,
Streaming Video Alliance

Video: Using CMAF to Cut Costs, Simplify Workflows & Reduce Latency

There are two ways to stream video online: either pushing from the server to the device, as WebRTC, MPEG transport streams and similar technologies do, or allowing the receiving device to request chunks of the stream, which is how the majority of internet streaming is done – using HLS and similar formats.

Chunk-based streaming is generally seen as the more scalable of these two methods, but it suffers extra latency due to buffering several chunks, each of which can represent between 1 and, typically, 10 seconds of video.

CMAF is one technology here to change that by allowing players to buffer less video. How does it achieve this? And, perhaps more importantly, can it really cut costs? Iraj Sodagar from NexTreams is here to explain how in this talk from Streaming Media West 2018.
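Before diving in, a rough sense of the numbers helps. With whole-segment delivery a player typically buffers a few full segments before starting playback; with CMAF chunked encoding and chunked transfer it can start decoding much smaller chunks as they arrive. The arithmetic below uses hypothetical durations purely for illustration, not figures from the talk.

```python
# Rough, illustrative latency arithmetic (all values are assumptions).
# Whole-segment delivery: the player buffers a few full segments.
# CMAF chunked transfer: the player can work with sub-segment chunks.

segment_duration = 6.0     # seconds per segment (hypothetical)
segments_buffered = 3      # a common player default

chunk_duration = 0.5       # seconds per CMAF chunk (hypothetical)
chunks_buffered = 3

print(f"Segment-based latency floor: ~{segment_duration * segments_buffered:.0f} s")  # ~18 s
print(f"CMAF chunked latency floor:  ~{chunk_duration * chunks_buffered:.1f} s")      # ~1.5 s
```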

Iraj covers:

  • A brief history of CMAF (Common Media Application Format)
  • The core technologies (ISO BMFF, Codecs, captions etc.)
  • Media Data Object (Chunks, Fragments, Segments)
  • Different ways of video delivery
  • Switching Sets (for ABR)
  • Content Protection
  • CTA WAVE project
  • WAVE content specifications
  • Live Linear Content with WAVE & CMAF
  • Low-latency CMAF usage
  • HTTP 1.1 Chunked Transfer Encoding
  • MPEG DASH

Watch now!

Speaker

Iraj Sodagar
Independent Consultant
Multimedia System Architect, NexTreams