MPEG-DASH is a standardised, widely supported protocol for streaming over networks – but how can you spot problems and tell whether you, or another vendor, have implemented it correctly?
This webinar, run by HbbTV – an initiative aimed at merging over-the-air broadcast with broadband delivery (which includes both file-download and streaming) – sets out to explain how you can test your DASH streaming using new tools now available. For instance, HbbTV and DVB have collaborated on a DASH validation tool which checks MPDs, segments and more to be sure that a stream is compliant with both DVB and HbbTV specifications.
Bringing together the experience of Bob Campbell from Eurofins, Waqar Zia from Nomor Research and Juha Joki from Sofia Digital, this webinar will benefit anyone who develops for, or provides services based on, DASH.
Streaming on the net relies on delivering video at a bitrate your connection can handle. Known as ‘Adaptive Bitrate’ or ABR, it’s hard to imagine streaming without it. While the idea might seem simple at first – just send several versions of your video – it quickly becomes nuanced.
Streaming experts Streamroot take us through how ABR works in this talk from Streaming Media East 2016. While the talk is a few years old, the fundamentals haven’t changed, so it remains a useful session which not only introduces the topic but goes into detail on how to implement ABR.
The most common streaming format is HLS, which relies on the player downloading the video in sections – small files, each representing around 3 to 10 seconds of video. For HLS and similar technologies, the idea is simply that when it’s time to download the next part of the video, the player chooses from a selection of files, each with the same video content but each at a different bitrate.
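As an illustration, an HLS multivariant (master) playlist lists the same content at several bitrates for the player to choose between – the filenames, bandwidths and resolutions below are invented for this example:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
video_360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
video_720p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
video_1080p.m3u8
```

Each of the three sub-playlists then lists the individual segment files for that rendition.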
Allowing a player to choose which chunk it downloads means it can adapt to changing network conditions, but it does imply that each file has to contain exactly the same frames of video, otherwise there would be a jump when the next file is played. So we have met our first complication. Furthermore, each encoded stream needs to be segmented in the same way, and in MPEG, where you can only cut files on I-frame boundaries, this means the encoders need to synchronise their GOP structure – giving us our second complication.
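The chunk-selection idea can be sketched in a few lines of Python. This is a deliberately simplified illustration, not Streamroot’s algorithm; the bitrate ladder and safety margin are invented for the example:

```python
# Simplified ABR rendition choice: pick the highest bitrate that fits
# within the measured throughput, with a safety margin to absorb jitter.
# The bitrate ladder (bits per second) is invented for this example.
LADDER_BPS = [800_000, 2_500_000, 5_000_000]

def choose_bitrate(measured_throughput_bps, ladder=LADDER_BPS, margin=0.8):
    """Return the highest rendition bitrate <= margin * throughput,
    falling back to the lowest rendition if none fits."""
    budget = measured_throughput_bps * margin
    candidates = [b for b in ladder if b <= budget]
    return max(candidates) if candidates else min(ladder)

# A 4 Mbit/s connection with a 0.8 margin leaves a 3.2 Mbit/s budget,
# so the 2.5 Mbit/s rendition is chosen.
print(choose_bitrate(4_000_000))
```

Real players refine this with buffer occupancy, throughput smoothing and abandonment logic, but the core decision is this simple comparison made before every segment request.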
These difficulties, many more, and Streamroot’s solutions are presented by Erica Beavers and Nikolay Rodionov, including experiments and proofs of concept they have carried out to demonstrate their efficacy.
There are two ways to stream video online: either pushing from the server to the device, as with WebRTC, MPEG transport streams and similar technologies, or allowing the receiving device to request chunks of the stream, which is how the majority of internet streaming is done – using HLS and similar formats.
Chunk-based streaming is generally seen as the more scalable of the two methods, but it suffers extra latency because the player buffers several chunks, each of which can represent between 1 and, typically, 10 seconds of video.
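To put rough numbers on that buffering cost: players commonly hold a few whole segments before starting playback. The three-segment buffer depth used here is a common rule of thumb, not a figure from the talk:

```python
# Rough estimate of how far behind live a segment-based player sits:
# segment duration multiplied by the number of segments it buffers
# before (and during) playback.
def buffered_latency_s(segment_duration_s, segments_buffered=3):
    return segment_duration_s * segments_buffered

print(buffered_latency_s(6))  # 6-second segments, 3 buffered: 18 seconds
print(buffered_latency_s(2))  # shorter segments cut latency, at the cost of more requests
```

This is why shrinking the unit of delivery below the segment – as CMAF chunked encoding does – is the lever for reducing latency without multiplying the request count.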
CMAF is one technology aiming to change that by allowing players to buffer less video. How does it achieve this? And, perhaps more importantly, can it really cut costs? Iraj Sodagar from NexTreams is here to explain how in this talk from Streaming Media West 2018, covering:
- A brief history of CMAF (Common Media Application Format)
- The core technologies (ISO BMFF, codecs, captions etc.)
Alex Zambelli from Hulu presents SCTE-35 at the Seattle Video Tech Meetup.
Alex looks at what SCTE and SCTE-35 are and introduces ad insertion. With the foundation in place, he then looks through the message structures to show the commands and descriptors possible.
Finishing off with SCTE-35 signalling in MPEG-DASH and HLS, Alex covers the topic admirably for live streaming!
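To give a flavour of that HLS signalling, RFC 8216 defines an EXT-X-DATERANGE tag whose SCTE35-OUT and SCTE35-IN attributes carry the raw splice information as hex. The ID, date, duration and (truncated) payload below are invented placeholders:

```
#EXT-X-DATERANGE:ID="splice-1",START-DATE="2019-05-01T12:00:00Z",PLANNED-DURATION=30.0,SCTE35-OUT=0xFC30...
```

A matching tag with an SCTE35-IN attribute then marks the return from the ad break to programme content.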