MPEG DASH is a standardised method for encapsulating media for streaming, similar to Apple’s HLS. Delivered over HTTP (and therefore TCP), MPEG DASH is a widely compatible way of streaming video and other media over the internet.
MPEG DASH is now in its 3rd edition, the first edition having been standardised in 2011. So this talk starts by explaining what’s new in this edition as of July 2019. Furthermore, amendments are already being worked on which will soon add more features.
Iraj Sodagar explains the upcoming Service Descriptors, which allow the server to send the player metadata describing how the publisher intended the media to be presented. Maximum and minimum latency and quality, for instance, can be specified. The talk explains how these are used and why they are useful.
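As a sketch of what such a descriptor might look like in an MPD, the fragment below follows the element and attribute names used in the DASH service-description work (latency in milliseconds, playback-rate bounds); the values themselves are purely illustrative:

```xml
<ServiceDescription id="0">
  <!-- The publisher's intended live latency, in milliseconds -->
  <Latency min="2000" target="3500" max="8000"/>
  <!-- Playback-rate bounds the player may use to catch up or slow down -->
  <PlaybackRate min="0.96" max="1.04"/>
</ServiceDescription>
```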
Another powerful metadata feature is the Initialization Set, Group and Presentation, which gives the decoder a ‘heads up’ on what the upcoming media will need for playback. This allows the player to politely decline to play media it can’t display. For instance, if a decoder doesn’t support AV1, this can be identified before attempting a decode or even downloading a chunk.
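A hypothetical Initialization Set might look like the following; the attribute names follow the MPD conventions for codec and resolution signalling, but the values here are illustrative:

```xml
<!-- Advertises up-front that this content needs an AV1 decoder capable
     of 1080p, so a player without AV1 support can decline before
     downloading any media segments -->
<InitializationSet id="0" contentType="video"
                   codecs="av01.0.08M.08"
                   maxWidth="1920" maxHeight="1080"/>
```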
Iraj then explains what will be in the 4th edition, including the above, the signalling of leap seconds and much more. This should be published over the next few months.
Amendment 1 is working towards a more accurate timing model for events and a specific DASH profile for CMAF (the low-latency-capable media format based on fragmented MP4), which Iraj explains in detail.
Finishing off with session-based DASH operations, a look over the DASH workplan/roadmap, ad insertion, and event and timed-metadata processing, this is a great, detailed look at the DASH of today and of 2020.
How can we overcome one of the last big problems in making CMAF generally available: making ABR work properly?
ABR, or Adaptive Bitrate, is a technique which allows a video player to choose which bitrate of video to download from a menu of several options. Typically the highest bitrate has the highest quality and/or resolution, with the smallest files being low resolution.
The reason a player needs the flexibility to choose the bitrate of the video is mainly changing network conditions. If someone else on your network starts watching video, you may no longer be able to download video quickly enough to keep watching in full-quality HD and may need to switch down. If they stop, you want your player to switch back up to make the most of the bandwidth available.
Traditionally this is done fairly simply by measuring how long each chunk of the video takes to download. Simply put, if you download a file, it will come to you as quickly as the network allows. So measuring how long each video chunk takes to arrive gives you an idea of how much bandwidth is available; if it arrives slowly, you know you are close to running out of bandwidth. But in low-latency streaming, you are receiving video as quickly as it is produced, so it’s very hard to see any difference in download times, and this breaks the ABR estimation.
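The classic estimate and its failure mode can be sketched in a few lines; the segment sizes and timings here are made-up numbers chosen only to show the effect:

```python
def estimate_throughput_kbps(segment_bytes: int, download_seconds: float) -> float:
    """Classic ABR estimate: segment size divided by download time."""
    return segment_bytes * 8 / download_seconds / 1000

# Normal streaming: a complete 4-second, 2 MB segment arriving in 1 s
# implies ~16 Mbps of available bandwidth, so the player can switch up.
normal = estimate_throughput_kbps(2_000_000, 1.0)  # 16000 kbps

# Low-latency streaming: the same segment trickles in as it is encoded,
# taking ~4 s regardless of network capacity, so the estimate collapses
# to roughly the encoding bitrate and tells us nothing about headroom.
low_latency = estimate_throughput_kbps(2_000_000, 4.0)  # 4000 kbps
```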
He starts by explaining how players currently behave with low-latency ABR, showing how they miss out on switching to higher/lower renditions. Then he looks at the differences, on the server and in the player, between non-low-latency and low-latency streams. This lays the foundation to discuss ACTE – ABR for Chunked Transfer Encoding.
ACTE is a method of analysing bandwidth on the assumption that some chunks will be delivered as fast as the network allows and some won’t be. The trick is detecting which chunks actually reflect the network speed; Ali explains how this is done and shows the results of their evaluation.
From event to event it’s no surprise that streaming traffic increases, but this look at the World Cup 2018 shows a very sharp rise, beating many expectations. Joachim Hengge tells us what the World Cup looked like from Akamai’s perspective.
Joachim takes us through the stats for streaming the World Cup, where they peaked at 23Tbps of throughput with nearly 10 million concurrent viewers. The bandwidth was significantly higher than at the last World Cup but, looking at the data, we can learn a few more things about the market.
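As a back-of-envelope check on those headline figures, dividing peak throughput by concurrent viewers gives a plausible average per-stream bitrate; the figures are the rounded ones quoted above, so treat the result as indicative only:

```python
# Rough per-viewer arithmetic from the quoted peak figures
peak_throughput_bps = 23e12   # ~23 Tbps peak delivery
concurrent_viewers = 10e6     # ~10 million concurrent streams

avg_bitrate_mbps = peak_throughput_bps / concurrent_viewers / 1e6
# ~2.3 Mbps per viewer: consistent with a mid-tier ABR rendition
```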
After a match-by-match breakdown, we look at a system architecture for one customer who delivered the World Cup, highlighting the importance of stable content ingest, latency and broadcast quality. Encoding and packaging into HLS with 4-second chunks were done on site, with the rest happening within Akamai and being fed to other CDNs. Joachim pulls this together into three key recommendations for anyone looking at streaming large events, before delving into some Sweden-specific streaming stats, where over 81% of feeds were played back at the highest quality.
This talk is from Streaming Tech Sweden, an annual conference run by Eyevinn Technology. Videos from the event are available to paid attendees but are released free of charge after several months. As with all videos on The Broadcast Knowledge, this is available free of charge after registering on the site.
Many companies would love to be using free codecs, unencumbered by patents, rather than paying for HEVC or AVC. Phil Cluff shows that, contrary to popular belief, it is possible to stream with free codecs and get good coverage on mobile and desktop.
Phil starts off by looking at the codecs available and whether they’re patent encumbered, with an eye to how much of the market can actually decode them. Free codecs and containers like WebM, VP8 etc. are not supported by Safari, which cuts mobile penetration roughly in half. To prove the point, Phil presents the results of his trials using HEVC, AVC and VP8 in all major browsers.
Whilst this initially leaves a disappointing result for streaming with libre codecs on mobile, there is a solution! Phil explains how an idea from several years ago is being reworked to provide a free streaming protocol, MPEG-SASH, which avoids DASH since DASH is itself based on the patent-encumbered ISO BMFF. He then explains how open video players like video.js can be modified to decode libre codecs.
With these two enhancements, we finally see that coverage of up to 80% on mobile is, in principle, possible.