MPEG DASH is a standardised, widely supported protocol for networked streaming – but how can you spot problems and tell whether you, or another vendor, have implemented it correctly?
This webinar, run by HbbTV – an initiative aimed at merging over-the-air broadcast with broadband delivery (which includes both file download and streaming) – sets out to explain how you can test your DASH streams using newly available tools. For instance, HbbTV and DVB have collaborated on a DASH validation tool which checks MPDs, segments and more to ensure a stream complies with both the DVB and HbbTV specifications.
Bringing together the experience of Bob Campbell from Eurofins, Waqar Zia from Nomor Research and Juha Joki from Sofia Digital, this webinar will benefit anyone who develops for, or provides services based on, DASH.
Streaming on the internet relies on delivering video at a bitrate the viewer’s connection can handle. Called ‘Adaptive Bitrate’ or ABR, it’s hard to imagine streaming without it. While the idea might seem simple at first – just send several versions of your video – it quickly gets nuanced.
Streaming experts Streamroot take us through how ABR works in this talk from Streaming Media East 2016. While the talk is a few years old, the fundamentals haven’t changed, so it remains a useful session which not only introduces the topic but goes into detail on how to implement ABR.
The most common streaming format is HLS, which relies on the player downloading the video in sections – small files – each representing around 3 to 10 seconds of video. For HLS and similar technologies, the idea is simply that when it’s time to download the next part of the video, the player can choose from a selection of files, each with the same video content but at a different bitrate.
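To make this concrete, here is a minimal, illustrative HLS master playlist – the filenames, bitrates and resolutions are hypothetical, not from the talk. The player reads this list and picks one of the variant streams, each carrying the same content at a different bitrate:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
video_360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
video_720p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
video_1080p.m3u8
```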
Allowing a player to choose which chunk it downloads means it can adapt to changing network conditions, but it does imply that each file has to contain exactly the same frames of video, else there would be a jump when the next file is played. So we have met our first complication. Furthermore, each encoded stream needs to be segmented in the same way, and in MPEG, where you can only cut files on I-frame boundaries, this means the encoders need to synchronise their GOP structure – our second complication.
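The rendition choice itself can be sketched simply. Below is a minimal, hypothetical throughput-based selector in Python – the bitrate ladder and safety factor are illustrative, not from the talk. The player picks the highest rendition whose bitrate fits under the measured throughput, scaled down by a safety margin to absorb variation:

```python
# Hypothetical bitrate ladder, in bits per second (illustrative values).
LADDER = [800_000, 2_500_000, 5_000_000]

def select_rendition(measured_throughput_bps, safety=0.8):
    """Pick the highest bitrate that fits within a safety margin
    of the measured network throughput."""
    budget = measured_throughput_bps * safety
    # Fall back to the lowest rendition if nothing else fits.
    chosen = LADDER[0]
    for bitrate in LADDER:
        if bitrate <= budget:
            chosen = bitrate
    return chosen
```

For example, with 4 Mbps of measured throughput the budget is 3.2 Mbps, so the 2.5 Mbps rendition is chosen rather than risking the 5 Mbps one.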
These difficulties, many more and Streamroot’s solutions are presented by Erica Beavers and Nikolay Rodionov including experiments and proofs of concept they have carried out to demonstrate the efficacy.
There are two ways to stream video online: either pushing from the server to the device, as with WebRTC, MPEG transport streams and similar technologies, or allowing the receiving device to request chunks of the stream, which is how the majority of internet streaming is done – using HLS and similar formats.
Chunk-based streaming is generally seen as the more scalable of the two methods, but it suffers extra latency due to buffering several chunks, each of which can represent between 1 and, typically, 10 seconds of video.
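The buffering cost is easy to quantify. A hypothetical sketch – buffer depth and durations are illustrative only: with whole-segment delivery the player waits for several full segments before playback, so buffer-induced latency scales with segment duration; with smaller delivery units it scales with the much shorter unit duration.

```python
def min_buffer_latency(unit_duration_s, units_buffered):
    """Minimum latency contributed by the player buffer alone:
    the player waits for this much media before starting playback."""
    return unit_duration_s * units_buffered

# Three whole 6-second segments buffered: 18 s of buffer-induced latency.
segment_based = min_buffer_latency(6.0, 3)

# The same 3-deep buffer over 0.5-second units: 1.5 s.
chunk_based = min_buffer_latency(0.5, 3)
```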
CMAF is one technology here to change that by allowing players to buffer less video. How does it achieve this? And, perhaps more importantly, can it really cut costs? Iraj Sodagar from NexTreams is here to explain in this talk from Streaming Media West 2018.
A brief history of CMAF (Common Media Application Format)
The core technologies (ISO BMFF, Codecs, captions etc.)
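CMAF media is built on the ISO BMFF box structure mentioned above: every box starts with a 4-byte big-endian size (which includes the 8-byte header) followed by a 4-byte type code. A minimal, hypothetical Python walker over top-level boxes – the sample data is synthetic, and 64-bit sizes and nested boxes are deliberately ignored:

```python
import struct

def iter_boxes(data):
    """Yield (box_type, payload) for each top-level ISO BMFF box.
    Simplified: assumes 32-bit sizes and no nesting."""
    offset = 0
    while offset + 8 <= len(data):
        size, box_type = struct.unpack_from('>I4s', data, offset)
        yield box_type.decode('ascii'), data[offset + 8:offset + size]
        offset += size

# A synthetic stream: a 'moof' box followed by an 'mdat' box,
# as a CMAF chunk would carry them.
sample = (struct.pack('>I4s', 16, b'moof') + b'\x00' * 8 +
          struct.pack('>I4s', 12, b'mdat') + b'\x00' * 4)
```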
Alex Zambelli from Hulu presents SCTE-35 at the Seattle Video Tech Meetup.
Alex looks at what SCTE and SCTE-35 are and introduces ad insertion. With the foundation in place, he then looks through the message structures to show the commands and descriptors possible.
Finishing off with SCTE-35 signalling in MPEG-DASH and HLS, Alex covers the topic admirably for live streaming!
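To give a flavour of the message structure Alex walks through, here is a minimal Python sketch of the fixed-length fields at the start of a splice_info_section, with field widths as laid out in the SCTE-35 specification. The sample payload is synthetic, not a real cue, and several fields (sap_type, encryption_algorithm, cw_index) are skipped for brevity:

```python
def parse_splice_info_header(data):
    """Parse selected fixed fields at the start of an SCTE-35
    splice_info_section, up to and including splice_command_type."""
    return {
        'table_id': data[0],                              # always 0xFC
        'section_length': ((data[1] & 0x0F) << 8) | data[2],
        'protocol_version': data[3],
        'encrypted_packet': bool(data[4] & 0x80),
        # 33-bit pts_adjustment: low bit of byte 4 plus bytes 5-8.
        'pts_adjustment': ((data[4] & 0x01) << 32)
                          | int.from_bytes(data[5:9], 'big'),
        'tier': (data[10] << 4) | (data[11] >> 4),
        'splice_command_length': ((data[11] & 0x0F) << 8) | data[12],
        'splice_command_type': data[13],                  # 0x05 = splice_insert
    }

# A synthetic header whose splice_command_type is 0x05 (splice_insert).
sample = bytes([0xFC, 0x30, 0x14, 0x00, 0x00, 0x00, 0x00, 0x00,
                0x00, 0xFF, 0xFF, 0xF0, 0x05, 0x05])
```

In HLS and DASH these messages usually travel base64-encoded inside playlist tags or MPD events, so a real parser would decode that layer first.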
Nobody wants to find out about a big play or major news event on Twitter before they see it in their video stream, so reducing latency is crucial for OTT services’ success. Likewise, ultra-low latency is crucial for interactive streaming applications. Depending on your use case, a few seconds of latency might be fine, or you might need to try to hit that sub-second target.
Learn which technologies and solutions are best for your business, and make sure your viewers get their video on time, every time. In this webinar, you’ll learn the following:
Why it’s important to evaluate and improve latency end-to-end, including software and services, encoder, platform, and player
How to decide which technology and solution is best for your use case (e.g. CMAF, HLS/DASH, WebRTC, WebSocket)
How chunked CMAF offers a standards-based approach that allows latency to be decoupled from segment duration
How chunked CMAF leverages existing CDN HTTP capacity to provide low-latency solutions at high scale
How WebRTC can be used to deliver live video sub-second latency at scale, and provide rich, interactive experiences for live streaming applications
How a single misconfigured component can undo any other effort to achieve low latency
How integrated solutions create new business opportunities for low latency interactive use cases
How to achieve low latency across all platforms and devices
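The end-to-end point in the list above can be illustrated with a simple budget: glass-to-glass latency is roughly the sum of each component’s contribution, so a single misconfigured stage dominates everything else. A hypothetical sketch – the figures are illustrative only, not from the webinar:

```python
# Illustrative per-component latency contributions, in seconds.
pipeline = {
    'encoder': 0.5,
    'packager': 0.5,
    'cdn_delivery': 0.5,
    'player_buffer': 1.0,
}

def total_latency(components):
    """End-to-end latency is (roughly) additive across the chain."""
    return sum(components.values())

well_tuned = total_latency(pipeline)

# One misconfigured component - say a 30-second player buffer -
# undoes the low-latency work done everywhere else in the chain.
misconfigured = total_latency({**pipeline, 'player_buffer': 30.0})
```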
VP of Product Strategy,
Moderator: Eric Schumacher-Rasmussen
From VideoLAN’s Video Dev Days 2018, Romain Bouqueau from GPAC and FireKast discusses MPEG DASH and how it helps us with low-latency streaming. Romain starts by looking at latency in the adaptive streaming workflow and then covers four false assumptions in streaming. We are walked through all the different points at which latency can occur and see how to reduce each before looking at the drawbacks.
Viewers are increasingly watching live sports and other realtime events online, but the inherent delay in traditional online streaming often means they learn about an important play from social media before they are able to see it. Delivering true realtime global online streaming requires a new approach.
Video delivery experts from Limelight Networks present ways to deliver broadcast quality low-latency live streams, including the ability for viewers to watch live video with less than one second of latency on standard web browsers—without special plug-ins. Also discussed is how to integrate live video and interactive data to open up new workflows in sports, gaming, auctions, and more and make live viewing a more interactive social experience.
Join this webinar for:
• Latest market data on evolving viewer expectations for online video
• Delivering low-latency live video with HLS and DASH chunked streaming techniques
• Realtime global live streaming using WebRTC
• How integrated live bidirectional data sharing can help open up new business opportunities
• and much more!
Sr. Manager Product Marketing, Limelight Networks
A great discussion from Streaming Media East on the battle to achieve low-latency live video, with speakers from BAMTECH, Limelight and Red5Pro. In this session, learn about the pros and cons of various technologies on both the contribution and delivery side of low-latency streaming, including small chunk size HLS/DASH, WebRTC, WebSockets, QUIC, SRT, and CMAF:
What does ‘Low Latency’ mean? Realtime? Are Cable & TV low-latency?
How do you synchronise OTT with data and TV?
Where is latency introduced? Which buffers have the biggest impact?
How can you fight rebuffering and which metric is the most useful?
Date: 13th June 2018, 14:00 BST
DVB-DASH, for the delivery of TV content via HTTP adaptive streaming, provides a profile of features defined in the MPEG DASH specification. The latest revision of DVB-DASH, published by ETSI in March 2018, adds features related to UHD.
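In practice a stream declares which profile it follows in the MPD itself. A minimal, illustrative fragment – the profile URNs are as defined in the DVB-DASH and MPEG DASH specifications, while the other attribute values are placeholders:

```
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011"
     type="static"
     minBufferTime="PT2S"
     mediaPresentationDuration="PT30S"
     profiles="urn:dvb:dash:profile:dvb-dash:2014,urn:mpeg:dash:profile:isoff-live:2011">
  <!-- Periods, AdaptationSets and Representations omitted -->
</MPD>
```

A validator such as the DVB/HbbTV tool mentioned elsewhere checks the MPD against the constraints implied by the declared profiles.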
This webinar will have three sections:
General introduction to DVB-DASH (TS 103 285 1.2.1)
DVB-DASH player conformance points (TS 101 154 2.4.1)
Deployments and use cases
Those following the webinar live will have an opportunity to post questions to the presenters.
Simon Waller, Chief Standards Engineer at Samsung Electronics Research Institute UK
Chris Poole, Lead Research Engineer at BBC R&D
Martin Schmalohr, Researcher at IRT