Manipulating the manifest of streamed video allows localisation of adverts, with the option of per-client customisation. This results in better monetisation, but it also provides a better way to deal with blackouts and other regulatory or legal restrictions.
Most streamed video is delivered using a playlist – a simple text file listing the locations of the many small files which contain the video – so different playlists can be delivered to clients in different locations, detected by geolocating their IP addresses. Similarly, different ads can be delivered depending on the type of client requesting: phone, tablet, computer and so on.
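As a concrete, much-simplified illustration of the idea – not Imagine's implementation – here is a Python sketch which swaps placeholder ad segment URIs in an HLS media playlist for region-specific ones. The `AD_SEGMENTS_BY_REGION` table, the placeholder URL scheme and the `localise_playlist` function are all assumptions for the example.

```python
# Sketch of server-side manifest manipulation: substitute region-specific
# ad segments into an HLS media playlist before sending it to the client.
# URLs and the region table are invented for illustration.

AD_SEGMENTS_BY_REGION = {
    "uk": ["https://ads.example.com/uk/ad1.ts", "https://ads.example.com/uk/ad2.ts"],
    "us": ["https://ads.example.com/us/ad1.ts", "https://ads.example.com/us/ad2.ts"],
}

def localise_playlist(playlist_text: str, region: str) -> str:
    """Replace placeholder ad segment lines with region-specific ones."""
    ads = iter(AD_SEGMENTS_BY_REGION[region])
    out = []
    for line in playlist_text.splitlines():
        if line.startswith("https://ads.example.com/placeholder/"):
            out.append(next(ads))  # substitute the localised ad segment
        else:
            out.append(line)      # content segments and tags pass through
    return "\n".join(out)
```

The same pattern extends naturally to device type: inspect the User-Agent instead of (or as well as) the IP address when choosing the substitution table.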
Here, Imagine’s Yuval Fisher starts by reminding us how online streaming typically works, using HLS as an example, then leads us through the possibilities of manifest manipulation. One interesting idea is using it to remove hardware, delivering cost savings by using the same infrastructure to serve both the internet and broadcast. Yuval finishes up with a list of “Dos and Don’ts” explaining the best way to achieve playlist manipulation.
Sarah Foss rounds off the presentation by explaining how manifest manipulation sits at the centre of the rest of the ad-delivery system.
There are two main modern approaches to low-latency live streaming. The first is CMAF, which uses fragmented MP4s to allow frame-by-frame delivery of chunks of data. Being similar to HLS, it is becoming a common ‘next step’ for companies already using HLS. Keeping the chunk size down reduces latency, but it remains doubtful whether sub-second streaming is practical with it in real-world situations.
Steve Miller Jones from Limelight explains the WebRTC solution to this problem. Being a protocol streamed directly from source to destination, WebRTC is capable of sub-second latency and seems a better fit. Limelight differentiate themselves by offering a scalable WebRTC streaming service with Adaptive Bitrate (ABR). ABR is traditionally not available with WebRTC, and Steve Miller Jones uses this as an example of where Limelight is helping the technology achieve its true potential.
Comparing and contrasting Limelight’s solution with HLS and CMAF, we can see the benefits of WebRTC and that it’s equally capable of supporting features like encryption, geoblocking and the like.
Ultimately, how much latency matters to you and how much scalability you require may be the biggest factors in deciding which way to go with your sub-second live streaming.
Streaming on the net relies on delivering video at a bandwidth the viewer’s connection can handle. Called ‘Adaptive Bitrate’, or ABR, it’s hardly possible to think of streaming without it. While the idea might seem simple at first – just send several versions of your video – it quickly gets nuanced.
Streaming experts Streamroot take us through how ABR works in this talk from Streaming Media East 2016. While the talk is a few years old, the fundamentals haven’t changed, so it remains a useful session which not only introduces the topic but goes into detail on how to implement ABR.
The most common streaming format is HLS, which relies on the player downloading the video in sections – small files, each representing around 3 to 10 seconds of video. For HLS and similar technologies, the idea is simply to allow the player, when it’s time to download the next part of the video, to choose from a selection of files, each with the same video content but at a different bitrate.
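For illustration, the master playlist advertising such a set of renditions might look like this (the bitrates, resolutions and paths here are made up for the example):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high/index.m3u8
```

Each `EXT-X-STREAM-INF` entry points to a media playlist listing the segments for that one rendition.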
Allowing a player to choose which chunk it downloads means it can adapt to changing network conditions, but it does imply that each file has to contain exactly the same frames of video, else there would be a jump when the next file is played. So we have met our first complication. Furthermore, each encoded stream needs to be segmented in the same way, and in MPEG, where you can only cut files on I-frame boundaries, this means the encoders need to synchronise their GOP structure – our second complication.
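The decision the player makes before each download can be sketched very simply. This toy Python function (not Streamroot's algorithm – real players also smooth throughput estimates and account for buffer level) picks the highest rendition that fits within a safety margin of the measured throughput; the bitrate ladder is illustrative.

```python
# Toy ABR rendition selection: before fetching the next segment, pick the
# highest bitrate that fits comfortably within the measured throughput.

RENDITIONS_BPS = [800_000, 2_500_000, 5_000_000]  # aligned encodes of the same content

def choose_rendition(measured_throughput_bps: float, safety: float = 0.8) -> int:
    """Highest bitrate within a safety margin of throughput, else the lowest."""
    budget = measured_throughput_bps * safety
    candidates = [r for r in RENDITIONS_BPS if r <= budget]
    return max(candidates) if candidates else min(RENDITIONS_BPS)
```

Because all renditions are segmented identically (the GOP-alignment requirement above), switching the chosen bitrate between segments produces no visible jump.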
These difficulties, and many more, along with Streamroot’s solutions, are presented by Erica Beavers and Nikolay Rodionov, including experiments and proofs of concept they have carried out to demonstrate their efficacy.
Date: Thursday February 28th 2019, 10am PT / 1PM ET / 18:00 GMT
Streaming continues to grow: in the amount streamed, in the number of people consuming it, and in its importance within this and other industries. One thing which has always been an enabler, yet has made streaming harder to deploy, is its rapid evolution. Whilst this has been a boon for smaller, nimbler companies – both content producers and service providers – streaming has now arrived at most companies in one way or another, and this breadth of use-cases has kept streaming tech moving forward, showing no signs of abatement.
Some aspects are changing. For instance, we are seeing the first patent-free MPEG standard proposals (EVC, which has a basic patent-free profile and a better-performing patent-controlled profile) on the heels of AV1. We’re seeing low-latency efforts such as CMAF taking hold as an alternative to WebRTC. With CMAF being much closer to the ever-popular HLS, it may well beat out WebRTC in deployments, at the cost of latency that is slightly higher than WebRTC’s but much improved over traditional HLS.
To bring all of this into focus for 2019, Jason Thibeault from the Streaming Video Alliance is bringing together a panel of experts to look at the coming trends, give us an idea of what to look out for, and help us make sense of the year ahead in video delivery.
Alex Zambelli from Hulu presents SCTE-35 at the Seattle Video Tech Meetup.
Alex looks at what SCTE and SCTE-35 are and introduces ad insertion. With the foundation in place, he then looks through the message structures to show the possible commands and descriptors.
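As a small taste of those structures: every SCTE-35 cue is a splice_info_section whose first byte is the table_id 0xFC, followed shortly by a 12-bit section_length. This minimal Python sketch (not from the talk – a real parser must handle the many fields, commands and descriptors beyond these) peeks at the header of a base64-encoded cue:

```python
import base64

# Peek at the first fields of a base64-encoded SCTE-35 splice_info_section.
# Only the table_id and section_length are read here; everything after
# them (splice commands, descriptors, CRC) is ignored in this sketch.

def scte35_header(b64_cue: str) -> dict:
    data = base64.b64decode(b64_cue)
    table_id = data[0]  # always 0xFC for SCTE-35
    section_length = ((data[1] & 0x0F) << 8) | data[2]  # 12-bit length
    return {"table_id": table_id,
            "section_length": section_length,
            "valid": table_id == 0xFC}
```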
Finishing off with SCTE-35 signalling in MPEG-DASH and HLS, Alex covers the topic admirably for live streaming!
Nobody wants to find out about a big play or major news event on Twitter before they see it in their video stream, so reducing latency is crucial for OTT services’ success. Likewise, ultra-low latency is crucial for interactive streaming applications. Depending on your use case, a few seconds of latency might be fine, or you might need to try to hit that sub-second target.
Learn which technologies and solutions are best for your business, and make sure your viewers get their video on time, every time. In this webinar, you’ll learn the following:
• Why it’s important to evaluate and improve latency end-to-end, including software and services, encoder, platform, and player
• How to decide which technology and solution is best for your use case (e.g. CMAF, HLS/DASH, WebRTC, WebSocket)
• How chunked CMAF offers a standards-based approach that allows latency to be decoupled from segment duration
• How chunked CMAF leverages existing CDN HTTP capacity to provide low-latency solutions at high scale
• How WebRTC can be used to deliver live video with sub-second latency at scale, and provide rich, interactive experiences for live streaming applications
• How a single misconfigured component can undo every other effort to achieve low latency
• How integrated solutions create new business opportunities for low-latency interactive use cases
• How to achieve low latency across all platforms and devices
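The point about chunked CMAF decoupling latency from segment duration can be shown with back-of-the-envelope arithmetic. This toy Python sketch (illustrative figures, not measurements – real pipelines add encode, CDN and decode delay) compares the buffer-driven latency of whole-segment delivery with chunked delivery:

```python
# With classic HLS/DASH, a player must wait for whole segments (and
# typically buffers several), so latency scales with segment duration.
# With chunked CMAF, each chunk is fetchable as soon as it is encoded,
# so latency scales with the much smaller chunk duration instead.

def segment_latency(segment_s: float, buffered_segments: int = 3) -> float:
    """Approximate live-edge latency for whole-segment delivery."""
    return segment_s * buffered_segments

def chunked_latency(chunk_s: float, buffered_chunks: int = 3) -> float:
    """Approximate live-edge latency for chunked CMAF delivery."""
    return chunk_s * buffered_chunks
```

With 6-second segments the classic approach sits around 18 seconds behind live, while 0.5-second chunks bring the same buffering model down to roughly 1.5 seconds – without shortening the segments themselves.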
Moderator: Eric Schumacher-Rasmussen
Date: November 8th 2018, 10AM PST / 17:00 GMT
In the first of Wowza’s Low-Latency Streaming webinar series, join Pete McIntosh, Jamie Sherry and Mac Hill, who’ll take you through the basics of streaming protocols: learn why latency isn’t always low and what techniques you can use to reduce it. They’ll also tell you how Wowza launched their low-latency product.
Viewers are increasingly watching live sports and other realtime events online, but the inherent delay in traditional online streaming often means they learn about an important play from social media before they are able to see it. Delivering true realtime global online streaming requires a new approach.
Video delivery experts from Limelight Networks present ways to deliver broadcast quality low-latency live streams, including the ability for viewers to watch live video with less than one second of latency on standard web browsers—without special plug-ins. Also discussed is how to integrate live video and interactive data to open up new workflows in sports, gaming, auctions, and more and make live viewing a more interactive social experience.
Join this webinar for:
• Latest market data on evolving viewer expectations for online viewing
• Delivering low-latency live video with HLS and DASH chunked streaming techniques
• Realtime global live streaming using WebRTC
• How integrated live bidirectional data sharing can help open up new business opportunities
• and much more!
Sr. Manager Product Marketing, Limelight Networks
A great discussion from Streaming Media East on the battle to achieve low-latency live video, with speakers from BAMTECH, Limelight and Red5Pro. In this session, learn about the pros and cons of various technologies on both the contribution and delivery sides of low-latency streaming, including small-chunk-size HLS/DASH, WebRTC, WebSockets, QUIC, SRT, and CMAF:
What does ‘Low Latency’ mean? Realtime? Are Cable & TV low-latency?
How do you synchronise OTT with data and TV?
Where is latency introduced? Which buffers have the biggest impact?
How can you fight rebuffering, and which metric is the most useful?