Everyone has a go-to program or three they use for problem solving. Here is a review of a whole swathe of diagnostic tools out there for live streaming.
There are well-known favourites like Wireshark, FFplay and MediaInfo, free applications such as Eyevinn Technology’s Segment Analyser, and the open-source YUView. It also covers paid programs like Elecard’s Stream Analyser and Telestream Switch.
This talk by David Hassoun, CEO of RealEyes Media, is well worth a look because there is bound to be something there you didn’t know about – and who knows how useful that will be to you!
Most online video streaming uses HTTP to deliver the video to the player in the same way web pages are delivered to the browser. So QUIC – the new transport protocol underneath HTTP/3 – will affect us professionally and personally.
This video explains how HTTP works and takes us on the journey to seeing why QUIC (over which HTTP/3 runs) speeds up the process of requesting and delivering files. Simply put, there are ways to reduce the number of round trips between the player and the server, which cuts overall overhead. But one big win is its move away from TCP to UDP.
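As a rough back-of-envelope sketch (my own illustration, not from the talk), the round trips saved at connection setup can be counted like this, assuming a 50 ms network round-trip time and TLS 1.3 in both cases:

```python
# Illustrative only: round trips before the first HTTP response byte
# can arrive, at an assumed 50 ms round-trip time (RTT).
RTT_MS = 50

# TCP handshake (1 RTT) + TLS 1.3 handshake (1 RTT) + request/response (1 RTT)
tcp_tls_setup = 3 * RTT_MS

# QUIC folds the transport and TLS 1.3 handshakes into a single RTT,
# then one more RTT for the request/response.
quic_setup = 2 * RTT_MS

# QUIC 0-RTT resumption: the request rides along with the very first
# packet to a server we've talked to before.
quic_0rtt = 1 * RTT_MS

print(tcp_tls_setup, quic_setup, quic_0rtt)  # prints: 150 100 50
```

The real-world picture is messier (loss, congestion control, middleboxes), but this is the shape of the saving.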
Robin Marx delivers these explanations with superhero analogies and very clear diagrams, making this low-level topic pleasantly accessible and interesting.
There are plenty of examples showing easy-to-see gains in website speed using QUIC over both HTTP/1.1 and HTTP/2, but QUIC’s worth in the realm of live streaming is not yet clear. There are studies showing it makes streaming worse, but also ones showing it helps. Video players contain a lot of logic and are the result of much analysis, so it wouldn’t surprise me at all to see the state of the art move forward, for players to optimise for QUIC delivery, and then for all tests to show an improvement with QUIC streaming.
QUIC is coming, one way or another, so find out more. Watch now!
Web Performance Researcher,
There are two main modern approaches to low-latency live streaming. One is CMAF, which uses fragmented MP4s to allow delivery of small chunks of data before each segment is complete. Being similar to HLS, this is becoming a common ‘next step’ for companies already using HLS. Keeping the chunk size down reduces latency, but it remains doubtful whether sub-second streaming is practical with it in real-world situations.
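To see why smaller chunks matter, here is some rough arithmetic of my own (the numbers are illustrative, not from the talk), assuming a player that buffers three segments or chunks before starting playback:

```python
# Illustrative latency arithmetic: glass-to-glass delay is dominated by
# how much media the player buffers before it dares to start playing.
SEGMENTS_BUFFERED = 3

hls_segment_s = 6.0   # a traditional HLS segment duration
cmaf_chunk_s = 0.5    # a CMAF chunk delivered via chunked transfer

hls_latency = SEGMENTS_BUFFERED * hls_segment_s    # 18.0 seconds
cmaf_latency = SEGMENTS_BUFFERED * cmaf_chunk_s    # 1.5 seconds

print(hls_latency, cmaf_latency)  # prints: 18.0 1.5
```

Encoding, contribution and CDN hops add more on top, which is why even 1.5 seconds of buffer makes genuine sub-second delivery a stretch for CMAF.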
Steve Miller Jones from Limelight explains the WebRTC solution to this problem. As a protocol which streams directly from source to destination, WebRTC is capable of sub-second latency too, and seems a better fit. Limelight differentiate themselves by offering a scalable WebRTC streaming service with Adaptive Bitrate (ABR). ABR is traditionally not available with WebRTC, and Steve Miller Jones uses this as an example of where Limelight is helping the technology achieve its true potential.
Comparing and contrasting Limelight’s solution with HLS and CMAF, we can see the benefit of WebRTC and that it’s equally capable of supporting features like encryption, geoblocking and the like.
Ultimately, the importance of latency and the scalability you require may be the biggest factors in deciding which way to go with your sub-second live streaming.
Streaming on the net relies on delivering video at a bitrate your connection can handle. Called ‘Adaptive Bitrate’ or ABR, the technique is so fundamental it’s hardly possible to think of streaming without it. While the idea might seem simple at first – just send several versions of your video – it quickly gets nuanced.
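The core decision is easy to sketch in a few lines. This is a deliberately simplified illustration of my own, not any production player’s logic: pick the highest-bitrate rendition that fits under the measured throughput, with some headroom for network variability.

```python
# Minimal ABR rendition picker (illustrative only).
def pick_rendition(ladder_bps, measured_bps, headroom=0.8):
    """ladder_bps: available bitrates in bits/s.
    Returns the chosen bitrate for the next segment download."""
    budget = measured_bps * headroom
    candidates = [b for b in ladder_bps if b <= budget]
    # Fall back to the lowest rendition if even that exceeds the budget.
    return max(candidates) if candidates else min(ladder_bps)

ladder = [800_000, 2_500_000, 5_000_000]
print(pick_rendition(ladder, 4_000_000))  # prints: 2500000
print(pick_rendition(ladder, 500_000))    # prints: 800000
```

Real players layer much more on top – buffer occupancy, throughput smoothing, abandonment of slow downloads – which is exactly where the nuance comes in.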
In this talk from Streaming Media East 2016, streaming experts Streamroot take us through how ABR works. While the talk is a few years old, the fundamentals haven’t changed, so this remains a useful talk which not only introduces the topic but goes into detail on how to implement ABR.
The most common streaming format is HLS, which relies on the player downloading the video in sections – small files each representing around 3 to 10 seconds of video. For HLS and similar technologies, the idea is simply to allow the player, when it’s time to download the next part of the video, to choose from a selection of files each with the same video content but at a different bitrate.
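That selection of renditions is advertised to the player in an HLS master playlist. A minimal illustrative example (the bitrates, resolutions and paths here are made up) looks like this:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
720p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
1080p/playlist.m3u8
```

Each entry points to a media playlist listing that rendition’s segments, and the player switches between them segment by segment.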
Allowing a player to choose which chunk it downloads means it can adapt to changing network conditions, but it does imply that each file has to contain exactly the same frames of video, else there would be a jump when the next file is played. So we have met our first complication. Furthermore, each encoded stream needs to be segmented in the same way, and in MPEG, where you can only cut files on I-frame boundaries, this means the encoders need to synchronise their GOP structure – our second complication.
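That GOP synchronisation is typically handled at the encoder by forcing keyframes at fixed times and disabling scene-cut keyframes. A sketch with ffmpeg (the filenames, bitrates and the 2-second keyframe interval are illustrative assumptions, not from the talk) might look like:

```shell
# Encode two aligned renditions with a keyframe forced every 2 seconds,
# so segment boundaries match across the ladder.
ffmpeg -i input.mp4 \
  -vf scale=-2:720 -c:v libx264 -b:v 2500k -sc_threshold 0 \
  -force_key_frames "expr:gte(t,n_forced*2)" 720p.mp4 \
  -vf scale=-2:360 -c:v libx264 -b:v 800k -sc_threshold 0 \
  -force_key_frames "expr:gte(t,n_forced*2)" 360p.mp4
```

With `-sc_threshold 0` suppressing x264’s scene-change keyframes, every rendition can then be segmented on identical boundaries.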
These difficulties, and many more, along with Streamroot’s solutions, are presented by Erica Beavers and Nikolay Rodionov, including experiments and proofs of concept they have carried out to demonstrate their efficacy.
San Francisco Video Tech welcomes Haluk Ucar talking about live video streaming. How do you encode multiple resolutions/bitrates efficiently on CPUs and maximise the number of channels? Is there value in managing multiple encodes centrally? How can we manage the balance between CPU use and video quality (VQ)?
Haluk discusses a toolset for adaptive decisions and looks at adaptive segment decisions, including the relationship between IDR frames and frequent scene changes.
Haluk covers a lot and finishes with a Q&A. So if you have an interest in Live Streaming, then Watch Now!
Whether or not edge computing is the next generation of cloud technology, the edge plays a vital role in the streaming video experience. The closer a video is stored to the requesting user, the faster the delivery and better the experience. But, streaming also provides a lot more opportunity for interactivity, engagement, and data collection than traditional broadcast television. That means as the edge grows in compute capacity and functionality, it could enable new and exciting use cases, such as AI, that could improve the viewer experience. In this webinar, we’ll explore the state of edge computing and how it might be leveraged in streaming video.
Streaming Video Alliance
Date: Friday, March 29th 2019
Time: 11am PT / 2pm ET / 18:00 GMT
NAB is coming around again and the betting has started on what the show will bring. Whilst we can look to last year for hints, here editors from Streaming Media come together to discuss the current trends in the industry and how they will be represented at NAB.
Some highlights of the conversation will be:
What HEVC solutions people are showing – the ongoing codec wars are captivating to most as AV1 tries, and gradually succeeds, in shedding its ‘too slow’ label, whilst HEVC continues to gain acceptance with its ‘ready to deploy’ label despite the fees.
UHD production and delivery – We know that production houses prefer to capture higher resolution as it increases the value of their content and gives them more options in editing. But how far is UHD making its way further down the chain? Is it just for live sports?
Live Streaming – SRT is bound to keep making waves at NAB as Haivision plans its biggest event yet, discussing the many ways it’s being used. SRT delivers encrypted, reliable streams – and while there are competitors, SRT continues to grow apace.
NDI – This compressed but ultra-low-latency video-over-IP technology continues to impress for live production workflows – particularly live events – though it’s not clear how much, if at all, it will make its way into top-tier broadcasters.
Much more will be on the cards, so register now for this session on Friday March 29th.
VP & Editor-in-Chief
AWS is synonymous with cloud computing and, whether you use it or not, knowing how to do things in AWS reaps benefits when trying to understand or implement systems in a cloud infrastructure. Knowing what’s possible and what others are doing is really useful, so whilst I don’t usually cover heavily product-specific resources here on The Broadcast Knowledge, I still believe that knowing AWS is knowing part of the industry.
Here are three consecutive webinars which cover building a live streaming channel, from the fundamentals through to making it operational, along with ongoing monitoring and maintenance.
Session one, at 3pm GMT, looks at end-to-end workflows and strategies for redundancy. It covers contribution of video into the cloud as well as what happens when it arrives, through to delivery.
Session two, at 4pm GMT, examines more complex workflows where you spread processing and failover across multiple regions, and other similar situations.
Session three, the last of the day at 5pm GMT, looks at setting up end-to-end monitoring to take the guesswork out of delivering the service on an ongoing basis.
With live online viewing delayed by up to 30 seconds or more compared to broadcast TV, enriching the viewing experience with online content, while ensuring that all viewers see the action at the same time, is a significant challenge. To provide viewers with engaging online experiences that keep them coming back for more, service providers need true real-time streaming.
This webinar will cover questions such as:
How important is latency for live online streaming?
Which live streaming workflows offer the greatest opportunity to generate additional revenue?
What are the main challenges faced by online video service providers when live-streaming major events such as sports tournaments?
As this is a webinar from Limelight, you will also hear:
How Limelight’s real-time streaming minimizes latency
How to reach the widest audience with native browser support
How to enable new business models with interactivity
How to reach viewers everywhere
All this along with key findings from DTVE’s industry survey, showing that industry executives believe live streaming could ultimately supplant broadcast technology, but challenges remain.
Vice President of Product Strategy,
Ed Silvester heads up video R&D at Perform Group, since rebranded as DAZN (pronounced ‘dah zone’), so he’s just the man to talk us through the business aspects of encoding. Anchoring the conversation in the time when black-and-white TV changed to colour, Ed looks at the challenges DAZN faces in creating an innovative platform with backwards compatibility.
Ed considers whether the industry should DIET, shedding some older technologies (watch the talk to find out what DIET stands for), and raises some questions about how the industry should deal with platforms ending, scaling and compatibility.