Video: Multicast ABR opens the door to a new DVB era

Multicast ABR (mABR) is a way of delivering standard HTTP-based streams like HLS and DASH over multicast. An ISP can multicast a stream across its managed network to thousands of homes, and only within the home itself does the stream get converted back into unicast HTTP. This allows devices in the home to access streaming services in exactly the same way as they would Netflix or iPlayer, but without straining the core network. Streaming is normally a point-to-point service, so each device takes its own stream: if you have 3 devices in the home watching a service, you'll be sending 3 streams out to them. With mABR, the core network only ever sees one stream to the home, and the linear scaling is done internally.
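The scaling argument is easy to see with some back-of-envelope arithmetic. The figures below are illustrative assumptions, not numbers from the talk:

```python
# Back-of-envelope comparison of core-network load: per-device unicast
# versus a single mABR stream replicated by the network.
STREAM_MBPS = 5        # assumed bitrate of one ABR rendition
HOMES = 10_000         # assumed homes on the ISP segment
DEVICES_PER_HOME = 3   # assumed concurrent viewers per home

unicast_mbps = STREAM_MBPS * HOMES * DEVICES_PER_HOME
mabr_mbps = STREAM_MBPS  # one multicast stream, fanned out by the network

print(f"Unicast core load: {unicast_mbps / 1000:.0f} Gbps")  # → 150 Gbps
print(f"mABR core load:    {mabr_mbps} Mbps")                # → 5 Mbps
```

The unicast load grows linearly with every extra viewer, while the mABR load stays constant; the in-home gateway absorbs the per-device fan-out instead.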

Guillaume Bichot from Broadpeak explains how this works: a multicast server picks up the streaming files from a CDN/the internet and converts them into multicast. This then needs a gateway at the other end to convert back into unicast. The gateway can run on a set-top box in the home, as long as multicast can be carried over the last mile to the box. Alternatively, it can sit upstream at a local headend or similar.

At the beginning of the talk, we hear from BBC R&D's Richard Bradbury who explains the current state of the work. Published as DVB Bluebook A176, this is currently written to account for live streaming, but will be extended in the future to deal with video on demand. The gateway is able to respond with a standard HTTP redirect if it becomes overloaded, which seamlessly pushes the player's request directly to the relevant CDN endpoint.
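That fallback behaviour can be sketched in a few lines. This is an assumed illustration of the principle, not reference code from Bluebook A176, and the CDN URL and names are hypothetical:

```python
# Sketch of a gateway that serves segments received over multicast, but
# redirects the player straight to the origin CDN when it is overloaded
# or a requested segment never arrived. (Illustrative, not the spec.)
from http.server import BaseHTTPRequestHandler, HTTPServer

CDN_BASE = "https://cdn.example.com"  # hypothetical origin CDN

def decide(path, cache, overloaded, cdn_base=CDN_BASE):
    """Return (status, location) for a segment request."""
    if overloaded or path not in cache:
        return 307, cdn_base + path   # standard HTTP redirect to the CDN
    return 200, None                  # serve from the multicast cache

class GatewayHandler(BaseHTTPRequestHandler):
    cache = {}          # segment path -> bytes received over multicast
    overloaded = False  # stand-in for a real load measurement

    def do_GET(self):
        status, location = decide(self.path, self.cache, self.overloaded)
        self.send_response(status)
        if location:
            self.send_header("Location", location)
        self.end_headers()
        if status == 200:
            self.wfile.write(self.cache[self.path])

# To run: HTTPServer(("", 8080), GatewayHandler).serve_forever()
```

Because the redirect is plain HTTP, an unmodified player simply follows it, which is why the failover is seamless from the device's point of view.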

DVB also outlines how players can contact the CDN for missing data or for video streams that are not currently provided via the gateway. Guillaume outlines which parts of the ecosystem are specified and which are not: the function of the multicast server is defined, for instance, but not how it achieves it. He then shows where all this fits into the network stack and highlights that it is protocol-agnostic as far as media delivery is concerned. Whilst DVB-DASH is the assumed target, this could just as easily work with HLS or other formats.

Guillaume finishes by showing deployment examples. We see that this can work with uni-directional satellite feeds with a return channel over the internet. It can also work with multiple gateways accessible to a single consumer.

The webinar ends with a Q&A session, though Richard Bradbury was also answering questions in the chat throughout. DVB has provided a transcript of these questions.

Watch now!
Download the slides from this presentation
Speakers

Richard Bradbury
Lead Research Engineer,
BBC R&D
Guillaume Bichot
Principal Engineer, Head of Exploration,
Broadpeak

Video: The Case To Caption Everything

To paraphrase a cliché, “you are free to put black and silence to air, but if you do it without captions, you’ll go to prison.” Captions are useful to the deaf and the hard of hearing, as well as to those who are neither. And in many places, failing to caption videos is seen as so discriminatory that there are mandatory quotas. The saying at the beginning alludes to the US federal and local laws which lay down fines for non-compliance – though whether it’s truly possible to go to prison is not clear.

The case for captioning:
“13.3 Million Americans watch British drama”

In many parts of the world ‘subtitles’ means the same as ‘captions’ does in countries such as the US. In this article, I shall use the word captions to match the terms used in the video. As Bill Bennett from ENCO Systems explains, Closed Captions are sent as data along with the video, meaning you can ask your receiver to turn the display of the text on or off.

In this talk from the Midwest Broadcast Multimedia Technology Conference, we hear not only why you should caption, but get introduced to the techniques for both creating and transmitting them. Bill starts by introducing us to stenography, the technique of typing on special machines to do real-time transcripts. This is to help explain how resource-intensive creating captions is when using humans. It’s a highly specialist skill which, alone, makes it difficult for broadcasters to deliver captions en masse.

The alternative, naturally, is to have computers doing the task. Whilst they are cheaper, they have problems understanding audio over noise and with multiple people speaking at once. The compromise which is often used, for instance by BBC Sport, is to have someone re-speak the audio into the computer. This harnesses the best aspects of the human brain with the speed of computing: the re-speaker can enunciate and emphasise to get around idiosyncrasies in recognition.

Bill re-visits the numerous motivations to caption content. He talks about the legal reasons, particularly within the US, but also mentions the usefulness of captions for situations where you don’t want audio from TVs, such as receptions and shop windows as well as in noisy environments. But he also makes the point that once you have this data, the broadcaster can take the opportunity to use that data for search, sentiment analysis and archive retrieval among other things.

Watch now!
Download the presentation
Speaker

Bill Bennett
Media Solutions Account Manager
ENCO Systems

Video: ST 2110 Testing Fundamentals

When you’ve chosen to go IP in your facility using ST 2110, you’ll need to know how to verify it’s working correctly, how to diagnose problems and have the right tools available. Vendors participate in several interop tests a year, so we can learn from how they set up their tests and the best practices they develop.

In this talk, Jean Lapierre explains what to test for and the types of things that typically go wrong in ST 2110 systems with PTP. Jean starts by talking about the parts of 2110 which are tested and the network and timing infrastructure which forms the basis of the testing. He then starts to go through problems to look for in deployments.

Jean talks about testing that IGMPv3 multicasts can be joined, and then looks at checking the validity of SDP files, which can be done by visual inspection and also with SDPoker. A visual inspection is still important because, whilst SDPoker checks the syntax, there can still be basic issues in the content. 2022-7 testing is next. The simplest test is to turn one path off and check for disturbances, but this should be followed up by using a network emulator to introduce a variety of error types of varying magnitudes to ensure there are no edge cases.
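For context, this is the kind of ST 2110-20 SDP file such checks are run against. The values below are made up for illustration; the real constraints come from SMPTE ST 2110-20 and the SDP RFC:

```
v=0
o=- 123456 11 IN IP4 192.168.100.2
s=Example ST 2110-20 video stream
t=0 0
m=video 5004 RTP/AVP 96
c=IN IP4 239.100.9.10/64
a=source-filter: incl IN IP4 239.100.9.10 192.168.100.2
a=rtpmap:96 raw/90000
a=fmtp:96 sampling=YCbCr-4:2:2; width=1920; height=1080; exactframerate=25; depth=10; colorimetry=BT709; PM=2110GPM; SSN=ST2110-20:2017
a=ts-refclk:ptp=IEEE1588-2008:39-A7-94-FF-FE-07-CB-D0:127
a=mediaclk:direct=0
```

A tool like SDPoker can verify the syntax of lines like these, but only a human (or the receiver itself) will notice content errors such as a multicast address or frame rate that doesn't match what the sender actually emits.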

ST 2110 uses PTP for timing so, naturally, the timing system also needs to be tested. PTP is a bi-directional system for providing time to all parts of the network, rather than a simple waterfall distribution of a centrally created time signal like black and burst. Whilst this system needs monitoring during normal operation, it’s also important to check for proper grandmaster failover of your equipment.
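The bi-directionality is what lets each device compute its own offset from the grandmaster. The standard delay request-response calculation uses four timestamps: t1 (Sync sent by master), t2 (Sync received by slave), t3 (Delay_Req sent by slave), t4 (Delay_Req received by master). A minimal sketch, assuming a symmetric path:

```python
# Standard PTP offset/delay calculation from the four message timestamps,
# assuming the network path delay is the same in both directions.
def ptp_offset_delay(t1, t2, t3, t4):
    """Return (offset_from_master, mean_path_delay)."""
    offset = ((t2 - t1) - (t4 - t3)) / 2
    delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, delay

# Example: slave clock is 10 time units ahead, path delay 3 units each way.
# t1=100 (master) -> t2=113 (slave sees 100+3 delay +10 offset)
# t3=120 (slave)  -> t4=113 (master sees 120-10 offset +3 delay)
print(ptp_offset_delay(100, 113, 120, 113))  # → (10.0, 3.0)
```

Asymmetric paths break the symmetry assumption and show up as a constant offset error, which is one reason timing needs monitoring rather than a one-off check.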

PTP is also important when doing 2110 PCAPs in order to have accurate timing and to enable analysis with the EBU’s LIST project. Jean gives some guidelines on using and installing LIST and finishes his talk outlining some of the difficulties he has faced, providing tips on what to look out for.

Watch now!
Speaker

Jean Lapierre
Senior Director of Engineering,
Matrox

Video: Demystifying Video Delivery Protocols

Let’s face it, there are a lot of streaming protocols out there both for contribution and distribution. Internet ingest in RTMP is being displaced by RIST and SRT, whilst low-latency players such as CMAF and LL-HLS are vying for position as they try to oust HLS and DASH in existing services streaming to the viewer.

This panel, hosted by Jason Thibeault from the Streaming Video Alliance, talks about all these protocols and attempts to put each in context, both in the broadcast chain and in terms of its features. Two of the main contribution technologies are RIST and SRT, both UDP-based protocols that recover lost packets by having the receiver re-request them from the sender. This gives a very high resilience to packet loss – ideal for internet deployments.
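The re-request mechanism is a form of negative-acknowledgement ARQ: the receiver watches the sequence numbers and asks only for the gaps. A toy sketch of the principle (this is an illustration, not either protocol's actual wire format):

```python
# Toy NACK-based retransmission: the receiver tracks sequence numbers
# and re-requests only the packets missing from the run seen so far.
def find_missing(received_seqs, expected_upto):
    """Return the sequence numbers to NACK back to the sender."""
    return sorted(set(range(expected_upto + 1)) - set(received_seqs))

received = [0, 1, 2, 5, 6]          # packets 3 and 4 were lost in transit
nacks = find_missing(received, 6)
print(nacks)  # → [3, 4]  (the sender retransmits only these)
```

Because only lost packets are resent, the overhead stays proportional to the actual loss rate, which is why this approach suits contribution links over the open internet.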

First, we hear about SRT from Maxim Sharabayko. He lists some of the 350 members of the SRT Alliance, a group of companies who are delivering SRT in their products and collaborating to ensure interoperability. Maxim explains that, being based on the UDT protocol, it’s able to do live streaming for contribution as well as optimised file transfer. He also explains that it’s free for commercial use and can be found on GitHub. SRT has been featured a number of times on The Broadcast Knowledge. For a deeper dive into SRT, have a look at videos such as this one, or the ones under the SRT tag.

Next, Kieran Kunhya explains that RIST was a response to an industry request for a vendor-neutral protocol for reliable delivery over the internet or other dedicated links. Not only does vendor-neutrality help remove reticence among users and vendors to adopt the technology, but interoperability is also a key benefit. Kieran calls out hitless switching across multiple ISPs and cellular bonding as important features of RIST. For a summary of all of RIST’s features, read this article. For videos with a deeper dive, have a look at the RIST tag here on The Broadcast Knowledge.

Demystifying Video Delivery Protocols from Streaming Video Alliance on Vimeo.

Barry Owen represents WebRTC in this webinar, though Wowza deals with many protocols in its products. WebRTC’s big advantage is sub-second delivery, which is not possible with either CMAF or LL-HLS. Whilst it’s heavily used for video conferencing, for which it was invented, a number of companies in the streaming space are using it for delivery to the viewer because of its almost instantaneous delivery speed. Unlike with CMAF and LL-HLS, a perfect rendition of the video isn’t guaranteed, but for auctions, gambling and interactive services, latency is king. For contribution, Barry explains, the flexibility of being able to contribute from a browser can be enough to make this a compelling technology, although it does bring quality/profile/codec restrictions with it.

Josh Pressnell and Ali C. Begen talk about the protocols used for delivery to the viewer. Josh explains how Smooth Streaming has exited the scene, leaving the ground to DASH, CMAF and HLS. They discuss the lack of a true CENC – Common Encryption – mechanism, leading to duplication of assets. Similarly, the discussion moves to the fact that many streaming services have to hold duplicate assets due to target device support.

Looking ahead, the panel is buoyed by the promise of QUIC. There is concern that QUIC, the Google-invented protocol for HTTP delivery over UDP, is undergoing standardisation in the IETF whilst also being modified by Google separately at the same time. But the prospect of a UDP-style mode and the higher efficiency seems to instil hope across all the participants of the panel.

Watch now to hear all the details!
Speakers

Ali C. Begen
Technical Consultant, Comcast
Kieran Kunhya
Founder & CEO, Open Broadcast Systems
Director, RIST Forum
Barry Owen
VP, Solutions Engineering
Wowza Media Systems
Josh Pressnell
CTO,
Penthera Technologies
Maxim Sharabayko
Senior Software Developer,
Haivision
Moderator: Jason Thibeault
Executive Director,
Streaming Video Alliance