Live streaming is an important part not only of online viewing but, increasingly, of broadcast in general. It’s well documented that live programming is key to keeping linear broadcast’s tradition of ‘everyone watching at once’, which has been diluted – with both pros and cons – by the growth of non-linear viewing in recent years.
This panel, part of IBC’s Content Everywhere, looks at the drivers behind live streaming, how it’s evolving and its future. Bringing together ultra-low-latency platform nanocosmos, managed service provider M2A Media and video player developer VisualOn, Editor of The Broadcast Knowledge Russell Trafford-Jones starts the conversation by asking what gamification is and how it plays into live streaming.
nanocosmos’s Oliver Lietz explains that gamification is an increasing trend, not only as a way of monetising existing content but as a genre in and of itself, providing content which is either entirely a game or has a significant interactive element. With such services, latency clearly needs to be almost zero, so his company’s ability to deliver one-second latency is what has given it experience in these projects.
We also hear from VisualOn’s Michael Jones, who explains the low-latency service they were involved in delivering. Here, low-latency CMAF was used in conjunction with local synced-screen technology to ensure not only that latency was low, but that second-screen devices were not showing video any earlier or later than the main screen. The panel then discusses the importance of latency compared to synchronised viewing, and where ultra-low latency is unnecessary.
Valentijn Siebrands from M2A talks about the ability to use live streaming and production in the cloud to deliver sports events at lower cost, but also to deliver new types of programming. Valentijn then takes us into the topic of analytics, underlining that the streaming analytics which reveal the health of your platform and infrastructure matter as much as the analytics most usually talked about: those which tell you the quality of experience your viewers are having and their activities in your app.
The talk concludes with a look to the future, talking about the key evolving technologies of the moment and how they will help us move forward between now and IBC’s Content Everywhere Hub in 2021.
Andreas Hildebrand starts by introducing SMPTE ST 2110 and how it works in terms of sending the essences separately using multicast IP. This talk focusses on the ability of audio-only devices to subscribe to the audio streams without needing the video streams. Andreas then goes on to introduce AES67, a standard for audio interoperability which defines timing, session description, encoding, QoS, transport and much more. Of all the things defined in AES67, discovery was deliberately left out, and Andreas explains why.
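To make this concrete – as an illustration rather than anything from the talk itself – a receiver typically learns about an ST 2110-30/AES67 audio stream from an SDP description along these lines (the addresses, ports and PTP grandmaster ID below are invented for the sketch):

```
v=0
o=- 1553260000 1553260000 IN IP4 192.168.10.5
s=Example ST 2110-30 stereo stream
c=IN IP4 239.10.10.1/32
t=0 0
m=audio 5004 RTP/AVP 97
a=rtpmap:97 L24/48000/2
a=ptime:1
a=ts-refclk:ptp=IEEE1588-2008:00-11-22-FF-FE-33-44-55:0
a=mediaclk:direct=0
```

An audio-only device can simply join the multicast group in the `c=` line, with no need to touch the video essence at all.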
Within SMPTE 2110, constraints are added to AES67 by part ST 2110-30. The different conformance levels A, B and C (and their X counterparts) are explained in terms of how many audio channels each permits and the packet lengths they use, with the implications detailed.
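The arithmetic behind those levels can be sketched in a few lines of Python (the figures in the comments follow the commonly quoted ST 2110-30 level definitions; the function itself is illustrative, not from the talk):

```python
def payload_bytes(sample_rate_hz: int, packet_time_s: float,
                  channels: int, bytes_per_sample: int = 3) -> int:
    """RTP payload size for linear PCM:
    samples per packet x channels x bytes per sample."""
    samples_per_packet = round(sample_rate_hz * packet_time_s)
    return samples_per_packet * channels * bytes_per_sample

# Level A: 48 kHz, 1 ms packets, up to 8 channels of 24-bit (L24) audio.
print(payload_bytes(48_000, 0.001, 8))      # 48 samples x 8 ch x 3 B = 1152
# Levels B and C use 125 us packets; Level C allows up to 64 channels.
print(payload_bytes(48_000, 0.000125, 64))  # 6 samples x 64 ch x 3 B = 1152
```

Both maximal cases land on the same 1152-byte payload, which is one reason the shorter packet time is what makes the higher channel counts practical within a normal Ethernet MTU.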
As for discovery and the other aspects of creating a working system, Andreas looks to AMWA’s NMOS suite, summarising the specifications for Discovery & Registration, Connection Management, Network Control, Event & Tally and Audio Channel Mapping. It’s the latter which is the focus of the last part of this talk.
IS-08 defines input and output blocks which allow a channel mapping to be specified. Using IS-05, we can determine which source stream should connect to which destination device. IS-08 then gives the capability to determine which of the audio channels within this stream are mapped to the output(s) of the receiving device, and on top of this allows channels from multiple received streams to be mapped into the output(s) of one device. The talk finishes with a deeper look at this process, including where example code can be found.
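As a rough sketch of what such a mapping looks like, an IS-08 map activation request is a JSON document along these lines – the input and output IDs are invented here, and the field names should be checked against the published AMWA IS-08 specification:

```json
{
  "activation": { "mode": "activate_immediate" },
  "action": {
    "monitor-out": {
      "0": { "input": "stream-1", "channel_index": 0 },
      "1": { "input": "stream-2", "channel_index": 3 }
    }
  }
}
```

Here output channels 0 and 1 of one device are fed from channels carried in two different received streams.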
Super Bowl LIII has come and gone with another victory for the New England Patriots. CBS Interactive, responsible for streaming the event, built a new system to deal with all the online viewers. Previously they used one vendor for acquisition and encoding and another for origin storage, service delivery and security. This time the encoders were located in the CBS Broadcast Center in New York and all the other systems moved to the AWS cloud, an approach which gave CBS full control over the streams.
Due to the very high volume of traffic (between 30 and 35 terabits per second), four different CDN vendors had to be engaged. A cloud storage service optimized for live streaming video not only provided performance, consistency and low latency, but also allowed multi-CDN delivery to be managed effectively.
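One common way to spread load across several CDNs is simple weighted selection at the point the player requests its manifest. The sketch below assumes nothing about CBS’s actual implementation – the hostnames and weights are invented:

```python
import random

# Illustrative CDN hostnames and traffic-share weights.
CDNS = {
    "cdn-a.example.com": 40,
    "cdn-b.example.com": 30,
    "cdn-c.example.com": 20,
    "cdn-d.example.com": 10,
}

def pick_cdn(weights: dict[str, int]) -> str:
    """Choose a CDN hostname with probability proportional to its weight."""
    hosts, w = zip(*weights.items())
    return random.choices(hosts, weights=w, k=1)[0]

manifest_url = f"https://{pick_cdn(CDNS)}/live/event/master.m3u8"
```

Real multi-CDN controllers go further, feeding the weights from live performance and cost analytics so traffic shifts away from a struggling vendor mid-event.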
In this video Krystal presents a step-by-step approach to creating a hybrid cloud/on-premises infrastructure for the Super Bowl, including ad insertion, multi-CDN delivery, monitoring and operational visibility. She emphasizes the importance of scaling infrastructure to meet audience demand, taking ownership of the end-to-end workflow, performing rigorous testing and handling communication across multiple teams and vendors.
SCTE-35 has long been used in TV to signal ad break insertions and other events, and is complemented by related standards such as SCTE-104 and SCTE-224. But how can SCTE-35 be used in live OTT and what are the applications?
The talk starts with a look at what SCTE is and what SCTE-35 does – namely digital program insertion. The talk then moves on to the best-known, and original, use case: local ad insertion. This use case exists because ads are sold both nationally and locally, so whereas the national ads can be played from the playout centre, the local ads need to be inserted closer to the local transmitter.
Alex Zambelli, Principal Product Manager at Hulu, then explains the SCTE-35 message format along with its commands and descriptors, giving us an idea of what type of information can be sent and how it is structured. Turning to OTT, Alex looks at SCTE-214, which defines how to signal SCTE-35 in DASH.
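To give a feel for the message format, the fixed header of an SCTE-35 splice_info_section can be pulled apart with a small bit reader – an illustrative sketch, with field widths taken from the published SCTE-35 syntax table, applied here to a sample splice_insert cue:

```python
import base64

class BitReader:
    """Read big-endian bit fields from a byte string."""
    def __init__(self, data: bytes):
        self.data, self.pos = data, 0

    def read(self, nbits: int) -> int:
        val = 0
        for _ in range(nbits):
            bit = (self.data[self.pos // 8] >> (7 - self.pos % 8)) & 1
            val = (val << 1) | bit
            self.pos += 1
        return val

def parse_splice_info_header(data: bytes) -> dict:
    r = BitReader(data)
    return {
        "table_id": r.read(8),             # always 0xFC for SCTE-35
        "section_syntax_indicator": r.read(1),
        "private_indicator": r.read(1),
        "sap_type": r.read(2),
        "section_length": r.read(12),
        "protocol_version": r.read(8),
        "encrypted_packet": r.read(1),
        "encryption_algorithm": r.read(6),
        "pts_adjustment": r.read(33),      # 90 kHz ticks added to splice times
        "cw_index": r.read(8),
        "tier": r.read(12),
        "splice_command_length": r.read(12),
        "splice_command_type": r.read(8),  # e.g. 0x05 = splice_insert
    }

SAMPLE = "/DAvAAAAAAAA///wFAVIAACPf+/+c2nALv4AUsz1AAAAAAAKAAhDVUVJAAABNWLbowo="
header = parse_splice_info_header(base64.b64decode(SAMPLE))
print(hex(header["table_id"]), hex(header["splice_command_type"]))  # 0xfc 0x5
```

Parsing the command body itself (splice_insert, time_signal and so on) and the descriptors that follow works the same way, switching on splice_command_type.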
For those who still use HLS rather than DASH, Alex looks at a couple of different ways of using this with Apple, perhaps unsurprisingly, preferring a method different from the one recommended by SCTE.
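As a sketch of the manifest-level result, Apple’s preferred route carries the binary SCTE-35 payload in an EXT-X-DATERANGE tag as defined in the HLS specification (the ID, timestamp and truncated hex payload here are invented for illustration):

```
#EXT-X-DATERANGE:ID="splice-1",START-DATE="2019-03-01T20:00:00.000Z",PLANNED-DURATION=30.0,SCTE35-OUT=0xFC302F...
```

The return to programme is signalled the same way with an SCTE35-IN attribute, whereas many packagers instead use the simpler ad-hoc EXT-X-CUE-OUT/EXT-X-CUE-IN tags.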
The talk finishes with a discussion of the challenges of using SCTE-35 in OTT applications. See the slides.