Video: Transporting ST 2110 Over WAN

Is SMPTE ST 2110 suitable for inter-site connectivity over the WAN? As ST 2110 continues to mature and the first facilities go live, bringing 2110 into daily use, a number of challenges remain to be overcome. Moving a large number of essence flows over long distances, and between PTP time domains, is one of them.

Nevion’s Andy Rayner presents the work the VSF (Video Services Forum) is doing to recommend how to transport ST 2110 over the WAN, outlining where they have got to and what has been recommended to date.

The talk starts with SMPTE 2022-7 seamless protection, which is recommended for dealing with path breaks. To compensate for transmission errors, FEC is recommended, and Andy explains the parameters needed.
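As a rough illustration (the specific parameters are Andy's to explain in the talk), the bandwidth cost of the commonly used two-dimensional row/column FEC depends only on the matrix dimensions: for every L × D media packets, column FEC adds L repair packets and row FEC adds D more.

```python
def fec_overhead(columns: int, rows: int) -> float:
    """Overhead of 2-D row/column FEC over an L x D matrix of media packets.

    Column FEC adds one repair packet per column (L packets) and row FEC
    one per row (D packets) for every L * D media packets.
    """
    media = columns * rows
    fec = columns + rows
    return fec / media

# A 10 x 10 matrix costs 20% extra bandwidth but can recover a burst
# of up to 10 consecutive lost packets (one loss per column).
print(f"{fec_overhead(10, 10):.0%}")  # 20%
```

Larger matrices lower the overhead but lengthen the repair delay, which is part of the parameter trade-off the talk covers.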

Key to the inter-site transport is trunking, whereby the individual essences are combined into one flow. This has a number of advantages: reducing the number of flows makes life simpler for service providers, all essences now share the same signal path from site to site, and FEC protection can be applied more efficiently.

The trunks are made using GRE – Generic Routing Encapsulation – a pre-existing IT standard for grouping lots of traffic into a single tunnel whilst preserving the data inside. The traffic then appears at the other end of the trunk with the same IP information, as if nothing had happened. Andy looks at the extra encapsulation headers needed to make this work and goes on to discuss payload lengths, which need to be kept short enough to avoid packet fragmentation.
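To see why payload length matters, it helps to count the bytes the tunnel adds. The sketch below (illustrative only, not Nevion's implementation) builds the minimal four-byte GRE header from RFC 2784 and totals the encapsulation overhead against a standard 1500-byte Ethernet MTU:

```python
import struct

GRE_PROTO_IPV4 = 0x0800  # EtherType of the encapsulated payload

def gre_header(checksum_present: bool = False) -> bytes:
    """Minimal GRE header (RFC 2784): flags/version word, then protocol type."""
    flags_version = 0x8000 if checksum_present else 0x0000
    return struct.pack("!HH", flags_version, GRE_PROTO_IPV4)

# Overhead when trunking an ST 2110 RTP packet through a GRE tunnel:
OUTER_IP, GRE, INNER_IP, UDP, RTP = 20, len(gre_header()), 20, 8, 12
overhead = OUTER_IP + GRE + INNER_IP + UDP + RTP
print(f"encapsulation overhead: {overhead} bytes")            # 64 bytes
print(f"max media payload for a 1500-byte MTU: {1500 - overhead}")
```

Any media payload larger than the MTU minus this overhead forces IP fragmentation, which is why the recommendation keeps payloads short.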

Timing, as ever, is important, so the recommendation is to align all essences before sending them into the trunk, though Andy looks at alternatives. Also of key concern is compression, as there will be times when uncompressed video simply needs too much bandwidth to be carried on the WAN. JPEG 2000 and, now, JPEG XS are available for this task.

Andy covers timing, discovery, control, security and conversion to and from 2022-6 before finishing the talk by taking questions.

Watch now!

Speaker

Andy Rayner
Chief Technologist,
Nevion

Video: Simplifying OTT Video Delivery With SCTE 224

Life used to be simple; you’d fire up your camera, point it at a presenter and the signal would be fed to the transmitter network. When advertising funding came into play, we wanted each transmitter to be able to show local ads and so, after many years, SCTE 35 was born to do exactly that. In today’s world, however, simply telling a transmitter when to switch doesn’t cut it. To deliver the complex workflows that both linear and OTT delivery demand, SCTE 224 has arrived on the scene, providing very comprehensive scheduling and switching.

Jean Macher, from Harmonic, explains the need for SCTE 224 and what it delivers. For instance, a lot of SCTE 224 is devoted to controlling the US-style blackouts where viewers close to a sports game can’t watch the game live. Whilst this is relatively easy to deal with in the US for local terrestrial transmitters, in OTT this is a new ability. Traditionally, geolocation of IP addresses is used to make this work, where each IP address is registered against a provider. If this provider is Chinese, then at the very least you should be able to say that this IP address is in China. However, ISPs who have an interest in the programming can bring their own data into play in order to have very accurate geolocation.
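The blackout decision itself is simple once the geolocation is trusted. A minimal sketch (the zone data and names are invented for illustration; this is not the SCTE 224 XML schema): the policy checks the viewer's location against the event's restricted audience and switches them to alternate content if they match.

```python
# Invented example zones near a venue; real systems use audience
# definitions carried in SCTE 224 and the ISP's geolocation data.
BLACKOUT_ZIPS = {"94016", "94105", "94110"}

def viewing_policy(viewer_zip: str, in_market_feed: str, national_feed: str) -> str:
    """Return the feed this viewer should receive during the event."""
    if viewer_zip in BLACKOUT_ZIPS:
        return national_feed  # blacked out locally: serve alternate content
    return in_market_feed

print(viewing_policy("94105", "local-game", "national"))  # national
print(viewing_policy("10001", "local-game", "national"))  # local-game
```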

SCTE 224, however, isn’t just about blackouts. It also transmits accurate, multi-level schedule information which helps to schedule complex ad breaks, providing detailed, frame-accurate, local ad insertion.

It shouldn’t be thought that SCTE 35 and SCTE 224 are mutually exclusive. SCTE 35 can provide very accurate updates for unscheduled programmes and delays, while the SCTE 224 information still carries the rich metadata.
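The frame accuracy SCTE 35 contributes comes from expressing splice times in the MPEG transport stream's 90 kHz clock. A small worked example (illustrative):

```python
MPEG_TS_CLOCK = 90_000  # SCTE 35 splice times use the MPEG 90 kHz clock

def pts_to_seconds(pts_time: int) -> float:
    """Convert a splice-time PTS value to seconds."""
    return pts_time / MPEG_TS_CLOCK

def seconds_to_pts(seconds: float) -> int:
    """Convert seconds back into 90 kHz clock ticks."""
    return round(seconds * MPEG_TS_CLOCK)

# One frame at 30 fps is exactly 3000 ticks, so a cue can land
# precisely on a frame boundary.
print(seconds_to_pts(1 / 30))      # 3000
print(pts_to_seconds(8_100_000))   # 90.0
```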

To finish up the talk, Jean looks at a specific implementation example and at how SCTE 224 has been updated in recent years.

Watch now!

Speakers

Jean Macher
Director, Market Development – Broadcast
Harmonic Inc.

Video: Streaming Live Events: When it must be alright on the night

Live streaming is an important part not only of online viewing, but increasingly of broadcast in general. It’s well documented that live programming is key to keeping linear broadcast’s tradition of ‘everyone watching at once’, which has been diluted – with both pros and cons – by non-linear viewing in recent years.

This panel, part of IBC’s Content Everywhere, looks at the drivers behind live streaming, how it’s evolving and its future. Bringing together ultra-low-latency platform nanocosmos with managed service provider M2A Media and video player specialists VisualOn, Editor of The Broadcast Knowledge, Russell Trafford-Jones, starts the conversation by asking what gamification is and how it plays into live streaming.

nanocosmos’s Oliver Lietz explains that gamification is an increasing trend, not only as a way of monetising existing content but as a genre in and of itself, providing content which is either entirely a game or has a significant interactive element. With such services, it’s clear that latency needs to be almost zero, so his company’s ability to deliver one-second latency is why he has experience of these projects.

We also hear from VisualOn’s Michael Jones, who explains the low-latency service they were involved in delivering. Here, low-latency CMAF was used in conjunction with local synced-screen technology to ensure that not only was latency low, but second-screen devices were not showing video any earlier or later than the main screen. The panel then discusses the importance of latency compared to synchronised viewing, and where ultra-low latency is unnecessary.

Valentijn Siebrands from M2A talks about using live streaming and production in the cloud to deliver lower-cost sports events as well as new types of programming. Valentijn then takes us into the topic of analytics, underlining the importance of streaming analytics which reveal the health of your platform and infrastructure as much as the analytics which are most usually talked about: those which tell you the quality of experience your viewers are having and their activities on your app.

The talk concludes with a look to the future, talking about the key evolving technologies of the moment and how they will help us move forward between now and IBC’s Content Everywhere Hub in 2021.

Watch now!

Speakers

Oliver Lietz
CEO & Founder,
nanocosmos
Michael Jones
Former SVP and Head of Business Development,
VisualOn Inc
Valentijn Siebrands
Solutions Architect,
M2A Media
Russell Trafford-Jones – Moderator
Manager, Support & Services – Techex
Executive Member – IET Media Technical Network

Video: ST 2110-30 and NMOS IS-08 — Audio Transport and Routing

Andreas Hildebrand starts by introducing 2110 and how it works in terms of sending the essences separately using multicast IP. This talk focusses on the ability of audio-only devices to subscribe to the audio streams without needing the video streams. Andreas then goes on to introduce AES67, a standard defining interoperability for audio which covers timing, session description, encoding, QoS, transport and much more. Of all the things defined in AES67, discovery was deliberately not included, and Andreas explains why.
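As a concrete example of the session description side of AES67, here is an illustrative SDP announcement for a two-channel, 24-bit, 48 kHz stream (the addresses and PTP grandmaster ID are made up), with a small helper pulling the audio format out of the rtpmap line:

```python
# Illustrative AES67 session description; ST 2110-30 streams are
# described the same way, with the PTP reference in ts-refclk.
SDP = """\
v=0
o=- 1311738121 1311738121 IN IP4 192.168.1.10
s=AES67 stereo stream
c=IN IP4 239.69.1.10/32
t=0 0
m=audio 5004 RTP/AVP 96
a=rtpmap:96 L24/48000/2
a=ptime:1
a=ts-refclk:ptp=IEEE1588-2008:00-1D-C1-FF-FE-12-34-56:0
a=mediaclk:direct=0
"""

def parse_rtpmap(sdp: str):
    """Extract encoding, sample rate and channel count from the rtpmap line."""
    for line in sdp.splitlines():
        if line.startswith("a=rtpmap:"):
            encoding, rate, channels = line.split(" ")[1].split("/")
            return encoding, int(rate), int(channels)

print(parse_rtpmap(SDP))  # ('L24', 48000, 2)
```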

Within SMPTE 2110, constraints are added to AES67 under the sub-standard 2110-30. The different categories A, B and C (and their X counterparts) are explained in terms of how many audio channels are allowed and the packet times, with their implications detailed.
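The packet-time and channel-count constraints translate directly into RTP payload sizes. A quick calculation, assuming 24-bit samples at 48 kHz and the 1 ms and 125 µs packet times commonly quoted for 2110-30:

```python
SAMPLE_RATE = 48_000
BYTES_PER_SAMPLE = 3  # L24: 24-bit linear PCM

def payload_bytes(packet_time_us: int, channels: int) -> int:
    """RTP payload size for one packet of an ST 2110-30 style audio stream."""
    samples = SAMPLE_RATE * packet_time_us // 1_000_000
    return samples * channels * BYTES_PER_SAMPLE

# 1 ms packets with 8 channels vs 125 us packets with 64 channels:
print(payload_bytes(1000, 8))  # 48 samples * 8 ch * 3 B = 1152 bytes
print(payload_bytes(125, 64))  # 6 samples * 64 ch * 3 B = 1152 bytes
```

Shorter packet times mean more packets per second for the same audio, which is one of the implications the talk details.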

As for discovery and the other aspects of creating a working system, Andreas looks towards AMWA’s NMOS suite, summarising the specifications for Discovery & Registration, Connection Management, Network Control, Event & Tally, and Audio Channel Mapping. It’s the latter which is the focus of the last part of this talk.

IS-08 specifies input and output blocks which allow a channel mapping to be established. Using IS-05, we can determine which source stream should connect to which destination device. IS-08 then gives the capability to determine which of the audio channels within this stream are mapped to the output(s) of the receiving device and, on top of this, allows mapping from multiple received streams into the output(s) of one device. The talk then finishes with a deeper look at this process, including where example code can be found.
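To give a feel for the kind of mapping IS-08 describes, here is a simplified sketch; the ids are invented and the JSON shape is approximate (the authoritative schema lives in the AMWA IS-08 specification). Each output channel names the input, and the channel within that input, from which it takes audio; note that channels from two different received streams feed one device's output here.

```python
# Simplified sketch of an IS-08-style channel map (ids invented, shape
# approximate). Each output channel states which input, and which
# channel of that input, it draws audio from.
channel_map = {
    "map": {
        "stereo-out": {
            "0": {"input": "mic-stream", "channel_index": 0},
            "1": {"input": "playout-stream", "channel_index": 3},
        }
    }
}

def source_of(output: str, channel: int) -> str:
    """Human-readable description of what feeds a given output channel."""
    entry = channel_map["map"][output][str(channel)]
    return f'{entry["input"]}[{entry["channel_index"]}]'

print(source_of("stereo-out", 0))  # mic-stream[0]
print(source_of("stereo-out", 1))  # playout-stream[3]
```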

Watch now!

Speaker

Andreas Hildebrand
Senior Product Manager,
ALC NetworX