Webinar: DVB-I service discovery and programme metadata

This webinar is now available on demand!

DVB-I is an initiative to develop technical standards for television delivered over IP, whether on managed networks or over the open internet. DVB-I works with the DVB-T (terrestrial), DVB-S (satellite) and DVB-C (cable) broadcast standards so accessing services feels the same whichever delivery channel is used.

DVB-I makes the best use of the different capabilities of each channel:
– People who don’t have broadcast television can still receive services
– Devices that don’t include DVB tuners can still receive services
– New services are possible which wouldn’t be possible on conventional broadcast platforms

There are many separate ways of achieving a hybrid of OTT-delivered and broadcast-delivered content, but they are not necessarily interoperable. With DVB-I, DVB aims to solve this interoperability issue along with the problem of service discovery. As the internet is global, DVB-I will also allow global distribution of programming, whilst still honouring licensing agreements and regulatory requirements.

This webinar from DVB will cover what DVB-I is, the key use cases, its current status and the future timeline. The webinar will also look at service discovery and service lists, ending with a discussion of programme metadata.

You can look at the current approved DVB-I standard here.
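DVB-I service discovery is built around service lists, which are XML documents describing the available services. As a hedged illustration of the idea, the sketch below parses a simplified, made-up service list and extracts each service's identifier and name; the element names here are only indicative, and a real list follows the schema and namespaces defined in the DVB-I specification (DVB BlueBook A177).

```python
# Minimal sketch of DVB-I-style service discovery: parse a service
# list (an XML document) and pull out each service's identifier and
# display name. The sample XML is illustrative only, not the real
# A177 schema.
import xml.etree.ElementTree as ET

SAMPLE_SERVICE_LIST = """\
<ServiceList>
  <Service>
    <UniqueIdentifier>tag:example.com,2020:service1</UniqueIdentifier>
    <ServiceName>Example One</ServiceName>
  </Service>
  <Service>
    <UniqueIdentifier>tag:example.com,2020:service2</UniqueIdentifier>
    <ServiceName>Example Two</ServiceName>
  </Service>
</ServiceList>
"""

def parse_service_list(xml_text):
    """Return a list of (identifier, name) tuples from a service list."""
    root = ET.fromstring(xml_text)
    services = []
    for svc in root.iter("Service"):
        ident = svc.findtext("UniqueIdentifier")
        name = svc.findtext("ServiceName")
        services.append((ident, name))
    return services

for ident, name in parse_service_list(SAMPLE_SERVICE_LIST):
    print(ident, name)
```

A real client would fetch the list over HTTPS from a service list registry and then use the delivery parameters in each service entry to tune a broadcast multiplex or open an internet stream.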

Watch now!

Speakers

Peter Lanigan
Chair of the Commercial Module subgroup CM-I,
DVB
Paul Higgs
Co-chair of the Technical Module subgroup TM-IPI and leader of the DVB-I Task Force
DVB

Video: Synchronize your Watches: Cross-platform stream synchronization of HLS and DASH

Watching broadcast TV and also video on an online device can give people more choice, but it can also lead to hearing sports scores on one device before the other. In multi-person, multi-device homes, it can be better simply to synchronise the playback of all devices. This technique, though, has an often overlooked side effect: the ability for a group of people to watch the same content in sync.

Synchronising playback between many different people can be done for live and for on-demand content. ‘Watch Parties’ is the term Seth Madison from Philo uses in this talk for two or more people watching the same programme – often while calling each other on FaceTime or similar.

Seth takes us through the things to consider when designing such a system. For instance, how do you make it scale? How do you deal with one person who has a much worse connection than everyone else? How does one person pausing a video affect everyone else? These questions and more are all answered in this talk from Demuxed.
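The talk itself covers Philo's design; purely as a hedged sketch of one common approach (not necessarily theirs), clients can agree on a shared wall-clock start time for the session, derive the target media position from it, and gently nudge playback speed to close small drift rather than hard-seeking:

```python
# Hedged sketch of shared-clock playback synchronisation. All names
# and the 5% rate-adjustment limit are illustrative assumptions, not
# taken from the talk.
import time

def target_position(session_start_epoch, now=None):
    """Media position (seconds) every client should be showing now."""
    now = time.time() if now is None else now
    return max(0.0, now - session_start_epoch)

def correction_rate(actual_pos, target_pos, max_adjust=0.05):
    """Playback-rate multiplier that gently closes the gap.

    A drift of +/-2 s maps to the full +/-5% speed change; a larger
    gap (e.g. after a pause) would normally trigger a hard seek."""
    drift = target_pos - actual_pos            # positive: we are behind
    adjust = max(-max_adjust, min(max_adjust, drift / 2.0 * max_adjust))
    return 1.0 + adjust
```

Small rate changes are usually imperceptible to viewers, which is why drift correction via playback speed tends to feel smoother than repeated seeks.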

Watch now!
Speaker

Seth Madison
Software Engineer,
Philo

Video: SMPTE Technical Primers

The Broadcast Knowledge exists to help individuals up-skill whatever their starting point. Videos like this, which give an introduction to a large number of topics, are far too rare. For those starting out or who need to revise a topic, this really hits the mark, particularly as there are many new topics.

John Mailhot takes the lead on SMPTE 2110 explaining that it’s built on separate media (essence) flows. He covers how synchronisation is maintained and also gives an overview of the many parts of the SMPTE ST 2110 suite. He talks in more detail about the audio and metadata parts of the standard suite.

Eric Gsell discusses digital archiving and the considerations which come with deciding what formats to use. He explains colour space, the CIE model and the colour spaces we use such as 709, 2100 and P3 before turning to file formats. With the advent of HDR video and displays which can show bright video, Eric takes some time to explain why this could represent a problem for visual health as we don’t fully understand how the displays and the eye interact with this type of material. He finishes off by explaining the different ways of measuring the light output of displays and their standardisation.

Yvonne Thomas talks about the cloud, starting by explaining the difference between platform as a service (PaaS), infrastructure as a service (IaaS) and similar cloud terms. As cloud migrations are forecast to grow significantly, Yvonne looks at the drivers behind this and the benefits that it can bring when used in the right way. Using the cloud, Yvonne shows, can be an opportunity for improving workflows and adding more feedback and iterative refinement into your products and infrastructure.

Looking at video deployments in the cloud, Yvonne introduces the video codecs AV1 and VVC, both, in their own way, successors to HEVC/H.265, as well as the two transport protocols SRT and RIST, which exist to reliably send video with low latency over lossy networks such as the internet. To learn more about these protocols, check out this popular talk on RIST by Merrick Ackermans and this SRT Overview.

Rounding off the primer is Linda Gedemer from Source Sound VR, who introduces immersive audio, measuring sound output (SPL) from speakers and looking at the interesting problem of forward speakers in cinemas. They have long been behind the screen, which has meant the screens have to be perforated to let the sound through, which in turn interferes with the sound itself. Now that cinema screens are changing to be solid screens, not completely dissimilar to large outdoor video displays, the speakers are having to move. With them out of the line of sight, how can we keep the sound in the right place for the audience?

This video is a great summary of many of the key challenges in the industry and works well for beginners and those who just need to keep up.

Watch now!
Speakers

John Mailhot
Systems Architect for IP Convergence,
Imagine Communications
Eric Gsell
Staff Engineer,
Dolby Laboratories
Linda Gedemer, PhD
Technical Director, VR Audio Evangelist
Source Sound VR
Yvonne Thomas
Strategic Technologist
Digital TV Group

Video: JPEG XS in Action for IP Production

JPEG XS is a new intra-frame compression standard delivering JPEG 2000 quality with 1000x lower latency – microseconds instead of milliseconds. This codec provides relatively low bandwidth (visually lossless compression at a ratio of 10:1) with very low, fixed latency, which makes it ideal for remote production of live events.
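To put that 10:1 ratio in context, here is a back-of-envelope calculation using 1080p50 10-bit 4:2:2 as an illustrative format (active picture only; packetisation for ST 2110 adds some overhead):

```python
# Back-of-envelope bandwidth arithmetic for JPEG XS at the 10:1
# ratio mentioned above. 4:2:2 10-bit video carries 20 bits per
# pixel on average (10 for luma, 10 shared across the two
# subsampled chroma channels).
def video_bitrate_bps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel

uncompressed = video_bitrate_bps(1920, 1080, 50, 20)   # 4:2:2 10-bit
compressed = uncompressed / 10                          # 10:1 JPEG XS

print(f"Uncompressed: {uncompressed / 1e9:.2f} Gb/s")   # ~2.07 Gb/s
print(f"JPEG XS 10:1: {compressed / 1e6:.0f} Mb/s")     # ~207 Mb/s
```

Roughly 200 Mb/s per HD signal is what makes JPEG XS contribution practical over WAN links where uncompressed ST 2110-20 flows would not fit.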

In this video, Andy Rayner from Nevion shows how JPEG XS fits into all-IP broadcast technology with the SMPTE ST 2110-22 standard. Then he presents the world’s first full JPEG XS deployment for live IP production, created for a large sports broadcaster. It was designed for pan-European WAN operation and based on the ST 2110 standard with ST 2022-7 protection.

Andy discusses the challenges of IP-to-IP processing (ST 2110-20 to ST 2110-22 conversion) and shows how to keep video and audio in sync through the whole processing chain.

This presentation demonstrates that JPEG XS works, that low-latency distributed production is possible, and the value of the ST 2110-22 addition to the ST 2110 suite.

You can see the slides here.

Watch now!

Speaker

Andy Rayner Andy Rayner
Chief Technologist
Nevion Ltd.