Video: SMPTE Technical Primers

The Broadcast Knowledge exists to help individuals up-skill whatever their starting point. Videos like this, which introduce a large number of topics, are far too rare. For those starting out, or who need to revise a topic, this really hits the mark, particularly as many of the topics are new.

John Mailhot takes the lead on SMPTE ST 2110, explaining that it’s built on separate media (essence) flows. He covers how synchronisation is maintained, gives an overview of the many parts of the SMPTE ST 2110 suite, and then talks in more detail about its audio and metadata parts.

Eric Gsell discusses digital archiving and the considerations that come with deciding which formats to use. He explains colour space, the CIE model and the colour spaces we use, such as 709, 2100 and P3, before turning to file formats. With the advent of HDR video and displays which can show very bright images, Eric takes some time to explain why this could represent a problem for visual health, as we don’t fully understand how the displays and the eye interact with this type of material. He finishes off by explaining the different ways of measuring the light output of displays and how these measurements are standardised.

Yvonne Thomas talks about the cloud, starting by explaining the difference between platform as a service (PaaS), infrastructure as a service (IaaS) and similar cloud terms. As cloud migrations are forecast to grow significantly, Yvonne looks at the drivers behind this and the benefits it can bring when used in the right way. Using the cloud, Yvonne shows, can be an opportunity to improve workflows and to add more feedback and iterative refinement into your products and infrastructure.

Looking at video deployments in the cloud, Yvonne introduces the video codecs AV1 and VVC, both, in their own way, successors to HEVC/H.265, as well as the two transport protocols SRT and RIST, which exist to reliably send video with low latency over lossy networks such as the internet. To learn more about these protocols, check out this popular talk on RIST by Merrick Ackermans and this SRT Overview.

Rounding off the primer is Linda Gedemer from Source Sound VR, who introduces immersive audio, measuring sound output (SPL) from speakers and looking at the interesting problem of front speakers in cinemas. These have long sat behind the screen, which has meant the screens have to be perforated to let the sound through, and this interferes with the sound itself. Now that cinema screens are changing to solid screens, not completely dissimilar to large outdoor video displays, the speakers have to move. With them out of the line of sight, how can we keep the sound in the right place for the audience?

This video is a great summary of many of the key challenges in the industry and works well for beginners and those who just need to keep up.

Watch now!
Speakers

John Mailhot
Systems Architect for IP Convergence,
Imagine Communications
Eric Gsell
Staff Engineer,
Dolby Laboratories
Linda Gedemer, PhD
Technical Director, VR Audio Evangelist
Source Sound VR
Yvonne Thomas
Strategic Technologist
Digital TV Group

Video: Tech Talk: Production case studies – the gain after the pain

Technology has always been harnessed to improve, change and reinvent production. Automated cameras, LED walls, AR and LED lighting, among many other technologies, have all enabled productions to be done differently, creating new styles and even new types of programming.

In this Tech Talk from IBC 2019, we look at disruptive new technologies that change production, explained by the people who are implementing them and pushing the methods forward.

TV2 Norway’s Kjell Ove Skarsbø explains how they have developed a complete IP production flow and playout facility, which gives them more flexibility and scalability. They did this by creating their own ESB (Enterprise Service Bus) to decouple the equipment from direct integrations and, working in an agile fashion, delivered incremental improvements. This means that Provys, Mayam, Viz and Mediator, amongst other systems, communicate with each other by delivering messages into a framework which passes the messages on their behalf in a standard format, as sketched below.
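To make the decoupling concrete, here is a minimal sketch of the idea: each system wraps its events in a common envelope and hands them to the bus, which routes them to interested consumers. The JSON format, field names and make_envelope helper are illustrative assumptions; the talk doesn’t detail TV2’s actual message schema or bus technology.

```python
import json
from datetime import datetime, timezone

def make_envelope(source: str, event: str, payload: dict) -> str:
    """Wrap a system-specific event in a common envelope so the bus, not the
    producing system, handles routing to interested consumers (hypothetical format)."""
    return json.dumps({
        "source": source,                                     # e.g. "Provys" or "Mayam"
        "event": event,                                       # e.g. "schedule.changed"
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "payload": payload,                                   # system-specific detail, opaque to the bus
    })

# A scheduling system announces a change once; the bus fans it out, so no
# system needs a direct point-to-point integration with any other.
print(make_envelope("Provys", "schedule.changed", {"programmeId": "ABC123"}))
```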

Importantly, Kjell shares with us some mistakes that were made on the way: for instance, the difficulties caused by the sheer size of the project and the importance of programmers understanding broadcast. “Make no compromise” is one of the lessons learnt that he discusses.

Olie Baumann from MediaKind presents live 360º video delivery. “Experiences that people have in VR embed themselves more like memories than experiences like television,” he explains. Olie starts by explaining the lay of the land in today’s VR equipment, then looks at some of the applications of 360º video, such as looking around from an on-car camera in racing.

Olie talks us through a case study where he worked with Tiledmedia to deliver an 8K viewport, where full resolution is delivered only in the direction the 360º viewer is looking and a lower resolution for the rest. When the viewer moves their head, the full-resolution area moves to match. We then look through the system diagram to understand which parts are in the cloud and what happens where.
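The viewport-dependent idea can be pictured with a toy calculation: split the 360º panorama into tiles and request at full resolution only the tiles that fall inside the viewer’s field of view. The tile count, field of view and tiles_to_fetch function below are assumptions for illustration, not Tiledmedia’s actual implementation.

```python
TILE_COLUMNS = 16  # hypothetical: the 360° panorama split into 16 horizontal tiles

def tiles_to_fetch(viewer_yaw_deg: float, fov_deg: float = 90.0) -> set[int]:
    """Return the tile indices inside the viewer's field of view. Only these
    would be requested at full (8K) resolution; the rest would come from a
    low-resolution fallback stream."""
    tile_width = 360.0 / TILE_COLUMNS
    visible = set()
    for i in range(TILE_COLUMNS):
        tile_centre = i * tile_width + tile_width / 2.0
        # Angular distance between tile centre and viewing direction, wrapped to ±180°
        delta = (tile_centre - viewer_yaw_deg + 180.0) % 360.0 - 180.0
        if abs(delta) <= fov_deg / 2.0 + tile_width / 2.0:
            visible.add(i)
    return visible

# As the viewer turns their head, the set of full-resolution tiles follows.
print(sorted(tiles_to_fetch(viewer_yaw_deg=0.0)))    # looking forward
print(sorted(tiles_to_fetch(viewer_yaw_deg=180.0)))  # looking behind
```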

Matthew Brooks and Thomas Preece from BBC R&D explain their work in taking object-based media from the research environment into mainstream production. This work allows productions to deliver object-based media, meaning that the receiving device can display the objects in the way best suited to its screen. In today’s world of second screens, screen sizes vary and small screens can benefit from larger text, or less of it. It also allows for interactivity, where programmes fork and can adapt to the viewer’s tastes, opinions and/or choices. Finally, they have delivered a tool to help productions manage this themselves, and it can even produce a linear version of the programme to maximise the value gained from the time and effort spent creating these unique productions.
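As a simple illustration of the device-side decision that object-based media enables, the sketch below lets the client, rather than the broadcaster, choose how a text object is presented based on screen size. The thresholds and field names are made up for this example and are not the BBC R&D tooling.

```python
def presentation_for(screen_width_px: int) -> dict:
    """Pick how a text object is rendered for a given device (illustrative rules only)."""
    if screen_width_px < 800:       # phone: larger captions, less secondary text
        return {"caption_size": "large", "show_secondary_text": False}
    elif screen_width_px < 1600:    # tablet or laptop
        return {"caption_size": "medium", "show_secondary_text": True}
    else:                           # TV or large display
        return {"caption_size": "small", "show_secondary_text": True}

print(presentation_for(390))    # phone
print(presentation_for(3840))   # UHD TV
```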

Watch now!
Speakers

Kjell Ove Skarsbø
Chief Technology Architect,
TV2 Norway
Olie Baumann
Senior Technical Specialist,
MediaKind
Matthew Brooks
Lead Engineer,
BBC Research & Development
Thomas Preece
Research Engineer,
BBC Research & Development
Stephan Heimbecher

Video: TR-1001 Replacing Video By Spreadsheet

Here to kill the idea of SDNs – Spreadsheet Defined Networks – is TR-1001, which defines ways to implement IP-based media facilities while avoiding some typical mistakes and easing the support burden.

From the JT-NM (Joint Task Force on Networked Media), TR-1001 promises to be a very useful document for companies implementing ST-2110 or any video-over-IP network. Explaining what’s in it is EEG’s Bill McLaughlin at the VSF’s IP Showcase at NAB.

This isn’t the first time we’ve written about TR-1001 at The Broadcast Knowledge. Previously, Imagine’s John Mailhot has dived in deep as part of a SMPTE standards webcast. Here, Bill takes a lighter approach to get over the main aims of the document and adds details about recent testing which happened across several vendors.

Bill looks at the typical issues that people find when initially implementing a system with ST-2110 devices and summarises the ways in which TR-1001 mitigates these problems. The aim here is to enable, at least in theory, many nodes to be configured in an automatic and self-documenting way.

Bill explains that TR-1001 covers timing, discovery and connection of devices, plus some aspects of configuration and monitoring. As we would expect, ST-2110 itself defines the media transport and also some of the timing. Work is still to be done to help TR-1001 address security aspects.

Speaker

Bill McLaughlin
VP Product Development,
EEG Enterprises

Video: Monetization with Manifest Manipulation

Manipulating the manifest of streamed video allows localisation of adverts with the option of per-client customisation. This results in better monetisation and also provides a better way to deal with blackouts and other regulatory or legal restrictions.

Most streamed video is delivered using a playlist, a simple text file listing the locations of the many files that contain the video. This means different playlists can be delivered to clients in different locations, detected by geolocating the IP address. Similarly, different ads can be delivered depending on the type of client requesting the stream: phone, tablet, computer etc.
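As a minimal sketch of the idea, the function below swaps a placeholder ad segment in an HLS playlist for a region-specific one before the playlist is returned to the client; the client’s region is assumed to have already been determined by geolocating its IP address. All URLs, the placeholder convention and the localise_manifest helper are hypothetical.

```python
# Hypothetical table mapping a client's region to a local ad segment
REGIONAL_ADS = {
    "UK": "https://ads.example.com/uk/ad_30s.ts",
    "DE": "https://ads.example.com/de/ad_30s.ts",
}

def localise_manifest(manifest: str, region: str) -> str:
    """Replace the generic ad placeholder segment with a region-specific one."""
    ad_url = REGIONAL_ADS.get(region, "https://ads.example.com/default/ad_30s.ts")
    return manifest.replace("https://ads.example.com/placeholder/ad_30s.ts", ad_url)

playlist = """#EXTM3U
#EXT-X-TARGETDURATION:30
#EXTINF:10.0,
https://cdn.example.com/content/seg_001.ts
#EXT-X-DISCONTINUITY
#EXTINF:30.0,
https://ads.example.com/placeholder/ad_30s.ts
#EXT-X-DISCONTINUITY
#EXTINF:10.0,
https://cdn.example.com/content/seg_002.ts
"""

# Two clients requesting the same playlist can receive different ads.
print(localise_manifest(playlist, "UK"))
```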

Here, Imagine’s Yuval Fisher starts by reminding us how online streaming typically works, using HLS as an example. He then leads us through the possibilities of manifest manipulation. One interesting idea is using it to remove hardware, delivering cost savings by using the same infrastructure for both internet and broadcast delivery. Yuval finishes up with a list of “Dos and Don’ts” explaining the best way to achieve the playlist manipulation.

Sarah Foss rounds off the presentation explaining how manifest manipulation sits at the centre of the rest of the ad-delivery system.

Watch now!

Speakers

Yuval Fisher
CTO, Distribution,
Imagine Communications
Sarah Foss
Former SVP & GM, Ad Tech,
Imagine Communications