Video: Tech Talk: Production case studies – the gain after the pain

Technology has always been harnessed to improve, change and reinvent production. Automated cameras, LED walls, AR and LED lighting, among many other technologies, have all enabled productions to be made differently, creating new styles and even new types of programming.

In this Tech Talk from IBC 2019, we look at disruptive new technologies that change production, explained by the people who are implementing them and pushing the methods forward.

TV2 Norway’s Kjell Ove Skarsbø explains how they have developed a complete IP production flow and playout facility, giving them more flexibility and scalability. They did this by creating their own ESB (Enterprise Service Bus) to decouple the equipment from direct integrations and, working in an agile fashion, delivered incremental improvements. This means that Provys, Mayam, Viz and Mediator, among other systems, communicate with each other by delivering messages into a framework which passes them on, in a standard format, on their behalf.
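
As a rough sketch of that decoupling idea (not TV2’s actual implementation – the event types and field names below are invented for illustration), each system wraps its events in a common envelope before handing them to the bus, so no system needs to know another vendor’s API:

```python
import json
import uuid
from datetime import datetime, timezone

def make_bus_message(source, event_type, payload):
    """Wrap a system-specific event in a standard envelope for the bus."""
    return {
        "id": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "source": source,        # which system raised the event, e.g. "mam"
        "type": event_type,      # agreed vocabulary, e.g. "asset.ready"
        "payload": payload,      # the detail, in an agreed standard shape
    }

# The MAM announces a finished asset; playout and graphics systems that
# subscribe to "asset.ready" pick it up without any direct integration.
message = make_bus_message(
    source="mam",
    event_type="asset.ready",
    payload={"asset_id": "ABC123", "title": "Evening News", "duration_s": 1800},
)
print(json.dumps(message, indent=2))  # this JSON is what goes onto the bus
```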

Importantly, Kjell shares with us some mistakes that were made along the way, for instance the difficulties caused by the sheer size of the project and the importance of programmers understanding broadcast. “Make no compromise” is one of the lessons learnt that he discusses.

Olie Baumann from MediaKind presents live 360º video delivery. “Experiences that people have in VR embed themselves more like memories than experiences like television,” he explains. Olie starts by explaining the lay of the land in today’s VR equipment landscape, then looks at some of the applications of 360º video, such as looking around from an on-car camera in racing.

Olie talks us through a case study where he worked with Tiledmedia to deliver an 8K viewport, where full resolution is delivered only in the direction the 360º viewer is looking and a lower resolution for the rest. When the viewer moves their head, the full-resolution area moves to match. We then look through the system diagram to understand which parts sit in the cloud and what happens where.
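
A much-simplified sketch of that viewport idea (this is not Tiledmedia’s actual algorithm – the tile size and field of view are assumptions): the 360º picture is split into tiles and only the tiles the viewer is looking towards are requested at full resolution:

```python
def tiles_to_fetch(yaw_deg, fov_deg=90, tile_width_deg=30):
    """Pick which horizontal tiles to request at full resolution.

    yaw_deg: direction the viewer is looking (0-360 degrees).
    Returns (high_res_tiles, low_res_tiles) as lists of tile indices.
    """
    n_tiles = 360 // tile_width_deg
    high, low = [], []
    for i in range(n_tiles):
        centre = i * tile_width_deg + tile_width_deg / 2
        # shortest angular distance between the tile centre and the view direction
        diff = abs((centre - yaw_deg + 180) % 360 - 180)
        if diff <= fov_deg / 2 + tile_width_deg / 2:
            high.append(i)   # inside (or touching) the viewport: full resolution
        else:
            low.append(i)    # outside the viewport: low-resolution fallback
    return high, low

# Viewer looking at 10 degrees: only the tiles around the front of the view
# are fetched in full resolution; the rest of the sphere stays low-res.
print(tiles_to_fetch(yaw_deg=10))
```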

Matthew Brooks and Thomas Preece from BBC R&D explain their work in taking object-based media from the research environment into mainstream production. This work allows productions to deliver object-based media, meaning that the receiving device can display the objects in the best way for its display. In today’s world of second screens, screen sizes vary, and small screens can benefit from larger, or less, text. It also allows for interactivity, where programmes fork and can adapt to the viewer’s tastes, opinions and/or choices. Finally, they have delivered a tool to help productions manage this themselves, and they can even make a linear version of the programme to maximise the value gained from the time and effort spent creating these unique productions.
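
As a toy illustration of the object-based principle (the object description and layout rules below are hypothetical, not BBC R&D’s format), the client receives objects rather than a finished picture and decides how to present them for its own screen:

```python
# Hypothetical object-based description: the programme carries separate
# objects (video, captions, graphics) and the client chooses a presentation.
programme = {
    "video": {"uri": "main.mp4"},
    "caption": {"text": "Goal! 2-1 in the 89th minute."},
    "stats_panel": {"uri": "stats_overlay.png"},
}

def layout_for(screen_width_px):
    """Decide how to present the objects for a given display."""
    if screen_width_px < 600:
        # Small second screen: bigger caption text, drop the dense stats panel.
        return {"video": "fullscreen", "caption_size_pt": 28, "stats_panel": None}
    # Large TV: normal caption size, show the stats panel alongside the video.
    return {"video": "fullscreen", "caption_size_pt": 16, "stats_panel": "bottom-right"}

print(layout_for(480))    # phone
print(layout_for(1920))   # TV
```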

Watch now!

Speakers

Kjell Ove Skarsbø
Chief Technology Architect,
TV2 Norway
Olie Baumann
Senior Technical Specialist,
MediaKind
Matthew Brooks
Lead Engineer,
BBC Research & Development
Thomas Preece
Research Engineer,
BBC Research & Development
Stephan Heimbecher

Video: TR-1001 Replacing Video By Spreadsheet

Here to kill the idea of SDNs – Spreadsheet Defined Networks – is TR-1001, which defines ways to implement IP-based media facilities while avoiding some typical mistakes and easing the support burden.

From the JT-NM (Joint Task Force on Networked Media), TR-1001 promises to be a very useful document for companies implementing ST-2110 or any video-over-IP network. Explaining what’s in it is EEG’s Bill McLaughlin at the VSF’s IP Showcase at NAB.

This isn’t the first time we’ve written about TR-1001 at The Broadcast Knowledge. Previously, Imagine’s John Mailhot dived in deep as part of a SMPTE standards webcast. Here, Bill takes a lighter approach to get across the main aims of the document and adds detail about recent testing carried out across several vendors.

Bill looks at the typical issues that people find when initially implementing a system with ST-2110 devices and summarises the ways in which TR-1001 mitigates these problems. The aim here is to enable, at least in theory, many nodes to be configured in an automatic and self-documenting way.

Bill explains that TR-1001 covers timing, discovery and connection of devices, plus some aspects of configuration and monitoring. As we would expect, ST-2110 itself defines the media transport and also some of the timing. Work is still to be done for TR-1001 to address security aspects.
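
For the discovery and connection part, TR-1001 leans on the AMWA NMOS specifications (IS-04 registration and discovery, IS-05 connection management). As a minimal sketch – the registry address here is an assumption – a script could ask a facility’s registry which senders have announced themselves:

```python
import requests  # third-party HTTP library

# Hypothetical registry address; the query path follows the AMWA NMOS IS-04
# Query API, which TR-1001-style facilities typically use for discovery.
REGISTRY = "http://registry.example.local"

def list_senders():
    """List the media senders that have registered themselves with the registry."""
    resp = requests.get(f"{REGISTRY}/x-nmos/query/v1.2/senders", timeout=5)
    resp.raise_for_status()
    for sender in resp.json():
        print(sender["id"], sender.get("label", "<no label>"))

if __name__ == "__main__":
    list_senders()
```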

Speaker

Bill McLaughlin
VP Product Development,
EEG Enterprises

Video: Monetization with Manifest Manipulation

Manipulating the manifest of streamed video allows localisation of adverts with the option of per-client customisation. This results in better monetisation but also a better way to deal with blackouts and other regulatory or legal restrictions.

Most streamed video is delivered using a playlist: a simple text file which lists the locations of the many files that contain the video. This means you can deliver different playlists to clients in different locations, detected by geolocating the IP address. Similarly, different ads can be delivered depending on the type of client requesting – phone, tablet, computer etc.
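
As a rough illustration of that per-client idea (simplified HLS, with invented CDN and ad URLs), a manifest manipulator could rewrite the ad segments of a media playlist on each request based on the geolocated region:

```python
def localise_manifest(manifest_text, region):
    """Swap placeholder ad segments in an HLS media playlist for regional ones.

    Simplified sketch: real systems typically key off markers such as
    #EXT-X-DISCONTINUITY or SCTE-35-derived cues rather than filenames.
    """
    regional_ads = {
        "uk": ["https://ads.example.com/uk/ad1.ts", "https://ads.example.com/uk/ad2.ts"],
        "us": ["https://ads.example.com/us/ad1.ts", "https://ads.example.com/us/ad2.ts"],
    }
    ads = iter(regional_ads.get(region, regional_ads["us"]))
    out = []
    for line in manifest_text.splitlines():
        if line.startswith("https://ads.example.com/slate/"):
            line = next(ads, line)   # replace the generic slate with a regional ad
        out.append(line)
    return "\n".join(out)

playlist = """#EXTM3U
#EXT-X-TARGETDURATION:6
#EXTINF:6.0,
https://cdn.example.com/content/seg1.ts
#EXT-X-DISCONTINUITY
#EXTINF:6.0,
https://ads.example.com/slate/ad_slot1.ts
#EXT-X-DISCONTINUITY
#EXTINF:6.0,
https://cdn.example.com/content/seg2.ts"""

print(localise_manifest(playlist, region="uk"))
```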

Here, Imagine’s Yuval Fisher starts by reminding us how online streaming typically works, using HLS as an example. He then leads us through the possibilities of manifest manipulation. One interesting idea is using this to remove hardware, delivering cost savings by using the same infrastructure for both internet and broadcast delivery. Yuval finishes up with a list of “Dos and Don’ts” explaining the best way to achieve the playlist manipulation.

Sarah Foss rounds off the presentation explaining how manifest manipulation sits at the centre of the rest of the ad-delivery system.

Watch now!

Speakers

Yuval Fisher
CTO, Distribution,
Imagine Communications
Sarah Foss
Former SVP & GM, Ad Tech,
Imagine Communications

Video: Power Talks – ATSC 3.0

ATSC 3.0 is the major next step in broadcasting for the US, South Korea and other countries, and is a major update to the ATSC standard in so many ways that getting across it all is not trivial. All terrestrial broadcasting in the US is done with ATSC, as opposed to many other places, including Europe, which use DVB.

ATSC 3.0 brings in OFDM modulation, a tried and tested technology also used in DVB. But the biggest change in the standard is that all of the transport within ATSC 3.0 is IP. Broadcasters, using broadband as a return path, now have two-way communication with their viewers, allowing transfer of data as well as media.

In this talk from Imagine Communications, we take a look into the standard which, as is common nowadays, is a suite of standards. These cover emergency alerting, immersive audio, DRM, return paths and more. We then have a look at the system architecture of the ATSC 3.0 broadcast deployed in Phoenix.

South Korea has been pushing forward with ATSC 3.0, and Chet Dagit looks at what they have been doing and how they’ve delivered high-quality UHD channels to the consumer. He then looks at what the US can learn from this work and also from DVB deployments in Europe.

Finally, Yuval Fisher looks at how the data and granularity available in ATSC 3.0 allow for more targeted ads, and how you would both manage that data internally and harness it for ad campaigns.

Watch now!

Speakers

Steve Reynolds
President, P & N Solutions,
Imagine Communications
Mark Corl
SVP of Emergent Technology,
Triveni Digital
Chet Dagit
Founder & Managing Member,
RTP Holdings-Lokita Solutions
Yuval Fisher
CTO, Distribution,
Imagine Communications