Video: Tech Talk: Production case studies – the gain after the pain

Technology has always been harnessed to improve, change and reinvent production. Automated cameras, LED walls, AR and LED lighting, among many other technologies, have all enabled productions to be made differently, creating new styles and even new types of programming.

In this Tech Talk from IBC 2019, we look at disruptive new technologies that change production, explained by the people who are implementing them and pushing the methods forward.

TV2 Norway’s Kjell Ove Skarsbø explains how they developed a complete IP production flow and playout facility, giving them more flexibility and scalability. They did this by creating their own ESB (Enterprise Service Bus) to decouple the equipment from direct integrations, working in an agile fashion and delivering incremental improvements. This means that Provys, Mayam, Viz, Mediator and other equipment communicate with each other by delivering messages into a system framework which passes them on in a standard format.
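
As a rough illustration of that decoupling (the message schema and class names here are hypothetical, not TV2’s actual framework), each system publishes events to the bus, which wraps them in a standard envelope and forwards them to any subscriber, so no system needs a direct integration with another.

```python
# Illustrative ESB-style decoupling (hypothetical message schema, not TV2's
# implementation): producers post native events, the bus normalises them into
# a common envelope and fans them out to subscribers.
import json
import uuid
from datetime import datetime, timezone


class ServiceBus:
    def __init__(self):
        self.subscribers = {}  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, source_system, topic, payload):
        # Wrap the native payload in a standard envelope so that consumers
        # never need to integrate directly with the source system.
        envelope = {
            "id": str(uuid.uuid4()),
            "source": source_system,   # e.g. "Provys", "Mayam", "Viz"
            "topic": topic,            # e.g. "asset.updated"
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "payload": payload,
        }
        for callback in self.subscribers.get(topic, []):
            callback(envelope)


bus = ServiceBus()
bus.subscribe("asset.updated", lambda msg: print(json.dumps(msg, indent=2)))
bus.publish("Mayam", "asset.updated", {"assetId": "A123", "status": "ready"})
```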

Importantly, Kjell shares some of the mistakes made along the way, such as the difficulties that came with the sheer size of the project and the importance of programmers understanding broadcast. “Make no compromise” is one of the lessons learnt that he discusses.

Olie Baumann from MediaKind presents live 360º video delivery. “Experiences that people have in VR embed themselves more like memories than experiences like television,” he explains. Olie starts by explaining the lay of the land in today’s VR equipment market, then looks at some of the applications of 360º video, such as looking around from an on-car camera in racing.

Olie talks us through a case study where he worked with Tiledmedia to deliver an 8K 360º stream in which full resolution is sent only in the direction the viewer is looking, with a lower resolution for the rest. When you move your head, the full-resolution area moves to match. We then look through the system diagram to understand which parts run in the cloud and what happens where.
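
A simplified sketch of that viewport-dependent idea (the tile grid, field of view and function names are assumptions for illustration, not the Tiledmedia implementation): only tiles that fall inside the viewer’s current field of view are requested at full resolution, and the set of full-resolution tiles is recomputed as the head pose changes.

```python
# Simplified viewport-dependent tile selection (illustrative numbers only).
# The 360º frame is split into an equirectangular grid of tiles; tiles inside
# the viewer's field of view are fetched at full resolution, the rest fall
# back to a low-resolution base layer.

TILE_COLS, TILE_ROWS = 16, 8   # assumed tile grid
HFOV, VFOV = 100.0, 70.0       # assumed headset field of view (degrees)


def angular_distance(a, b):
    """Shortest distance between two angles, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)


def tiles_for_viewport(yaw, pitch):
    """Return (col, row) tiles to fetch at full resolution for a head pose."""
    full_res = []
    for row in range(TILE_ROWS):
        for col in range(TILE_COLS):
            # Centre of this tile expressed as yaw/pitch.
            tile_yaw = (col + 0.5) / TILE_COLS * 360.0 - 180.0
            tile_pitch = 90.0 - (row + 0.5) / TILE_ROWS * 180.0
            if (angular_distance(tile_yaw, yaw) <= HFOV / 2
                    and abs(tile_pitch - pitch) <= VFOV / 2):
                full_res.append((col, row))
    return full_res


# Looking straight ahead, then after turning the head to the right:
print(len(tiles_for_viewport(0.0, 0.0)), "full-resolution tiles")
print(len(tiles_for_viewport(90.0, 0.0)), "full-resolution tiles after turning")
```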

Matthew Brooks and Thomas Preece from BBC R&D explain their work in taking object-based media from the research environment into mainstream production. This allows productions to deliver object-based media, meaning the receiving device can present the objects in the best way for its display. In today’s world of second screens, screen sizes vary, and small screens can benefit from larger, or less, text. It also allows for interactivity, where programmes fork and adapt to the viewer’s tastes, opinions and choices. Finally, they have delivered a tool to help productions manage this themselves, which can even create a linear version of the programme to maximise the value gained from the time and effort spent in creating these unique productions.
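
As a hedged sketch of what object-based delivery lets a client do (the object model below is hypothetical, not BBC R&D’s actual tooling), the device receives the programme as objects plus rules and chooses a presentation suited to its own screen.

```python
# Hypothetical object-based rendering decision: the programme arrives as
# objects, and the client decides how to present them for the screen it has.

def layout_for_device(objects, screen_width_px):
    """Pick per-object presentation based on the display the viewer has."""
    small_screen = screen_width_px < 720
    rendered = []
    for obj in objects:
        if obj["type"] == "caption":
            rendered.append({
                "id": obj["id"],
                "font_size": 32 if small_screen else 18,  # bigger text on phones
                "text": obj["text"],
            })
        elif obj["type"] == "graphic" and small_screen:
            continue  # drop dense graphics that would be unreadable on a phone
        else:
            rendered.append(obj)
    return rendered


programme = [
    {"id": "c1", "type": "caption", "text": "Scores at half time"},
    {"id": "g1", "type": "graphic", "asset": "league_table.png"},
]
print(layout_for_device(programme, screen_width_px=480))   # phone
print(layout_for_device(programme, screen_width_px=1920))  # living-room TV
```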

Watch now!

Speakers

Kjell Ove Skarsbø
Chief Technology Architect,
TV2 Norway
Olie Baumann
Senior Technical Specialist,
MediaKind
Matthew Brooks
Lead Engineer,
BBC Research & Development
Thomas Preece
Research Engineer,
BBC Research & Development
Stephan Heimbecher

Video: Real World IP – PTP

PTP, Precision Time Protocol, underpins the recent uncompressed video and audio over IP standards. It takes over the role of facility-wide synchronisation from black and burst signals. So it’s no surprise that The Broadcast Bridge invited Meinberg to speak at their ‘Real World IP’ event exploring all aspects of video over IP.

Daniel Boldt, Head of Software Engineering at Meinberg, explains how you can accurately transmit time over a network. He summarises the way PTP accounts for the time taken for messages to move from A to B. Daniel covers different types of clock, explaining the often-heard terms ‘boundary clock’ and ‘transparent clock’ and exploring their pros and cons.
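
As a refresher on the arithmetic behind that message exchange, here is a minimal sketch of the standard two-way PTP exchange with timestamps t1–t4, which relies on the assumption that the path delay is symmetric in both directions.

```python
# The standard PTP two-way exchange gives four timestamps:
#   t1: master sends Sync, t2: slave receives it,
#   t3: slave sends Delay_Req, t4: master receives it.
# Assuming a symmetric path, offset and mean path delay follow directly.

def ptp_offset_and_delay(t1, t2, t3, t4):
    offset = ((t2 - t1) - (t4 - t3)) / 2           # slave clock error vs master
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2  # one-way network delay
    return offset, mean_path_delay


# Toy numbers in microseconds: slave runs 50 µs fast, network delay 100 µs each way.
t1 = 0.0
t2 = t1 + 100.0 + 50.0   # propagation plus the slave's clock offset
t3 = 300.0
t4 = t3 + 100.0 - 50.0
print(ptp_offset_and_delay(t1, t2, t3, t4))        # -> (50.0, 100.0)
```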

Unlike black and burst, which is a distributed signal, PTP is a system with bi-directional communication, which makes redundancy all the more critical and, in some ways, complicated. Daniel talks about different ways to attack the main/reserve problem.

PTP is a cross-industry standard, so devices need a way to map PTP time into an understanding of how the signal should look in order for everything to be in time. SMPTE ST 2059 does this job, which Daniel covers.
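
A hedged sketch of the underlying idea (simplified; ST 2059 also defines the exact epoch, drop-frame handling and audio alignment): given the same absolute PTP time and a known frame rate, every device can independently work out where in the video frame cycle “now” falls, so they all line up without a distributed reference signal.

```python
# Simplified illustration of frame alignment from absolute time (not the full
# ST 2059 calculation): devices that agree on PTP time also agree on phase.
from fractions import Fraction

FRAME_RATE = Fraction(25, 1)   # assumed 25 fps for illustration
EPOCH = 0                      # seconds; ST 2059 defines the real epoch


def frame_phase(ptp_time_seconds):
    """Return (frames since epoch, phase within the current frame, 0..1)."""
    elapsed = Fraction(ptp_time_seconds) - EPOCH
    frames = elapsed * FRAME_RATE
    frame_count = int(frames)
    phase = float(frames - frame_count)
    return frame_count, phase


# Any two devices evaluating this at the same PTP time get the same answer:
print(frame_phase(Fraction(123456789, 1000)))   # 123456.789 s after the epoch
```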

Daniel then looks at a case study of delivering PTP over a WAN, something many assume is not practical, and shows how this was done without using a GPS antenna at the destination. To finish off the talk, there’s a teaser of the new features coming in the backwards-compatible PTP version 2.1, followed by a Q&A.

This is part of a series of videos from The Broadcast Bridge.

Watch now!
Speakers

Daniel Boldt
Head of Software Engineering
Meinberg

Video: ATSC 3.0

“OTT over the air” – ATSC 3.0 deployment has started in the US and it has already been deployed in Korea. It promises to bring interactivity and ‘internet-style’ services to broadcast TV, as well as allowing ‘TV’ to move to mobile devices. To help understand what ATSC 3.0 enables, NABShow Live brings together Sinclair’s Mark Aitken, Bill Hayes from Iowa Public Television and SMPTE’s Thomas Bause Mason, all of whom are deeply involved in the development of ATSC 3.0.

The panelists dive into what ATSC 1.0 was and how we got to 3.0, outlining the big things that have changed. One key change is that broadcasters can now choose how robust the stream is, balanced against bandwidth. Not only that, but multiple streams with different levels of robustness are possible within the same channel. This allows ATSC 3.0 to be tailored to your market and to support different business models.
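
To make that robustness-versus-bandwidth trade concrete, here is a purely illustrative calculation; the symbol rate and modulation/coding figures are assumptions for illustration only, not real ATSC 3.0 pipe capacities.

```python
# Purely illustrative robustness-vs-bandwidth trade: more robust modulation
# and coding carries fewer bits, so a broadcaster can run one rugged mobile
# stream alongside a higher-capacity fixed-reception stream in the same channel.

USEFUL_SYMBOL_RATE = 5.0e6   # assumed usable symbols/second in the channel


def payload_mbps(bits_per_symbol, code_rate):
    return USEFUL_SYMBOL_RATE * bits_per_symbol * code_rate / 1e6


profiles = {
    "robust mobile (QPSK, rate 1/2)":   payload_mbps(2, 1 / 2),
    "middle ground (16QAM, rate 3/4)":  payload_mbps(4, 3 / 4),
    "high capacity (256QAM, rate 5/6)": payload_mbps(8, 5 / 6),
}
for name, rate in profiles.items():
    print(f"{name}: ~{rate:.1f} Mbit/s")
```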

ATSC 3.0, as Bill Hayes says, was ‘built to evolve’ and to deal with new standards as they come along, and he was at pains to point out that all these advancements came without any extra spectrum allocations. Thomas outlined that not only is SMPTE on the board of ATSC, but the broadcast standards upstream of distribution now need to work and communicate with those downstream. HDR, for instance, needs metadata, and moving that metadata is covered by one of the standards SMPTE has created. As Mark Aitken says, ‘the lines are blurring’, with devices at the beginning and the end of the chain both being responsible for correct results on the TV.

The session ends by asking what the response has been from broadcasters. Are they embracing the standard? After all, they are not obliged to use ATSC 3.0.
Thomas says that interest has picked up, with both large and small networks now showing more interest and 50 broadcasters already having committed to it.

Watch now!
Speakers

Thomas Bause Mason
Director Standards Development,
SMPTE
Bill Hayes
Director of Engineering & Technology
Iowa Public Television
Mark Aitken
SVP of Advanced Technology,
Sinclair Broadcast Group
Linda Rosner
Managing Director,
Artisans PR

Video: The 7th Circle of Hell; Making Facility-Wide Audio-over-IP Work


When it comes to IP, audio has always been ahead of video. Whilst audio often makes up in scale for what it lacks in bandwidth, its relatively low per-stream requirements meant computing was up to the task of audio-over-IP long before uncompressed video-over-IP. Despite the early lead, audio-over-IP isn’t necessarily trivial, so this talk aims to give you a heads-up on the main hurdles so you can address them right from the beginning.

Matt Ward, Head of Audio at UK-based Jigsaw24, starts this talk by reviewing the reasons to go audio over IP (AoIP). The benefits vary for each company: for some, reducing cabling is the benefit; many are hoping it will be cheaper; for others, achievable scale is key. Matt is quick to point out the drawbacks we should be cautious of, not least of which are complexity and skills gaps.

Matt fast-tracks us to better installations with a list of easy wins, some of which are basic but disproportionately important as the project continues, e.g. naming paths and devices and grouping IP addresses logically. Others are more nuanced, like ensuring cable performance: for CAT6 cabling, it’s easy to have companies test each of your cables to confirm the cable and all terminations are still working at peak performance.
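
As an illustration of the kind of discipline Matt means, here is a hypothetical naming and addressing plan; the conventions and subnets are invented for the example, the point being that a name or address alone should tell you what and where a device is.

```python
# Hypothetical naming/addressing plan: names encode location and role, and
# IP addresses are grouped by function so an address alone identifies the
# kind of device.
import ipaddress

FUNCTION_SUBNETS = {
    "console":  ipaddress.ip_network("10.10.10.0/24"),
    "stagebox": ipaddress.ip_network("10.10.20.0/24"),
    "ptp":      ipaddress.ip_network("10.10.30.0/24"),
}


def plan_device(studio, function, index):
    name = f"{studio}-{function}-{index:02d}"          # e.g. "stu1-stagebox-03"
    address = FUNCTION_SUBNETS[function][10 + index]   # keep low addresses free
    return name, str(address)


print(plan_device("stu1", "stagebox", 3))   # ('stu1-stagebox-03', '10.10.20.13')
```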

Planning your timing system is highlighted as next on the road to success, with smaller facilities more susceptible to problems if they only have one clock. Any facility’s timing has to be carefully considered, and Matt points out the role of the Best Master Clock Algorithm (BMCA) in deciding which clock the system follows.
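
For context, the BMCA ranks the clocks that announce themselves on the network by comparing their datasets field by field; the sketch below shows that comparison order with made-up example clocks.

```python
# Simplified Best Master Clock Algorithm comparison: IEEE 1588 compares the
# announced datasets field by field, lower values winning at each step:
# priority1, clock class, clock accuracy, variance, priority2, then clock
# identity as a final tie-break.

def bmca_key(clock):
    return (
        clock["priority1"],
        clock["clock_class"],
        clock["clock_accuracy"],
        clock["variance"],
        clock["priority2"],
        clock["clock_identity"],
    )


announced = [
    {"name": "GM-A", "priority1": 128, "clock_class": 6, "clock_accuracy": 0x21,
     "variance": 0x4E5D, "priority2": 128, "clock_identity": "00:1c:73:ff:fe:00:00:01"},
    {"name": "GM-B", "priority1": 128, "clock_class": 6, "clock_accuracy": 0x21,
     "variance": 0x4E5D, "priority2": 129, "clock_identity": "00:1c:73:ff:fe:00:00:02"},
]
grandmaster = min(announced, key=bmca_key)
print("Elected grandmaster:", grandmaster["name"])   # GM-A wins on priority2
```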

Network considerations are the final stop on the tour, underlining that audio doesn’t have to run on its own network as long as QoS is used to maintain performance. Matt details his reasons for keeping Spanning Tree Protocol off unless you explicitly know that you need it on. The talk finishes by discussing multicast distribution and IGMP snooping.
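
As a rough illustration of the QoS point, the sketch below maps DSCP markings to switch queues so that clock and audio traffic are served before everything else; the specific values are examples only, since ecosystems such as AES67 and Dante use different defaults and your switches should be configured to match what your devices actually mark.

```python
# Illustrative QoS classification on a shared network (DSCP values are
# examples only): clock and audio traffic go to priority queues, everything
# else stays best effort.

ILLUSTRATIVE_DSCP_QUEUES = {
    56: "highest priority (e.g. PTP event messages)",
    46: "high priority (e.g. audio streams)",
    0:  "best effort (everything else)",
}


def queue_for_packet(dscp):
    return ILLUSTRATIVE_DSCP_QUEUES.get(dscp, "best effort (everything else)")


for dscp in (56, 46, 0, 10):
    print(dscp, "->", queue_for_packet(dscp))
```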

Watch now!
Speaker

Matt Ward
Head of Audio,
Jigsaw24