Video: Remote editing, storage, cloud dynamics & reopening production

The pandemic pulled the rug from under the feet of the production industry, in both film and television. The scramble to finish projects and to fill TV schedules has resulted in a lot of creative ideas and a surge in remote editing. This panel looks at the benefits of this way of working and considers whether it will continue once restrictions are lifted.

In this video, we hear from Sony, Teradici, Lou Wirth Productions, EditShare and PADEM Group on the gaping hole in workflows left by the pandemic and how the industry has bridged the gap with remote editing.

Moderated by Allan McLennan of PADEM Group, executive board director of the IET Media technology network, the panel answers questions such as “What are the challenges in moving to remote editing?” and “Can remote editing open up diversity in this part of the industry?”, and looks to the future in terms of new technologies for meeting streaming demand.

“One of the challenges with a technology transition is people often need a motivation”

Stephen Tallamy, EditShare

“It’s easy to keep doing the thing you used to do until you’re forced to do it,” explains EditShare’s Stephen Tallamy. But the panel doesn’t see the pandemic as just something that forced a change; rather, they see the benefits in the move towards remote editing and remote collaboration. David Rosen from Sony was positive, saying that “Creative resources can be anywhere and the elimination of having to move those people to where the content is…is a significant advantage.” From his perspective, increasing numbers of customers have cloud as part of their workflow.

“Never again.” My customers are saying, “Never again will I be in a situation where I cannot get access to my content.”

David Rosen, Sony

The panel’s discussion moves to remote editing, the practice of giving editors access to remote computers which run the editing software and have access to the relevant media. The editor’s local computer then becomes a window onto the edit suite in a different building, or in the cloud. Ian Main from Teradici explains that a company can open an edit station up to an editor anywhere in the world, which is why this is such an important part of the solution for enabling work to continue in an emergency. Teradici specialises in developing and deploying high-performance remote control of PCs, and Stephen Tallamy speaks from the experience of using Teradici to enable remote editing workflows on AWS, other cloud providers and data centres.

“The production side shut down, but the post-production side accelerated.”

Ian Main, Teradici

Lou Wirth, award-winning editor and producer, joins the panel as someone who has continued to edit locally. “For producers who were forced to go into a remote editing situation, they may have always been on the fence about it,” Lou says. “…If it was a good experience, they would see the advantages of it and continue.” Indeed, the consensus does seem to be that much of what’s happening now will be fed back into the workflows of the future, even when restrictions are lifted.

Listen to the whole discussion, which includes a look ahead to IBC.

Watch now!
Speakers

Ian Main
Technical Marketing Principal,
Teradici
David Rosen
VP, Cloud Applications & Solutions,
Sony
Stephen Tallamy
Chief Technology Officer,
EditShare
Lou Wirth
Head Story Editor,
Lou Wirth Productions
Moderator: Allan McLennan
Chief Executive, Global Market Technologist, PADEM Media Group,
Executive Board Director, IET Media technology network

Video: Case Study on a Large Scale Distributed ST 2110 Deployment

We’re “past the early-adopter stage” of SMPTE ST 2110, notes Andy Rayner from Nevion as he introduces this case study of a multi-national broadcaster that has created a 2110-based live production network spanning ten countries.

This isn’t the first IP project that Nevion have worked on, but it’s doubtless the biggest to date. And it’s in the context of these projects that Andy says he’s seen the maturing of the IP market in terms of how broadcasters want to use it and, to an extent, the solutions on the market.

Fully engaging with the benefits of IP drives the demand for scale, as people are freer to define a workflow that works best for the business without the constraint of staying within one facility. Part of the point of this whole project is to centralise all the equipment in two shared facilities, with everyone working remotely. This isn’t remote production of an individual show; this is remote production of whole buildings.

SMPTE ST 2110, famously, sends all essences separately, so where a 1024×1024 SDI router might have carried 70% of the media between two locations, we’re now seeing tens of thousands of streams. In fact, the project as a whole is managing in the order of 100,000 connections.
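
To get a sense of why the numbers explode, here’s a rough back-of-the-envelope sketch in Python. The per-source essence count and the ST 2022-7 duplication factor are illustrative assumptions, not figures from the talk:

```python
# Back-of-the-envelope count of ST 2110 media flows.
# All figures here are illustrative assumptions, not from the case study.
sources = 1024                 # an SDI-router-sized pool of sources
essences_per_source = 6        # e.g. 1 video + 4 audio + 1 ancillary data
st2022_7_legs = 2              # every stream duplicated over diverse paths

flows = sources * essences_per_source * st2022_7_legs
print(f"{flows} individual media flows")   # 12288 from one router's worth;
# scale across sites and destinations and six-figure totals follow quickly
```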

With so many connections, many of which are linked, manual management isn’t practical. The only sensible way to manage them is through an abstraction layer. For instance, if you abstract the IP connections from the control layer, an engineer or operator can still have a panel which says ‘Playout Server O/P 3’ and route it with a button that says ‘Prod Mon 2’. Behind the scenes, that may have to make 18 connections across 5 separate switches.
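
As a sketch of what such an abstraction layer might look like (a hypothetical model, not Nevion’s actual orchestration API), the operator routes logical endpoints while the controller fans each route out into per-essence connections:

```python
from dataclasses import dataclass, field

@dataclass
class LogicalSource:
    name: str                                     # e.g. "Playout Server O/P 3"
    essences: dict = field(default_factory=dict)  # essence name -> multicast group

class RoutePanel:
    """One operator button press expands into many low-level connections."""
    def __init__(self, sdn_controller):
        self.sdn = sdn_controller   # stand-in for a real SDN orchestrator

    def take(self, source: LogicalSource, dest_name: str):
        # The operator sees a single route; behind the scenes the
        # controller may program paths across several switches per essence.
        for essence, group in source.essences.items():
            self.sdn.program_path(group, dest_name, essence)

# panel.take(playout_op3, "Prod Mon 2")   # one press, many connections
```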

This orchestration is possible using SDN – Software Defined Networking – where routing decisions are taken away from the individual routers and switches. The problem is that if a switch has to decide how to send some traffic, all it can do is look at its small part of the network and do its best. SDN allows you to have a controller, or orchestrator, which understands the network as a whole and can make much more efficient decisions. For instance, it can make absolutely sure that ST 2022-7 traffic is routed over diverse paths. It can also do bandwidth calculations to stop links being oversubscribed.
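
A minimal sketch of the kind of admission check an SDN orchestrator might perform, assuming a simple link table; this is illustrative, not any specific product’s logic:

```python
# Illustrative SDN admission check: before programming a flow, verify
# link headroom and, for an ST 2022-7 second leg, full path diversity.
# `links` maps link-id -> {"capacity": bps, "used": bps}.
def admit(flow_bps, path, links, primary_path=None):
    for link in path:
        if links[link]["used"] + flow_bps > links[link]["capacity"]:
            return False                      # would oversubscribe a link
    if primary_path is not None and set(path) & set(primary_path):
        return False                          # legs share a link: not diverse
    for link in path:
        links[link]["used"] += flow_bps       # commit the reservation
    return True
```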

Whilst the network is, indeed, based on SMPTE ST 2110, one of the key enablers is JPEG XS for the international links. JPEG XS provides a similar compression level to JPEG 2000 but with much lower latency: the encode itself requires less than 1ms, compared with JPEG 2000’s 60ms. Whilst 60ms may seem small, when video needs to move 4 or even 10 times as part of a production workflow, it soon adds up to a latency that humans can’t work with. JPEG XS promises to allow such international production to feel responsive and natural. Making this possible was the extension of SMPTE ST 2110, for the first time, to allow carriage of compressed video in ST 2110-22.
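
The arithmetic, using the per-pass figures quoted in the talk and an assumed number of hops, shows why this matters:

```python
# Accumulated codec latency when video crosses between sites repeatedly
# during production (per-pass figures as quoted in the talk).
hops = 10
print(f"JPEG 2000: ~{hops * 60} ms")   # ~600 ms: unworkable for operators
print(f"JPEG XS:   ~{hops * 1} ms")    # ~10 ms: feels responsive
```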

Andy finishes his overview of this uniquely large case study by talking about conversion between types of audio, operating SDN with IGMP multicast islands, and NMOS control. In fact, it’s NMOS which is the answer to the final question, asking what the biggest challenge is in putting this type of project together. Clearly, in a project of this magnitude, there are challenges around every corner, but problems due to quantity can be measured and managed. Andy points out that NMOS adoption among manufacturers still needs to be pushed higher, and he lays down the challenge to AMWA to develop NMOS further so that it describes more aspects of the equipment – to date, there are not enough data points.

Watch now!
Speakers

Andy Rayner Andy Rayner
Chief Technologist,
Nevion

Video: 5G Technology

5G seems to offer so much, but there is a lot of nuance under the headlines. Which of the features will telcos actually provide? When will the spectrum become available? How will we cope with the new levels of complexity? Whilst for many 5G will simply ‘work’, when broadcasters look to use it for delivering programming, they need to look a few levels deeper.

In this wide-ranging video from the SMPTE Toronto Section, four speakers take us through the technologies at play and the ways they can be implemented, cutting through the hype to help us understand what could actually be achieved, in time, using 5G technology.

Michael J Martin is first up, covering topics such as spectrum use, modulation, types of cells, beam forming and security. Regarding spectrum, Michael explains that 5G uses three frequency bands: the sub-1GHz spectrum that’s been in use for many years, a 3GHz range and a millimetre-wave range at 26GHz.

“It’s going to be at least a decade until we get 5G as wonderful as 4G is today.”

Michael J Martin
Note that some countries already use other frequencies, such as 1.8GHz, which will also be available. The important issue is that the 26GHz spectrum will typically not be available for over a year, so 5G roll-out starts in some of the existing bands or in the 3.4GHz spectrum. A recurring theme in digital RF is the use of OFDM, which has long been used by DVB and has been adopted by ATSC 3.0 as its modulation, too. OFDM allows different levels of robustness, so you can trade off reach against bandwidth.
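
As a rough illustration of that robustness trade-off (the symbol rate and mode table below are textbook-style assumptions, not a specific 5G configuration), denser constellations and higher code rates buy throughput at the cost of reach:

```python
# Illustrative OFDM trade-off: more bits per symbol and a higher code
# rate raise throughput but need a cleaner signal, shortening reach.
symbol_rate_msym = 10   # assumed aggregate symbol rate, Msym/s
modes = [
    ("QPSK,   rate 1/2", 2, 1/2),   # most robust: longest reach
    ("16-QAM, rate 3/4", 4, 3/4),
    ("64-QAM, rate 5/6", 6, 5/6),   # least robust: highest bitrate
]
for name, bits_per_symbol, code_rate in modes:
    mbps = symbol_rate_msym * bits_per_symbol * code_rate
    print(f"{name}: {mbps:.0f} Mbit/s")   # 10, 30 and 50 Mbit/s
```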

Michael highlights a problem faced in upgrading infrastructure to 5G: the number of towers/sites and the availability of engineers. It’s simply going to take a long time to upgrade them all, even in a small, dense environment. That deals with the upgrade of existing large sites, but 5G also provides for smaller cells (micro, pico and femto cells). These small cells are very important in delivering the millimetre-wavelength part of the spectrum.

Network Slicing
Source: Michael J. Martin, MICAN Communications

We look at MIMO and beam forming next. MIMO is an important technology as it, effectively, collects reflected versions of the transmitted signals and processes them to create stronger reception. 5G uses MIMO in combination with beam forming, where the transmitter electronically manipulates the antenna array to focus the transmission and localise it to a specific receiver or group of receivers.
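
For a feel of how beam forming works, here’s a toy sketch of a uniform linear array, assuming half-wavelength element spacing: a progressive phase shift across the elements steers the combined beam towards a chosen angle:

```python
import math

# Toy phased-array sketch: the phase applied to each element steers the
# combined beam towards angle theta (uniform linear array assumed).
def element_phases(n_elements, theta_deg, spacing_wavelengths=0.5):
    theta = math.radians(theta_deg)
    return [2 * math.pi * n * spacing_wavelengths * math.sin(theta)
            for n in range(n_elements)]

print(element_phases(4, 30))   # per-element phase offsets in radians
```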

Lastly, Michael talks about network slicing, possibly one of the features of 5G most anticipated by the broadcast community. The idea is that a broadcaster can reserve its own slice of the spectrum, so even when sharing an environment with 30,000 other receivers, they will still have the bandwidth they need.
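
Conceptually, a slice behaves like a capacity reservation that public traffic cannot claim. A minimal model, purely for illustration:

```python
# Conceptual model of a network slice (illustrative only): reserved
# capacity that contended public traffic cannot take away.
class Cell:
    def __init__(self, capacity_mbps):
        self.capacity_mbps = capacity_mbps
        self.slices = {}

    def reserve_slice(self, name, mbps):
        free = self.capacity_mbps - sum(self.slices.values())
        if mbps > free:
            raise RuntimeError(f"no capacity left for slice '{name}'")
        self.slices[name] = mbps

cell = Cell(1000)
cell.reserve_slice("broadcaster-uplink", 100)   # guaranteed for the
# broadcaster, however many public devices contend for the rest
```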

Our next speaker, Craig Snow from Huawei, outlines how secondary networks can be created for companies’ private use which, interestingly, partly use separate frequencies from the public network. Network slicing can then separate your enterprise 5G network into distinct networks for production, IT support and so on. Craig then looks at the whole broadcast chain and shows where 5G can be used; we quickly see that there are many uses in live production as well as in distribution. This can also make remote production more practical for some use cases.

Craig moves on to look at physical transmitter options, showing a range of sub-1kg transmitters, many of which have built-in Wi-Fi, and then shows how external microwave backhaul might look for a number of buildings in a local area connecting back to a central tower.

Next is Sayan Sivanathan, who works for Bell Mobility and goes into more detail on the wider range of use cases for 5G. Starting by comparing it to 4G – highlighting the increased data rates, improved spectrum efficiency and greater connection density of devices – he paints a rosy picture of the future. All of these factors support use cases such as remote control and telemetry for automated vehicles (whether in industrial or public settings). Sayan then looks at the deployment status in the US, Europe and Korea, shows the timeline for spectrum auctions in Canada, and talks through photos of 5G transmitters in the real world.

Global Mobile Data Traffic (Exabytes per month)
Source: Ericsson Mobility Report, Nov 2019

Finishing off the session is Tony Jones from MediaKind, who focuses on which 5G features are going to be useful for media and entertainment. One is ‘better video on mobile’. Tony picks up on a topic mentioned by Michael at the beginning of the video: processing at the edge. Edge processing, meaning having compute power at the point of the network closest to your end user, allows you to deliver customised manifests and deal with rights management with minimal latency.
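
As a hedged sketch of what per-user manifest customisation at the edge could look like (hypothetical logic; real packagers and DRM integrations differ), an edge node might filter an HLS master playlist down to the renditions a user is entitled to:

```python
# Hypothetical edge logic: filter an HLS master playlist so a given
# user only sees renditions up to their entitled resolution.
def customise_manifest(master_lines, max_height):
    out, skip_next_uri = [], False
    for line in master_lines:
        if line.startswith("#EXT-X-STREAM-INF"):
            # e.g. "#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080"
            height = int(line.split("RESOLUTION=")[1].split("x")[1].split(",")[0])
            skip_next_uri = height > max_height
            if not skip_next_uri:
                out.append(line)
        elif skip_next_uri:
            skip_next_uri = False   # drop the variant URI that follows
        else:
            out.append(line)
    return out
```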

Tony explains how MediaKind worked with Intel and Ericsson to deliver 5G remote production for the 2018 US Open. 5G is often seen as a great way to make covering golf cheaper, more aesthetically pleasing and also quicker to rig.

The session ends with a Q&A.

Watch now!
Speakers

Michael J Martin
MICAN Communications
Blog: vividcomm.com
Tony Jones
Principal Technologist,
MediaKind Global
Craig Snow
Enterprise Accounts Director,
Huawei
Sayan Sivanathan
Senior Manager – IoT, Smart Cities & 5G Business Development,
Bell Mobility

Video: Case Study FIS Ski World Championship

There’s a lot to learn when it comes to implementing video over IP, so it’s healthy to stand back from the details and see a working system in use to understand how the theory becomes reality. There’s been a clear change in the tone of conversation at the IP Showcase over the years as we’ve shifted from ‘trust us, this could work’ to ‘this is what it looks like!’ That’s not to say there isn’t plenty still to be done, but this talk about an uncompressed 2110 remote production workflow is a great example of how the benefits of IP are being realised by broadcasters.

Robert Erickson from Grass Valley specialises in sports and venues, and presents this case study of the FIS Alpine World Ski Championships, held in Åre in Sweden, some 600km from Stockholm where Sweden’s public broadcaster SVT is based. With 80 cameras at the championships to be remotely controlled over an uncompressed network, this was no small project. Robert explains that the two locations were linked by a backbone of two 100Gbps circuits.

The principle behind SVT’s project was to implement a system which could be redeployed, wouldn’t alter the viewers’ experience and would reduce staff and equipment on site. Interestingly, the director wanted to be on-site, so the production was split between Stockholm, where most of the staff and equipment were, and Åre. The cameras were natively IP, so no converters were needed in the field.

Centralisation was the name of the game: an end-to-end IP chain based in Stockholm. Network switching was provided by Arista, aggregating the camera feeds and bringing them to Stockholm where the CCUs were located. Robert highlights the benefits of this approach, which include the use of COTS switches, scalability and indifference to the circuits in use. We then have a look inside the DirectIP connection, a 10Gbps ‘pipe’ carrying 2022-6 camera and return feeds along with control and talkback, replicating the functionality of an SMPTE camera fibre in IP.
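
A rough budget shows why a 10Gbps pipe comfortably carries this; the figures below are illustrative, assuming ST 2022-6 encapsulation of roughly 1.5Gbps HD-SDI per feed:

```python
# Rough bandwidth budget for one 10 Gbit/s DirectIP 'pipe'
# (illustrative figures; ST 2022-6 encapsulates ~1.5 Gbit/s HD-SDI).
camera_feed = 1.6            # Gbit/s: the 2022-6 camera output
return_feeds = 2 * 1.6       # e.g. programme return plus a second return
control_talkback = 0.1       # generous allowance for control and audio
total = camera_feed + return_feeds + control_talkback
print(f"{total:.1f} of 10 Gbit/s used")   # ~4.9, leaving comfortable headroom
```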

To finish up, Robert talks about the return video feeds, including multiviewers, which were sent back to Åre. A Nimbra setup took advantage of a lower-bandwidth circuit, using JPEG 2000 to send the vision back. In addition, it carried the data to connect the vision mixer/switcher at Åre with the switch in Stockholm. This was the only point at which noticeable latency was introduced, to the tune of around 4 frames.

Watch now!
Download the presentation
Speakers

Robert Erickson Robert Erickson
Strategic Account Manager Sports and Venues,
Grass Valley