Video: Remote editing, storage, cloud dynamics & reopening production

The pandemic pulled the rug out from under the production industry, in both film and television. The scramble to finish projects and fill TV schedules has resulted in a lot of creative ideas and a surge in remote editing. This panel looks at the benefits of working this way and considers whether it will continue once restrictions are lifted.

In this video, we hear from Sony, Teradici, Lou Wirth Productions, EditShare and PADEM Group on the gaping hole in workflows left by the pandemic and how the industry has bridged the gap with remote editing.

Moderated by Allan McLennan, IET Media executive board director and chief executive of PADEM Group, the panel answers questions such as “What are the challenges of moving to remote editing?” and “Can remote editing open up diversity in this part of the industry?”, and looks to the future at new technologies for meeting the streaming demand.

“One of the challenges with a technology transition is people often need a motivation”

Stephen Tallamy, EditShare

“It’s easy to keep doing the thing you used to do until you’re forced to do it,” explains EditShare’s Stephen Tallamy. But the panel doesn’t see the pandemic as just something that forced a change; rather, they see the benefits in the move towards remote editing and remote collaboration. David Rosen from Sony was positive, saying that “Creative resources can be anywhere and the elimination of having to move those people to where the content is…is a significant advantage.” From his perspective, increasing numbers of customers have cloud as part of their workflow.

“Never again.” My customers are saying, “Never again will I be in a situation where I cannot get access to my content.”

David Rosen, Sony

The panel’s discussion moves to remote editing: the practice of giving editors access to remote computers which run the editing software and have access to the relevant media. The editor’s local computer then becomes a window on to an edit suite in a different building, or in the cloud. Ian Main from Teradici explains that a company can open an edit station up to an editor anywhere in the world, which is why this is such an important part of enabling work to continue in an emergency. Teradici specialises in developing and deploying high-performance remote control of PCs, and Stephen Tallamy speaks from the experience of using Teradici to enable remote editing workflows on AWS, other cloud providers and data centres.

“The production side shut down, but the post-production side accelerated.”

Ian Main, Teradici
Lou Wirth, award-winning editor and producer, joins the panel as someone who has continued to edit locally. “For producers who were forced to go into a remote editing situation, they may have always been on the fence about it,” Lou says, “…If it was a good experience, they would see the advantages of it and continue.” Indeed, the consensus does seem to be that much of what’s happening now will be fed back into the workflows of the future even when restrictions are lifted.

Listen to the whole discussion, which includes a look ahead to IBC.

Watch now!
Speakers

Ian Main
Technical Marketing Principal,
Teradici
David Rosen
VP, Cloud Applications & Solutions,
Sony
Stephen Tallamy
Chief Technology Officer,
EditShare
Lou Wirth
Head Story Editor,
Lou Wirth Productions
Moderator: Allan McLennan
Chief Executive, Global Market Technologist, PADEM Media Group,
Executive Board Director, IET Media Technology Network

Video: Maintaining Colour Spaces

Getting colours right is tricky. Many of us get away without considering colour spaces in both our professional and personal lives. But if you’ve ever wanted to print a logo in exactly the right colour, you may have found out the hard way that the colour in your JPEG doesn’t always match the CMYK of the printer. Here, we’re talking, of course, about colour in video. With SD’s 601 and HD’s 709 colour spaces, how do we keep colours correct?

Rec. ITU-R BT.601-7, also known as Rec. 601, is the colour space standardised for SD video; Rec. ITU-R BT.709-6, also known as Rec. 709, is typically used for HD video. Anyone who wants to brush up on what a colour space is should check out this excellent talk from Vimeo’s Vittorio Giovara. He’s a great communicator and we have a number of other talks from him.
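To see why keeping track of the colour space matters, here’s a minimal sketch (our own illustration, not taken from the video) of what happens when pixels encoded with Rec. 709’s luma coefficients are decoded as if they were Rec. 601:

```python
# Rec. 601 and Rec. 709 use different luma coefficients, so the same
# Y'CbCr triple decodes to different RGB values under each standard.

def rgb_to_ycbcr(r, g, b, kr, kb):
    """Convert full-range R'G'B' (0..1) to Y'CbCr using constants kr/kb."""
    kg = 1.0 - kr - kb
    y = kr * r + kg * g + kb * b
    cb = (b - y) / (2 * (1 - kb))
    cr = (r - y) / (2 * (1 - kr))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    """Invert the matrix above with (possibly different) constants."""
    kg = 1.0 - kr - kb
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / kg
    return r, g, b

REC601 = (0.299, 0.114)    # (kr, kb) per Rec. 601
REC709 = (0.2126, 0.0722)  # (kr, kb) per Rec. 709

# Encode a pure green as Rec. 709, then wrongly decode it as Rec. 601:
y, cb, cr = rgb_to_ycbcr(0.0, 1.0, 0.0, *REC709)
r, g, b = ycbcr_to_rgb(y, cb, cr, *REC601)
print(round(r, 3), round(g, 3), round(b, 3))  # no longer (0, 1, 0)
```

The round trip only recovers the original colour when both ends agree on the coefficients, which is exactly the ‘chain of custody’ problem the talk addresses.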

In this talk starting 28 minutes into the Twitch feed, Matt Szatmary exposes a number of problems. The first is the inconsistent, and sometimes wrong, way that browsers interpret colours in videos. Second is that FFmpeg only maintains colour space information in certain circumstances and, lastly, he exposes the colour changes that can occur when you’re not careful about maintaining the ‘chain of custody’ of colour space information.

Matt starts by explaining that the ‘VUI’, the Video Usability Information, found in AVC and HEVC conveys colour space information among other things, such as aspect ratio. VUI was introduced with AVC; it doesn’t affect the encoding itself but indicates to decoders things to consider during the decoding process. We then see a live demonstration of Matt using FFmpeg to move videos through different colour spaces and the immediate results in different browsers.

This is an illuminating talk for anyone who cares about actually displaying the correct colours and brightnesses, particularly given how many video pipelines are built on FFmpeg. Matt demonstrates how to ensure FFmpeg maintains the correct information.

Watch now!
Download the scripts used in the video
Speakers

Matt Szatmary Matt Szatmary
Senior Video Encoding Engineer,
Mux

Video: A State-of-the-Industry Webinar: Apple’s LL-HLS is finally here

Even after restrictions are lifted, it’s estimated that overall streaming subscriptions will remain 10% higher than before the pandemic. We’ve known for a long time that streaming is here to stay and viewers want their live streams to arrive quickly and on a par with broadcast TV. There have been a number of attempts at this: the streaming community extended HLS to create LHLS, which brought latency down considerably without making major changes to the de facto standard.

MPEG’s DASH also has a standard for low-latency streaming, allowing CMAF to be used to get latency down even further than LHLS. Then Apple, the inventor of the original HLS, announced Low-Latency HLS (LL-HLS). We’ve looked at all of these previously here on The Broadcast Knowledge. This Online Streaming Primer is a great place to start. If you already know the basics, then there’s no one better than Will Law to explain the details.

The big change since Will Law’s talk above is that Apple has revised its original plan. This talk from Pieter-Jan Speelmans, CTO and founder of THEOplayer, explains how Apple has modified its approach to low latency. Starting with a reminder of the latency problem with HLS, Pieter-Jan explains how Apple originally wanted to implement LL-HLS with HTTP/2 push and the problems that caused. That has now changed, and this talk gives us a first glimpse of how well the new approach works.

Pieter-Jan talks about how LL-DASH streams can be repurposed for LL-HLS, explains the protocol overheads and discusses the optimal settings for segment and part length. He explains how segment length affects both overall latency and start-up latency, as well as the ability to navigate the ABR ladder without buffering.
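The trade-off can be sketched roughly like this (our own illustrative formulas and numbers, not figures from the talk): a player typically holds a few parts of buffer, so shorter parts mean lower live-edge latency, but also more HTTP requests per second per viewer:

```python
# Rough sketch of the LL-HLS part-length trade-off.
# All formulas and numbers here are illustrative assumptions.

def llhls_estimates(segment_s, part_s, parts_buffered=3, rtt_s=0.05):
    """Return (approx. live-edge latency in s, part requests per second)."""
    latency = parts_buffered * part_s + rtt_s   # buffer of a few parts + RTT
    requests_per_s = 1.0 / part_s               # one request per part
    return latency, requests_per_s

for part in (1.0, 0.5, 0.25):
    lat, rps = llhls_estimates(segment_s=6.0, part_s=part)
    print(f"part={part}s -> ~{lat:.2f}s latency, {rps:.1f} req/s")
```

Halving the part duration roughly halves the latency but doubles the request load on the CDN, which is why the ‘optimal’ settings are a balance rather than simply ‘as short as possible’.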

There was a lot of frustration initially within the community at the way Apple introduced LL-HLS, both because of how it was approached and because of the problems implementing it. Now that the technical issues have been, at least partly, addressed, this is the first of hopefully many talks looking at the reality of the latest version. With an expected ‘GA’ date of September, it won’t be long before nearly all Apple devices can receive LL-HLS, and using the protocol will need to be part of the playbook of many streaming services.

Watch now to get the full detail

Speaker

Pieter-Jan Speelmans Pieter-Jan Speelmans
CTO & Founder,
THEOplayer

Video: Bandwidth Prediction for Multi-Bitrate Streaming at Low Latency

Low-latency protocols like CMAF are wreaking havoc with traditional ABR algorithms, and we’re having to come up with new ways of assessing whether we’re running out of bandwidth. Traditionally, this is done by looking at how long a video chunk takes to download and comparing that with its playback duration. If you’re downloading at the same speed as it’s playing, it’s time to consider switching to a lower-bandwidth stream.
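The traditional rule is simple enough to sketch in a few lines (a simplified illustration of the idea described above; real players add smoothing and safety margins):

```python
# Sketch of the classic ABR throughput rule: compare how long a chunk
# took to download with how long it plays for.

def measured_bandwidth_bps(chunk_bytes, download_seconds):
    """Throughput estimate from one completed chunk download."""
    return chunk_bytes * 8 / download_seconds

def should_switch_down(chunk_bytes, download_seconds,
                       chunk_duration_seconds, current_bitrate_bps):
    # If downloading takes about as long as the chunk plays for, or the
    # measured throughput is below the current rung, step down the ladder.
    return (download_seconds >= chunk_duration_seconds
            or measured_bandwidth_bps(chunk_bytes, download_seconds)
               < current_bitrate_bps)

# A 4s chunk of a 5 Mb/s stream (2.5 MB) that took 4.2s to arrive:
print(should_switch_down(2_500_000, 4.2, 4.0, 5_000_000))  # True
```

It’s exactly this measurement of `download_seconds` that low-latency delivery breaks, as the rest of the article explains.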

As latencies have come down, servers now start sending data from the beginning of a chunk as it’s being written, which means it can’t be downloaded any quicker than real time. To learn more about this, look at our article on ISO BMFF and this streaming primer. Since the file can’t be downloaded any quicker, we can’t ascertain whether we should move up in bitrate to a better-quality stream. So while we can switch down if we start running out of bandwidth, we can’t tell when it’s safe to go up.

Ali C. Begen and his team have been working on a way around this. The problem is that with the newer protocols you pre-request files, which start getting sent when they are ready, so you don’t actually know when a chunk starts downloading to you. Whilst you know when it finished, you don’t have access, via JavaScript, to when the file started being sent to you, robbing you of a way of determining the download time.

Ali’s algorithm uses the time the last chunk finished downloading in place of the missing timestamp, figuring that the new chunk is going to start loading soon after the old one finished. Looking at the data, though, the gap between one chunk finishing and the next starting does vary. This led Ali’s team to a sliding-window moving average taking the last 3 download durations into consideration, which is enough to smooth out some of those variances and provides the data to predict future bandwidth and decide whether to change bitrate. There have been a number of alternative suggestions over the last year or so, all of which perform worse than this technique, called ACTE.
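The core idea can be sketched as follows (a simplified stand-in for illustration, not the published ACTE algorithm): treat the previous chunk’s finish time as a proxy for the next chunk’s start time, then smooth the resulting rates with a 3-sample sliding window:

```python
from collections import deque

class LowLatencyBandwidthEstimator:
    """Proxy-start bandwidth estimate with a sliding-window average."""

    def __init__(self, window=3):
        self.rates = deque(maxlen=window)  # last few rate samples, bits/s
        self.last_finish = None

    def on_chunk_downloaded(self, finish_time_s, chunk_bytes):
        if self.last_finish is not None:
            # Use the previous finish time as the assumed start time.
            duration = finish_time_s - self.last_finish
            if duration > 0:
                self.rates.append(chunk_bytes * 8 / duration)
        self.last_finish = finish_time_s

    def estimate_bps(self):
        return sum(self.rates) / len(self.rates) if self.rates else None

est = LowLatencyBandwidthEstimator()
for finish_t, size in [(0.0, 0), (1.0, 500_000), (2.1, 500_000), (3.0, 450_000)]:
    est.on_chunk_downloaded(finish_t, size)
print(est.estimate_bps())  # average of the last three rate samples
```

The sliding window is what absorbs the variable gap between chunks: a single noisy sample moves the estimate a little rather than dictating the next bitrate decision outright.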

In the last section of this talk, Ali explores his team’s entry to a Twitch-sponsored competition to keep playback latency close to a second under test conditions with varying bandwidth. Playback speed is key to much work in low-latency streaming, as it’s the best way to trim off a little latency when things are going well and to buy time when you’re waiting for data; the big challenge is doing it without the viewer noticing. The entry used a combination of heuristics and machine learning, which worked so well that they were runners-up in the contest.

Watch now!
Speaker

Ali C. Begen,
Technical Consultant, Comcast
Professor, Computer Science, Özyeğin University