Video: Moving Live Video Quality Control from the Broadcast Facility to the Living Room

Moving a 24×7 on-site MCR into people’s homes is not trivial, but Disney Streaming Services, in common with most broadcasters, knew they had to move people home, often into their living rooms. Working in an MCR requires watching incoming video to check content, which is not easy to do at home, particularly when some of the contribution arrives at 100 Mb/s. Their two MCRs, in San Francisco and NYC, covering Hulu Live and ESPN+ along with other services, had two weeks to go remote.

Being a major streaming operator, DSS had their own encoding product called xCoder, and they soon realised this would be their ticket to making home working viable. As standard, these encoders reject any video which doesn’t match a small range of templates. Michael Rappaport takes us through how they wrote scripts which use ffprobe to analyse the desired video and then configure the xCoder in just the right way. The incoming video goes straight to the xCoder without being ‘groomed’ as it normally would be to add closed captions, ABR renditions etc.
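
Michael doesn’t show the scripts themselves, but the probing half is easy to sketch. Here’s a minimal example in Python, assuming ffprobe is on the path and the feed is a multicast UDP source; the returned dictionary holds whatever a hypothetical xCoder configuration call would need:

    import json
    import subprocess

    def probe_feed(url: str) -> dict:
        """Run ffprobe on the contribution feed and pull out the parameters
        an encoder template needs in order to accept the stream as-is."""
        result = subprocess.run(
            ["ffprobe", "-v", "quiet", "-print_format", "json",
             "-show_streams", "-select_streams", "v:0", url],
            capture_output=True, text=True, check=True)
        stream = json.loads(result.stdout)["streams"][0]
        num, den = stream["r_frame_rate"].split("/")
        return {
            "codec": stream["codec_name"],       # e.g. "h264" or "mpeg2video"
            "width": stream["width"],
            "height": stream["height"],
            "frame_rate": int(num) / int(den),   # r_frame_rate is a ratio string
            "interlaced": stream.get("field_order", "progressive") != "progressive",
        }

    params = probe_feed("udp://239.1.1.1:5000")  # address is illustrative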

Aside from bandwidth, it was also important to deliver these streams as close to real-time as possible, as the operators need to see ‘right now’ to do their job effectively. This is another reason the ‘grooming’ stage is skipped: it would add latency, and its added functions such as PID normalisation and closed caption insertion aren’t needed. Michael explains that when a feed is needed, the system calls out to the whole encoder pool, finds an underutilised encoder and programs it automatically using an API.
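
Michael doesn’t detail the pool logic, but the idea is simple to sketch. Assuming a hypothetical REST API on each encoder that reports its load and accepts a channel configuration (the endpoints and field names below are inventions for illustration):

    import requests

    ENCODER_POOL = ["xcoder-01.example.net", "xcoder-02.example.net"]

    def pick_encoder() -> str:
        """Ask every encoder in the pool for its load and return the
        least-utilised one."""
        loads = {}
        for host in ENCODER_POOL:
            status = requests.get(f"https://{host}/status", timeout=2).json()
            loads[host] = status["active_channels"] / status["max_channels"]
        return min(loads, key=loads.get)

    def start_monitoring_feed(source_url: str, params: dict) -> None:
        """Program the chosen encoder to pass the feed through un-groomed,
        using the parameters recovered by the ffprobe script above."""
        host = pick_encoder()
        resp = requests.post(f"https://{host}/channels",
                             json={"input": source_url, **params},
                             timeout=5)
        resp.raise_for_status()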

Watching this at home was made possible by work done by Disney Streaming Services to allow their player to receive feeds directly from an xCoder without any problems with decoder parameters. Michael doesn’t mention which protocol they use but, as the xCoder creates a proprietary video stream, it may well be that stream carried over TCP.

“[We] made our own players to receive from the xCoders. xCoder, as a standalone, produces a proprietary TCP stream. xCoder exposes an API hook that allows us to quickly determine things like frame rate, resolution, and even whether or not the xCoder is able to subscribe to the stream.”

Watch now!
Speakers

Michael Rappaport
Senior Manager, Encoding Administration,
Disney Streaming Services

Video: FOX – Uncompressed live sports in the cloud

Is uncompressed video in the cloud, with just 6 frames of latency to get there and back, ready for production? WebRTC manages sub-second streaming in one direction and can even deliver AV1 in real-time, but the key to getting down to a 100ms round trip is to move to millisecond encoding and to use uncompressed video within the cloud. This video shows how it can be done.

Fox has a clear direction to move into the cloud, and last year joined AWS to explain how they’ve put their delivery distribution into the cloud, remuxing feeds for ATSC transmitters, satellite uplinks and cable headends, and encoding for internet delivery. In this video, Fox’s Joel Williams and AWS’s Evan Statton explain their work together making this a reality. Joel explains that latency is not a very hot topic for distribution, as the distribution chain already contains many delays; the focus has been on getting the contribution feeds into playout and MCR monitoring quickly. After all, when people are counting down to an ad break, it needs to roll exactly on zero.

Evan explains the approach AWS has taken to solving this latency problem, starting by considering SMPTE’s ST 2110 in the cloud. ST 2110 typically has video flows of at least 1 Gbps and, when implemented on-premise, is usually built on a dedicated network with very strict timing. Cloud datacentres aren’t like that, and Evan demonstrates the problem by showing that across 8 video streams there were video drops of several seconds, which is clearly not acceptable. Amazon, however, has a technology called Scalable Reliable Datagram (SRD) which is aimed at moving high-bitrate data through their cloud. Using a very small retransmission buffer, it’s able to use multiple paths across the network to deliver uncompressed video in real-time. Keeping the retransmission buffer very small enables just enough healing to redeliver missing packets within the 16.7ms it takes to deliver a frame of 60fps video.
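
That 16.7ms figure is simply the frame period: 1000ms ÷ 60fps. It sets a hard deadline on any retransmission scheme, which a few lines of Python make concrete. This illustrates the constraint SRD works within, not AWS’s actual algorithm:

    FRAME_PERIOD_MS = 1000 / 60          # ~16.7 ms to deliver one frame at 60fps

    def worth_retransmitting(elapsed_ms: float, path_rtt_ms: float) -> bool:
        """A lost packet is only worth re-requesting if the copy can still
        arrive before the frame's delivery deadline; any later and the frame
        is late regardless, so the request would just waste bandwidth."""
        return elapsed_ms + path_rtt_ms < FRAME_PERIOD_MS

    # 5 ms into the frame, a 2 ms intra-datacentre round trip has ample margin...
    assert worth_retransmitting(5.0, 2.0)
    # ...but a 15 ms round trip can no longer beat the deadline.
    assert not worth_retransmitting(5.0, 15.0)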

On top of SRD, AWS has introduced CDI, the Cloud Digital Interface, which is able to describe uncompressed video flows in a way already familiar to software developers. Its ‘Audio Video Metadata’ layer handles flows in the same way as ST 2110, for instance keeping essences separate, and Evan says this has helped vendors react favourably to the new technology. For them, instead of using raw UDP, SRD can be used with CDI, giving them not only familiar video data structures but also, since SRD is implemented in the Nitro network card, packet processing that is hidden from the application itself.

The final piece of the puzzle is keeping the journey into and out of the cloud low-latency. This is done using JPEG XS, which has an encoding time of a few milliseconds. Rather than using RIST, for instance, to protect the feed on the way into the cloud, Fox is testing ST 2022-7. 2022-7 takes in two identical streams, typically on two network interfaces, so the receiver should end up with two copies of each packet; where one gets lost, the other is still available. This gives path redundancy, which a single stream can never offer. Overall, the test with Fox’s Arizona-based Technology Center is shown in the video to have only 6 frames of latency for the round trip. Assuming they used a California-based AWS data centre, the ping time may have been as low as two frames, leaving four frames for 2022-7 buffers, XS encoding and uncompressed processing in the cloud.
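
The receiving side of 2022-7 is conceptually simple: merge the two legs by RTP sequence number, keeping whichever copy of each packet arrives first. A minimal sketch, which binds two local ports rather than two physical interfaces and ignores sequence-number wrap-around for brevity:

    import select
    import socket
    import struct

    def rtp_seq(packet: bytes) -> int:
        """The RTP sequence number lives in bytes 2-3 of the fixed header."""
        return struct.unpack("!H", packet[2:4])[0]

    def receive_st2022_7(port_a: int, port_b: int):
        """Listen to both identical legs and yield each packet exactly once,
        from whichever leg delivers it first."""
        socks = []
        for port in (port_a, port_b):
            s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            s.bind(("0.0.0.0", port))
            socks.append(s)
        seen = set()                     # a real receiver ages this out on wrap
        while True:
            ready, _, _ = select.select(socks, [], [])
            for s in ready:
                pkt = s.recv(2048)
                seq = rtp_seq(pkt)
                if seq not in seen:      # first copy wins; the duplicate is dropped
                    seen.add(seq)
                    yield pkt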

Watch now!
Speakers

Joel Williams
VP of Architecture & Engineering,
Fox Corporation
Evan Statton
Principal Architect, Media & Entertainment,
AWS

Video: State of the Streaming Market 2021

Streaming Media is back to take the pulse of the streaming market, following on from their recent mid-year survey measuring the impact of the pandemic. This third annual snapshot of the state of the streaming market will be published by Streaming Media in March. To give us this sneak peek, Eric Schumacher-Rasmussen is joined by colleague Tim Siglin and Harmonic Inc.’s Robert Gambino.

They start off with a look at the demographics of the respondents. It’s no surprise that North America is well represented, as Streaming Media is US-based and both the USA and Canada have very strong broadcast markets in terms of publishers and vendors. Europe is represented to the tune of 14%, and South America’s representation has doubled, in line with other trends showing notable growth in the South American market. In terms of individuals, exec-level and ‘engineering’ respondents were evenly balanced, with a few changes in the types of institutions represented: education and houses of worship have both grown in representation since the last survey.

Of responding companies, 66% said that they both create and distribute content, a percentage that continues to grow. This is indicative, the panel says, of the barrier to entry for distribution continuing to fall: CDNs are relatively low-cost and the time to market can be measured in weeks. Asked which type of streaming they are involved in, live and on-demand were almost equal for the first time in this survey’s history. Robert says that he’s seen a lot of companies take to using the cloud to deliver pop-up channels, but also that streaming ecosystems are better attuned to live video than they used to be.

Reading the news, it seems that there’s a large migration into the cloud, but is that shown in the data? When asked about their plans to move to the cloud, around a third had already moved, but only a quarter said they had no plans, which means there is plenty of room for growth for both cloud platforms and vendors. In terms of the service itself, video quality was the top ‘challenge’ identified, followed by latency, scalability and buffering respectively. Robert points out that better codecs delivering lower bitrates help alleviate all of these problems, as well as reducing time to play, bandwidth and storage costs.

There were a lot of talks on dynamic server-side ad insertion in 2020, including its use for targeted advertising, but who’s actually adopting it? Over half of respondents indicated they weren’t going to move into that sphere, likely because many governmental and educational services don’t need advertising in the first place. But 10% are planning to implement it within the next 12 months, which represents a doubling of adoption, so growth is not slow. Robert’s experience is that many people in ad sales are still used to selling on aggregate and don’t understand the power of targeted advertising or, indeed, how it works. Education, he feels, is key to continued growth.

The panel finishes by discussing what companies hope to get out of the move to virtualised or cloud infrastructure. Flexibility comes in just above reliability, with cost savings only third. Robert comes back to pop-up channels which, built around the release of a new film or a sports event, have proved popular and are a good example of the flexibility that companies can easily access and monetise. There are a number of companies heavily investing in private cloud, as well as those migrating to public cloud. Either way, these benefits are available to companies who invest and, as we’re seeing in South America, cloud can offer an easy on-ramp to expanding both the scale and the feature-set of your infrastructure without large Capex projects. It’s the flexibility of the solution which is driving expansion and improvements in quality and production values.

Watch now!
Speakers

Tim Siglin
Contributing Editor, Streaming Media Magazine
Founding Executive Director, HelpMeStream
Robert Gambino
Director of Solutions,
Harmonic Inc.
Moderator: Eric Schumacher-Rasmussen
Editor, Streaming Media

Video: Cloud Services for Media and Entertainment: Production and Post-Production

Many content producers and broadcasters have been forced into the cloud. Some have chosen to remote-control their on-prem kit, but many have found that the cloud has brought them benefits beyond simply keeping their existing workflows running during the pandemic.

This video from SMPTE’s New York section looks at how people moved production to the cloud and how they intend to keep it there. The first talk, from WarnerMedia’s Greg Anderson, discusses the engineering skills needed to be up to the task, concluding that there are more areas of knowledge in play than one engineer can bring to the table: from foundational elements such as security, virtualisation and networking, to DevOps skills like continuous integration and deployment (CI/CD), Active Directory and databases.

The good news is that, at whichever of Greg’s three levels of engineer you sit, from beginner to expert, the entry points are easy to access to start your journey and upskilling. Within the company, Greg says that leaders can help accelerate the transition to cloud by allowing teams a development/PoC account which provides a ‘modest’ allowance each month for experimentation, learning and proving ideas. Not only does that give engineers good exposure to cloud skills, it also gives managers experience in modelling, monitoring and analysing costs.

Greg finishes by talking through their work implementing a cloud workflow for HBO MAX, which is currently on a private cloud and on its way to the public cloud. The current system provides for 300 concurrent users doing Edit, Design, Engineering and QC workflows with asset management and ingest. They are looking to the public cloud to consolidate real estate and standardise the tech stack, amongst many other drivers outlined by Greg.

Scott Bounds, Media Cloud Architect at Microsoft, talks about content creation in the cloud. The objectives for Azure are to allow worldwide collaboration, speed up time to market, allow scaling of content creation and bring improvements in the security, reliability and accessibility of data.

For many this starts with hybrid workflows rather than a full switch to the cloud. Scott says that rough-cut editing, motion graphics and VFX are all fairly easy to implement in the cloud, whereas colour grading, online and finishing are, for most companies, still best kept on-prem. Scott talks about implementing GPU-powered workstations in the cloud, using the remote KVM technology PCoIP to connect in. This type of workflow can be automated using Azure scripting and Terraform.
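
Scott doesn’t show the automation itself, but as a flavour of it, here is a sketch using the Azure Python SDK to provision a GPU (NV-series) edit seat. The resource group, image and credentials are placeholders, and a production deployment would more likely be expressed in Terraform, as Scott suggests:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.compute import ComputeManagementClient

    SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
    compute = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    # Request an NV-series VM, Azure's GPU-backed size aimed at visualisation
    # workloads, from a pre-built image carrying the NLE and the PCoIP agent.
    poller = compute.virtual_machines.begin_create_or_update(
        "edit-rg", "edit-seat-01",
        {
            "location": "eastus",
            "hardware_profile": {"vm_size": "Standard_NV12s_v3"},
            "storage_profile": {
                "image_reference": {"id": "/subscriptions/.../images/edit-base"},
            },
            "os_profile": {
                "computer_name": "edit-seat-01",
                "admin_username": "editor",
                "admin_password": "<fetched-from-key-vault>",
            },
            "network_profile": {
                "network_interfaces": [{"id": "/subscriptions/.../nic-edit-01"}],
            },
        })
    vm = poller.result()  # blocks until the workstation is ready for PCoIP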

John Whitehead is part of the New York Times’ Multimedia Infrastructure Engineering team, which has recently moved its live production to the cloud. Much of the NYT’s live output is events programming, such as covering press conferences. John introduces their internet-centric microservices architecture, which was already being worked on before the pandemic started.

The standard workflow was to have a stream come into the MCR, get routed to an Elemental encoder for sending into the cloud, and be distributed with Fastly. To be production-friendly, they created simple-to-use web frontends for routing. For full-time remote production, John explains, they wanted to improve their production quality by adding a vision mixer, graphics and closed captions. John details the solution they chose, which comprised cloud-first services rather than running Windows software in the cloud.
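
The talk doesn’t show the NYT’s routing frontends, but the shape of such a tool is easy to imagine: a thin web service sitting between operators and a router control API. A hypothetical sketch (every endpoint and name here is invented for illustration):

    import requests
    from flask import Flask, request

    app = Flask(__name__)
    ROUTER_API = "https://router.example.net"  # hypothetical router control API

    @app.route("/route", methods=["POST"])
    def route_feed():
        """Connect a named source (e.g. a press-conference feed) to a named
        destination (e.g. the Elemental encoder feeding the cloud)."""
        body = request.get_json()
        resp = requests.post(f"{ROUTER_API}/crosspoint",
                             json={"source": body["source"],
                                   "destination": body["destination"]},
                             timeout=5)
        resp.raise_for_status()
        return {"status": "routed"}

    if __name__ == "__main__":
        app.run(port=8080)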

The NYT was pushed into the cloud by Covid, but the move was felt to be low-risk and something they were considering anyway. The pandemic forced them to recognise that the technologies they had been waiting for had already arrived; they ended up saving on Capex and saw immediate returns on their investment.

Finishing up the presentations, Anshul Kapoor from Google Cloud presents market analysis of the current state of cloud adoption. He says that one manifestation of the current crisis is that new live-events content is reduced, if not postponed, which is making people look to their archives. Some have not yet started their archiving process, whilst others already have a digital archive. Google and other cloud providers can offer vast scale to process and manage archives, and also machine learning to process, make sense of, and make searchable all that content.

The video ends with an extensive Q&A with the presenters.

Watch now!
Speakers

Greg Anderson
Senior Systems Engineer,
WarnerMedia
Scott Bounds
Media Cloud Architect,
Microsoft
John Whitehead
Senior Engineer, Multimedia Infrastructure Engineering,
New York Times
Anshul Kapoor
Business Development,
Google Cloud