Video: Distribution in a Covid-19 World

A look at the impact of Covid-19 from the perspective of Disney+ and ESPN+. In this talk, Eric Klein from Disney Streaming Services gives his view on the changes and learnings he saw as Covid hit and as it continues. He first comments on the increase in ‘initial streams’ as the lockdowns hit, with Austria topping the list with a 44% increase in time spent streaming within just a 48-hour period; in the US, Comcast reported an uptick of 38% in general streaming and web video consumption. Overall, fixed broadband networks tended to cope better with the peaks than mobile broadband; mobile internet, which is quite common in Italy, was observed to be suffering.

Distribution in a Covid-19 World from Streaming Video Alliance on Vimeo.

Content providers played their part in easing the congestion, adjusting to the situation by altering video profiles and changing starting bitrates as part of an industry-wide response. And it’s this element of everybody playing their part which seems to be the secret sauce behind Eric’s statement that “the internet is more resilient than everybody thought”. Eric goes on to point out that such networks are designed to deal with these situations, as the first question is always “what’s your peak traffic going to be?” Whilst any one estimate may be off, the point is that networks are provisioned for peaks, so when many peak forecasts come to pass at once, their average is usually within the network’s capabilities. The exceptions come on last-mile links, which are much harder to change than the provisioning of uplink ports and router backplane bandwidth within datacentres.

Eric points out the benefits of open caching, a specification in development within the Streaming Video Alliance. Open caching provides an interoperable way of delivering files into ISP networks, modelled around popular content, so that services can cache data much closer to customers. By doing this, Eric points to data which has shown an ability to deliver up to a 15% increase in bandwidth as well as a 30% decrease in ‘customer-impacting events’.
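Conceptually, open caching works by steering a client’s request to a cache inside its own ISP when one has been delegated, falling back to the origin CDN otherwise. The Python sketch below illustrates just that routing decision; the prefix table, hostnames and simple prefix-matching approach are illustrative assumptions for this article, not the SVA’s actual Open Caching APIs, which define request routing via mechanisms such as DNS and HTTP redirects.

```python
import ipaddress

# Hypothetical delegation table: ISP address prefixes mapped to the
# open cache that ISP hosts. Real deployments publish this via the
# open caching request-routing interfaces, not a static dict.
OPEN_CACHES = {
    "203.0.113.0/24": "cache1.isp-a.example",
    "198.51.100.0/24": "cache2.isp-b.example",
}

ORIGIN_CDN = "cdn.origin.example"

def route_request(client_ip: str) -> str:
    """Return the host that should serve this client: the ISP's
    open cache if the client's prefix is delegated, else the origin CDN."""
    ip = ipaddress.ip_address(client_ip)
    for prefix, cache in OPEN_CACHES.items():
        if ip in ipaddress.ip_network(prefix):
            return cache
    return ORIGIN_CDN
```

A client on ISP A’s network would be served from `cache1.isp-a.example`, keeping that traffic off the interconnects; anyone else falls back to the origin CDN.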

This session ends with a short Q&A.

Watch now!

Eric Klein
Co-Chair, Open Caching Workgroup, Streaming Video Alliance,
Director, Content Distribution, Disney Streaming Services
Moderator: Jason Thibeault
Executive Director,
Streaming Video Alliance

Video Case Study: How BT Sport de-centralised its football production

We’ve all changed the way we work during the pandemic, some more than others. There’s nothing better than a real-life case study to learn from and to put your own experience into perspective. In this video, BT Sport and their technology provider Timeline TV take us through what they have and haven’t done to adapt.

Jamie Hindhaugh, COO of BT Sport, explains that they didn’t see working at home as simply a decentralisation, but rather a centralisation of the technology to be used by a decentralised body of staff. This concept is similar to Discovery’s recent Eurosport IP transformation project, which has all participating countries working from equipment in two datacentres. BT Sport managed to move from a model of two to three hundred people in the office daily to producing a live football talk show from presenters’ homes, with broadcast staff also at home, in only 10 days. The workflow continued to be improved over the following 6 weeks, at which point they felt they had migrated to an effective ‘at home’ workflow.



Speaking to the challenges, Dan McDonnell, CEO of Timeline TV, said that basic acquisition and distribution of equipment like laptops was tricky since everyone else was doing the same. But once the equipment was in staff homes, they soon discovered the problems of moving out of a generator-backed broadcast facility. UPSes were distributed to those who needed them, but Dan notes there was nothing they could do to help with the distraction of working with your children and/or pets.

Jamie comments that connectivity is very important and they are moving forward with a strategy called ‘working smart’ which is about giving the right tools to the right people. It’s about ensuring people are connected wherever they are and with BT Sport’s hubs around the country, they are actively looking to provide for a more diverse workforce.

Dan points out that BT Sport has a long history of using remote production, which has driven its recent decision to move to IP in Stratford. Premiership games have changed from being a main and backup feed to needing 20 cameras coming into the building, and this density of circuits in both HD and UHD has made SDI less and less practical. Jamie highlights the importance of their remote production heritage but adds that the pandemic pushed them way beyond normal remote productions, since scheduling and media workflows, which would normally have stayed in the building, also had to be done remotely.

Dan says that the perspective has changed from seeing production as either a ‘studio’ or ‘remote OB’ production to allowing either type of production to pick and choose the best combination of on-site roles and remote roles. Dan quips that they’ve been forced to ‘try them all’ and so have a good sense of which work well and which benefit from on-site team working.

Watch now!

Dan McDonnell
Timeline TV
Jamie Hindhaugh
BT Sport
Moderator: Heather McLean
SVG Europe

Video: Moving Live Video Quality Control from the Broadcast Facility to the Living Room

Moving a 24×7 on-site MCR into people’s homes is not trivial, but Disney Streaming Services, in common with most broadcasters, knew they had to move people home, often into their living rooms. Working in an MCR requires watching incoming video to check content, which is not easy to do at home, particularly when some of the contribution arrives at 100 Mb/s. The two MCRs, in San Francisco and NYC, covering Hulu Live and ESPN+ along with other services, had two weeks to move remote.

Being a major streaming operator, DSS had their own encoding product called xCoder, and they soon realised this would be their ticket to making home working viable. As standard, these encoders reject any video which doesn’t match a small range of templates, so Michael Rappaport takes us through how they wrote scripts to use ffprobe to analyse the incoming video and then configure the xCoder in just the right way. The incoming video goes straight to the xCoder without being ‘groomed’ as it normally would be to add closed captions, ABR etc.
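A script along these lines could perform the probing step Michael describes. The ffprobe flags and its JSON output are real; the template mapping, and the idea of handing the result to an xCoder configuration call, are assumptions for illustration, since DSS’s actual scripts aren’t shown in the talk.

```python
import json
import subprocess

def probe_stream(url: str) -> dict:
    """Run ffprobe on a source and return the first video stream's
    parameters as a dict parsed from ffprobe's JSON output."""
    result = subprocess.run(
        [
            "ffprobe", "-v", "quiet",
            "-print_format", "json",
            "-show_streams",
            "-select_streams", "v:0",  # first video stream only
            url,
        ],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)["streams"][0]

def build_encoder_config(stream: dict) -> dict:
    """Map probed parameters onto a (hypothetical) encoder template,
    turning ffprobe's fractional r_frame_rate into a decimal rate."""
    num, _, den = stream["r_frame_rate"].partition("/")
    return {
        "codec": stream["codec_name"],
        "width": stream["width"],
        "height": stream["height"],
        "frame_rate": round(int(num) / int(den or 1), 3),
    }
```

For a 1080i59.94 contribution feed, `build_encoder_config` would yield the resolution, codec and frame rate needed to set the encoder template to match the source rather than rejecting it.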

Aside from bandwidth, it was also important to provide these streams as close to real-time as possible, as the operators need to see ‘right now’ to do their job effectively. This is another reason the ‘grooming’ stage is skipped: it would add latency, and its added functions, such as PID normalisation and closed caption insertion, aren’t needed here. Michael explains that when a feed is needed, the system calls out to the encoder pool, finds an underutilised encoder and programs it automatically using an API.
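Michael doesn’t detail the pool API, but the selection logic might look something like this minimal sketch; the `Encoder` fields and the sessions-based capacity metric are hypothetical stand-ins for whatever utilisation data the real xCoder API exposes.

```python
from dataclasses import dataclass

@dataclass
class Encoder:
    host: str
    active_sessions: int
    max_sessions: int

    @property
    def utilisation(self) -> float:
        """Fraction of this encoder's capacity currently in use."""
        return self.active_sessions / self.max_sessions

def pick_encoder(pool: list[Encoder]) -> Encoder:
    """Choose the least-utilised encoder that still has spare capacity,
    mirroring the 'find an underutilised one' step described in the talk."""
    candidates = [e for e in pool if e.active_sessions < e.max_sessions]
    if not candidates:
        raise RuntimeError("no encoder capacity available in the pool")
    return min(candidates, key=lambda e: e.utilisation)
```

Once an encoder is picked, programming it would be a further API call with the probed parameters; full encoders are simply excluded from the candidate list.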

Watching this at home was made possible by work done by Disney Streaming Services to allow their own player to receive feeds directly from an xCoder without any problems with decoder parameters. As a standalone unit, the xCoder produces a proprietary video stream carried over TCP, and it exposes an API hook that allows the team to quickly determine things like frame rate and resolution, and even whether or not the xCoder is able to subscribe to the stream.

Watch now!

Michael Rappaport
Senior Manager, Encoding Administration,
Disney Streaming Services

Video: In Stadium Production Workflow and COVID 19

As COVID-19 has wrought many changes to society, this webinar looks at how it’s changed the way live sport is produced by the in-stadium crews making the TV shows that cover the ups and downs of our beloved teams. We know that crews have had to be creative, but what has that actually looked like, and has it all been negative?

The SMPTE Philadelphia section first invites Martin Otremsky to talk about how his work within the MLB has changed. Before the beginning of the season, Martin and his team weren’t allowed in the stadium and had very little notice of the first game they would have to prepare for.

Crowd noise was a big issue. Not only was there concern that the players would find a silent stadium off-putting, but it was soon realised that swearing and tactics chat could easily be heard by other players and picked up by the mics. Bringing back crowd noise helped mask that and allowed the teams to talk normally.

The crowd noise was run off three iPads: one which ran a 3-hour loop of general crowd noise, and two with touch-screen buttons to trigger different types and moods of effects, covering the noises of anticipation and reaction, when the crowd would be at ease and when they would be ‘pumped’. These were run on the fly by two operators who kept the effects going throughout the game. The crowd noise required a fair bit of fine-tuning, including getting the in-stadium acoustics right, as the speaker system is set up to assume the stands are lined with absorbent people; without a crowd, the acoustics are much more echoey.

Link to video

The Covid protections dealt with people in three tiers. Tier 1 was for the players and coaches, all of whom were tested every 48 hours, and covered the areas where they needed to be. Tier 3 was for people who could go anywhere not designated Tier 1. Tier 2 was one or two people who were allowed in both areas, but under extreme restrictions. As such, the crew in Tier 3 found it hard to do a lot of maintenance and configuration in certain places, as they had to leave every time Tier 1 staff needed to be in the area.

The operation itself had been pared down from 28 to 9 people, which was partly achieved by simplifying the programme itself. The ballpark often had to be flipped to accommodate another team using it as their ‘home’ stadium, which caused a lot of work: graphics had to be reworked to fit the in-stadium graphics boards, the crowd noise had to cheer for a different team, and the video and production graphics had to be swapped out. Martin ends his presentation with a Q&A session.

Next up is Carl Mandell for the Philadelphia Union football/soccer team. They had been lucky enough to be at the end of a stadium technical refresh of their LED displays and their control room, moving up to 1080p60 HDR. Alas, the COVID restrictions hit just before the home opener, robbing them of the chance to use all the new equipment. Their new cameras, for instance, remained unused until right near the end of the season.

Link to video

Unlike the MLB, Carl explains, they chose not to play crowd audio in the stadium itself. They ran virtual graphics to fulfil contractual obligations and to brighten up the relatively empty stadium. Also unlike MLB, they were able to have a limited crowd in the stadium during most matches.

For the crowd noise used in the broadcast programme, they used audio taken from their ten-year archive of previous games and allowed the chants and reactions to be under the control of fans.

One COVID success was moving the press conferences to Zoom. Whilst the spokesperson or people were actually in the stadium, with a PTZ camera in front of them and an 85″ TV to the side, all the media were brought in over Zoom. Carl says they found that this format produced much more engaging conferences, which increased their international appeal and raised the viewership of the press conferences.

Watch now!

Carl Mandell
Director, Broadcasting & Video Production
Philadelphia Union
Martin Otremsky
Director of Video Engineering,
Philadelphia Phillies