Video: Moving Live Video Quality Control from the Broadcast Facility to the Living Room

Moving a 24×7 on-site MCR into people’s homes is not trivial, but Disney Streaming Services, in common with most broadcasters, knew they had to move people home, often into their living rooms. Working in an MCR requires watching incoming video to check content, which is not easy to do at home, particularly when some of the contribution arrives at 100 Mb/s. The two MCRs in San Francisco and NYC covering Hulu Live & ESPN+, along with other services, had two weeks to go remote.

Being a major streaming operator, DSS had their own encoding product called xCoder, and they soon realised this would be their ticket to making home working viable. As standard, these encoders reject any video which doesn’t match a small range of templates. Michael Rappaport takes us through how they wrote scripts that use ffprobe to analyse the desired video and then configure the xCoder in just the right way. The incoming video goes straight to the xCoder without being ‘groomed’ as it normally would be to add closed captions, ABR and so on.
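
Michael doesn’t share the scripts themselves, but the shape of them is clear from his description: probe the incoming feed with ffprobe, then map the results onto an encoder configuration. Here’s a minimal Python sketch of that idea; build_encoder_config() is a hypothetical stand-in, since the xCoder’s configuration interface isn’t public.

```python
import json
import subprocess

def probe_stream(url: str) -> dict:
    """Run ffprobe on a feed and return the parsed stream metadata."""
    result = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", url],
        capture_output=True, text=True, check=True)
    return json.loads(result.stdout)

def build_encoder_config(probe: dict) -> dict:
    """Map probed parameters onto an encoder template (hypothetical step)."""
    video = next(s for s in probe["streams"] if s["codec_type"] == "video")
    return {
        "codec": video["codec_name"],
        "width": int(video["width"]),
        "height": int(video["height"]),
        "frame_rate": video["r_frame_rate"],   # e.g. "30000/1001"
    }

if __name__ == "__main__":
    print(build_encoder_config(probe_stream("udp://239.0.0.1:5000")))
```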

Aside from bandwidth, it was also important to deliver these streams as close to real-time as possible, as the operators need to see ‘right now’ to do their job effectively. This is why the ‘grooming’ stage is skipped: it would add latency, and its added functions, such as PID normalisation and closed caption insertion, aren’t needed here. Michael explains that when a feed is needed, the system calls out to the whole encoder pool, finds an underutilised encoder and programs it automatically using an API.
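
That selection-and-programming step is easy to picture in code. The sketch below is illustrative only: the /status and /jobs endpoints and the utilisation field are assumptions, not the real xCoder API.

```python
import requests  # assumed HTTP client; the real xCoder API is not public

ENCODER_POOL = ["http://xcoder-01", "http://xcoder-02", "http://xcoder-03"]

def pick_underutilised_encoder() -> str:
    """Poll each encoder for load and return the least-busy one.
    The /status endpoint and 'utilisation' field are hypothetical."""
    loads = {host: requests.get(f"{host}/status", timeout=2).json()["utilisation"]
             for host in ENCODER_POOL}
    return min(loads, key=loads.get)

def program_encoder(host: str, source_url: str, config: dict) -> None:
    """Push the probed configuration to the chosen encoder (hypothetical endpoint)."""
    requests.post(f"{host}/jobs", json={"source": source_url, **config},
                  timeout=5).raise_for_status()
```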

Watching this at home was made possible by work done by Disney Streaming Services to allow their player to receive feeds directly from an xCoder without any problems with decoder parameters. Michael doesn’t say exactly which protocol is in use, but the xCoder produces a proprietary video stream which is carried over TCP.

“xCoder, as a standalone, produces a proprietary TCP stream. xCoder exposes an API hook that allows us to quickly determine things like frame rate, resolution, and even whether or not the xCoder is able to subscribe to the stream”

Michael Rappaport
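
That API hook isn’t publicly documented, so the endpoint and field names below are guesses; the sketch just shows the shape of a client-side query for those three pieces of information.

```python
import requests  # assumed HTTP client; the real xCoder interface is not public

def stream_capabilities(host: str, stream_url: str) -> dict:
    """Ask an xCoder what it sees on a stream (hypothetical endpoint/fields)."""
    info = requests.get(f"{host}/streams/info",
                        params={"url": stream_url}, timeout=2).json()
    return {"frame_rate": info["frame_rate"],
            "resolution": (info["width"], info["height"]),
            "subscribable": info["can_subscribe"]}
```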

Watch now!
Speakers

Michael Rappaport
Senior Manager, Encoding Administration,
Disney Streaming Services

Video: The Fundamentals of Virtualisation

Virtualisation continues to be a driving factor in the modernisation of broadcast workflows, both from the technical perspective of freeing functionality from bespoke hardware and from the commercial perspective of maximising ROI by increasing utilisation of infrastructure. Virtualisation itself is not new, but using it in broadcast is still new to many, and the technology continues to advance to meet modern bitrate and computation requirements.

In these two videos, Tyler Kern speaks to Mellanox’s Richard Hastie, NVIDIA’s Jeremy Krinitt and Ross Video’s John Naylor about how virtualisation fits with SMPTE ST 2110 and real-time video workflows.

Richard Hastie explains that agility is the name of the game: by separating the software from the hardware, your workflow can in principle be deployed anywhere and has the freedom to move within the same infrastructure. This opens up the move to the cloud, or to centralised hosting with people working remotely. One of the benefits is the ability to have a pool of servers and continually repurpose them throughout the day. Rather than discrete boxes which only do a few tasks, often going unused, you have a quota of compute which is used much more efficiently, so the return on investment is higher, as is the overall value to the company. This principle is at the heart of Discovery’s transition of Eurosport to ST 2110 and JPEG XS: they have centralised all their equipment, allowing production facilities in many countries around Europe to produce remotely from one heavily utilised set of equipment.

Part I

John Naylor explains the recent advancements virtualisation has brought to the broadcast market. vMotion from VMware allows live migration of virtual machines without loss of performance, which really matters when you’re running real-time graphics. GPUs are also vital for graphics and video tasks. In the past, it’s been difficult for VMs to have full access to GPUs, but now not only is that practical, work has also been done to allow a GPU to be broken up into reserved partitions, each dedicated to a VM, using the NVIDIA Ampere architecture.
John continues by saying that VMware has recently focussed on the media space to allow better tuning of the hypervisor. When looking to deploy VM infrastructures, John recommends that end-users work closely with their partners to tune not only the hypervisor but also the OS, NIC firmware and the BIOS itself to deliver the performance needed.
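
The GPU partitioning John describes on Ampere is most likely NVIDIA’s Multi-Instance GPU (MIG). As a quick sketch of how you’d inspect it, wrapping the stock nvidia-smi tool from Python:

```python
import subprocess

# MIG splits an Ampere-class GPU into isolated instances, each of which can
# be handed to a different VM. These read-only subcommands ship with
# nvidia-smi; actually creating instances (nvidia-smi mig -cgi ... -C)
# requires root and a MIG-capable card.
for args in (["nvidia-smi", "mig", "-lgip"],   # list available instance profiles
             ["nvidia-smi", "mig", "-lgi"]):   # list instances currently created
    result = subprocess.run(args, capture_output=True, text=True)
    print(result.stdout or result.stderr)
```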

“Timing is the number one challenge to the use of virtualisation in broadcast production at the moment”

Richard Hastie

Mellanox, now part of NVIDIA, has continued improving its ConnectX network cards, Richard Hastie explains, to deal with the high-bandwidth scenarios that uncompressed production throws up. These network cards now have onboard support for ST 2110, traffic shaping and PTP; without hardware PTP, getting 500-nanosecond-accurate timing into a VM is difficult. Mellanox also use SR-IOV, a technology which bypasses the software switch in the hypervisor, reducing I/O overhead and bringing performance close to that of a non-virtualised machine. It does this by partitioning the PCI bus so that one NIC can present itself multiple times to the computer: whilst the NIC is shared, the software has direct access to it. For more information on SR-IOV, have a look at this article and this summary from Microsoft.
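
To make that PCI-bus partitioning concrete: on a Linux host, SR-IOV virtual functions are created through standard sysfs files, and each resulting VF can then be passed through to a VM. A minimal sketch, with a placeholder interface name:

```python
from pathlib import Path

# Writing a count to sriov_numvfs asks the NIC to expose that many virtual
# functions (VFs), each appearing as its own PCI device that a VM can own.
# "ens1f0" is a placeholder interface name; run as root.
nic = Path("/sys/class/net/ens1f0/device")
print("VFs supported:", (nic / "sriov_totalvfs").read_text().strip())
(nic / "sriov_numvfs").write_text("4")  # create four VFs
```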

Part II

Looking to the future, the panel sees virtualisation supporting the deployment of uncompressed ST 2110 and JPEG XS workflows, enabling a growing number of virtual productions. For virtualisation itself, they see a move down from OS-level virtualisation to containerised microservices. Not only can these be more efficient but, if managed by an orchestration layer, they allow processing to move to the ‘edge’. This should allow some logic to happen much closer to the end-user while the main computation stays centralised.

Watch part I and part II now!
Speakers

Tyler Kern
Moderator
John Naylor
Technology Strategist & Director of Product Security
Ross
Richard Hastie
Senior Sales Director, Business Development
NVIDIA
Jeremy Krinitt
Senior Developer Relations Manager
NVIDIA

Video: Where The Puck Is Going: What’s Next for Esports & Sports Streaming

How’s sports streaming changing as the pandemic continues? Esports has the edge on physical sports as it allows people to compete from diverse locations, but both physical sports and esports benefit from bringing people into one place and getting the fans to see the players.

This panel from Streaming Media, moderated by Jeff Jacobs, looks at how producers, publishers, streamers and distributors reacted to 2020 and where they’re positioning themselves to be ahead in 2021. The panel opens by looking at the tools and the preferred workflows. There are so many ways to do remote production. Sam Asfahani from OS Studios explained how they had already adopted some remote workflows to keep costs down, but he has been impressed by the number of innovations released which help improve remote production. He explains they have a physical NDI control room where they also use vMix for contribution. The changed workflows during the pandemic have convinced them that the second control room they were planning to build should now be in the cloud.

Aaron Nagler from Cheesehead TV discussed how he’s stopped flying to watch games and instead watches synchronised with his co-presenter using LiveX Director. Within a few milliseconds, both are seeing the same footage, so they can present and comment in real-time. Intriguingly, Tyler Champley from Poker Central explains that, for them, remote production hasn’t been needed: the tournaments have been cancelled and they use their studio facilities. Their biggest issue is that their players need to be in the same room to play the game, close to each other and without masks.

Link to video

The panel discusses what will stick after the pandemic. Sam makes the point that he’s gone from paying $20,000 for a star to travel, stay overnight and be part of the show to paying $5,000 for two hours on a programme which the star can do without leaving their house, and the show saves money too. He feels this will continue as an option on an ongoing basis, though the panel notes that contributors’ technical capability is limited, even for top-dollar talent, without anyone else there to help. Tyler says that his studio has been more in demand during Covid, so his team has become better at tear-downs to accommodate multiple uses. And lastly, the panel makes the point that hybrid programme-making models are going to continue.

After some questions from the audience, the panel comments on future strategies. Sean Gardner from Xilinx talks about the need for, and arrival of, newer codecs such as AV1 and LCEVC, which can help deliver lower bitrates and/or lower latency. Aaron mentions that he’s seen ways of gamifying streams which he hasn’t used before and which help with monetisation. And Sam leaves us with the thought that game APIs can help create fantastic productions when they’re done well, but he sees an even better future where APIs allow information to be fed back into the game, creating a two-way event between the fans and the game.

Watch now!
Speakers

Jeff Jacobs (Moderator)
Executive Vice President & General Manager,
VENN
Aaron Nagler
Co-Founder,
Cheesehead TV
Sam Asfahani
CEO,
OS Studios
Sean Gardner
Snr Manager, Market Development & Strategy, Cloud Video,
Xilinx
Tyler Champley
VP Marketing & Audience Development,
Poker Central

Video: Decentralised Production Tips and Best Practices

Live sports production has seen massive change during COVID. We recently looked at how this changed in the MCR on The Broadcast Knowledge, hearing how Sky Sports and Arsenal TV had radically changed their operations. This time we look at how life in the truck has changed. The headline is that most people are staying at home, so how do you keep people at home and still mix a multi-camera event?

Ken Kerschbaumer from Sports Video Group talks to VidOvation’s Jim Jachetta and James Japhet from Hawk-Eye to understand the role they’ve been playing in bringing live sports to the screen where the REMI/outside broadcast presence has been pared down to the minimum and most staff are at home. The conversation starts with the backdrop of The Players Championship, part of the PGA Tour, which was produced by 28 operators in the UK who mixed 120+ camera angles plus audio to produce 25 live streams, including graphics, for broadcasters around the world.

Lip-sync and genlock aren’t optional when it comes to live sports. Jim explains that his equipment can handle up to fifty cameras with genlock synchronisation over bonded cellular, and this is how The Players worked, with bonded cellular on each camera. Jim discusses how audio also has to be frame-accurate, as they had many, many mics always open going back to the sound mixer at home.
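
Jim doesn’t detail the mechanism, but the usual principle behind frame-accurate alignment is to delay every feed to match the slowest transport path, so that genlocked timestamps line up again at the mixer. A toy illustration of that arithmetic, with invented latency figures:

```python
# Measured transport latency per feed, in milliseconds (illustrative numbers).
transport_latency_ms = {"cam1": 850, "cam2": 1200, "mic_mix": 400}

# Buffer each feed up to the slowest path so all feeds present in lockstep.
slowest = max(transport_latency_ms.values())
added_delay_ms = {feed: slowest - latency
                  for feed, latency in transport_latency_ms.items()}
print(added_delay_ms)  # {'cam1': 350, 'cam2': 0, 'mic_mix': 800}
```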

James from Hawk-Eye explained that part of their decision to leave equipment on-site was due to lip-sync concerns. Their system worked differently to VidOvation’s, allowing people to ‘remote desktop’ in using a Hawk-Eye-specific low-latency technology dedicated to video transport. This also works well for events where there isn’t enough connectivity at the venue to support streaming 10, 20 or 50+ feeds out to different locations.

The production has to change to take account of two factors: the chance a camera’s connectivity might go down, and latency. It’s important to plan shots ahead of time to account for these factors, outlining the backup plan, say going to a wide shot on camera 3 if camera 1 can’t be used. When working with bonded cellular, latency is an unavoidable factor and can be as high as 3 seconds. In this scenario, Jim explains, it’s important to tell the camera operators what you’re looking for in a shot and let them work more autonomously than they traditionally would.

Latency is also very noticeable for the camera shaders, who usually rack cameras with only milliseconds of delay. CCUs are not used to waiting a long time for responses, so a lot of faked messages need to be sent to keep the CCU and controller happy. The shader operator then needs to get used to the latency, which won’t be as high as the video latency, and take things a little slower to get the job done.
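
As a purely hypothetical sketch of the ‘faked messages’ idea: a local proxy could answer the CCU’s keep-alive polls immediately, so it never times out, while relaying genuine control commands over the high-latency link. The POLL/ACK framing below is invented for illustration, not a real CCU protocol.

```python
import socket

def run_ccu_proxy(listen_port: int, remote_addr: tuple[str, int]) -> None:
    """Answer keep-alive polls locally; relay everything else to site.
    'POLL'/'ACK' is an invented framing, not a real CCU protocol."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", listen_port))
    while True:
        data, ccu_addr = sock.recvfrom(1500)
        if data == b"POLL":
            sock.sendto(b"ACK", ccu_addr)      # instant reply keeps the CCU happy
        else:
            sock.sendto(data, remote_addr)     # genuine commands take the slow path
```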

Not travelling everywhere has been received fairly well by freelancers, who can now book more jobs and don’t need to suffer reduced pay for travel days. There are still people travelling to site, Jim says, but usually those who can drive, and they then sit in the control room with shields. For the PGA Tour, the savings are racking up. Whilst there are a lot of other costs and losses at the moment across so many industries, it’s clear that the reduced travel and hosting will continue to be beneficial after restrictions are lifted.

Watch now!
Speakers

Jim Jachetta
EVP & CTO: Wireless Video & Cellular Uplinks
VidOvation
James Japhet
Managing Director
Hawk-Eye North America
Ken Kerschbaumer
Editorial Director,
Sports Video Group