Video: FOX – Uncompressed live sports in the cloud

Is uncompressed video in the cloud, with just six frames of latency to get there and back, ready for production? WebRTC manages sub-second streaming in one direction and can even deliver AV1 in real time. The key to getting down to a 100ms round trip is millisecond encoding and keeping the video uncompressed in the cloud. This video shows how it can be done.

Fox has a clear direction to move into the cloud and last year joined AWS to explain how they’ve put their delivery distribution into the cloud, remuxing feeds for ATSC transmitters, satellite uplinks and cable headends, and encoding for internet delivery. In this video, Fox’s Joel Williams and AWS’s Evan Statton explain their work together making this a reality. Joel explains that latency is not a hot topic for distribution, as there are already many delays in the distribution chain. The focus has been on getting the contribution feeds into playout and MCR monitoring quickly. After all, when people are counting down to an ad break, it needs to roll exactly on zero.

Evan explains the approach AWS has taken to solving this latency problem, starting with a look at SMPTE’s ST 2110 in the cloud. ST 2110 typically has video flows of at least 1 Gbps and, when implemented on premises, is usually built on a dedicated network with very strict timing. Cloud datacentres aren’t like that, and Evan demonstrates the problem by showing that, across 8 video streams, there are video drops of several seconds, which is clearly not acceptable. Amazon, however, has a technology called ‘Scalable Reliable Datagram’ (SRD) which is aimed at moving high-bitrate data through their cloud. Using a very small retransmission buffer, it’s able to use multiple paths across the network to deliver uncompressed video in real time. Because the retransmission buffer is so small, it provides just enough healing to redeliver missing packets within the 16.7ms it takes to deliver a frame of 60fps video.
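The numbers above can be sanity-checked with some back-of-envelope arithmetic. This is a minimal sketch, not an AWS specification: the 20 bits per pixel figure assumes 10-bit 4:2:2 sampling, a common ST 2110 configuration.

```python
# Illustrative arithmetic for uncompressed video in the cloud.
# Assumes 10-bit 4:2:2 sampling (~20 bits per pixel); real flows vary.

def frame_period_ms(fps: float) -> float:
    """Time available to deliver (and heal) one frame, in milliseconds."""
    return 1000.0 / fps

def uncompressed_bitrate_gbps(width: int, height: int, fps: float,
                              bits_per_pixel: int = 20) -> float:
    """Approximate uncompressed bitrate for a given raster and frame rate."""
    return width * height * fps * bits_per_pixel / 1e9

# A 1080p60 flow comfortably exceeds the 'at least 1 Gbps' quoted for ST 2110.
print(f"1080p60 ~ {uncompressed_bitrate_gbps(1920, 1080, 60):.1f} Gbps")
# All retransmission must complete inside this window at 60fps:
print(f"Healing budget per frame: {frame_period_ms(60):.1f} ms")
```

This is why a large retransmission buffer is useless here: any packet that cannot be redelivered within roughly one frame period has already missed its frame.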

On top of SRD, AWS has introduced CDI, the Cloud Digital Interface, which describes uncompressed video flows in a way already familiar to software developers. This ‘Audio Video Metadata’ layer handles flows in the same way as 2110, for instance keeping essences separate. Evan says this has helped vendors react favourably to the new technology: instead of using UDP, they can use SRD with CDI, which not only gives them familiar video data structures but, because SRD is implemented in the Nitro network card, also hides the packet processing from the application itself.
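The ‘essences kept separate’ idea can be sketched in a few lines. To be clear, the class and field names below are invented for illustration and are not the actual AWS CDI SDK API; the point is only that video, audio and metadata remain independently addressable flows aligned by a common timestamp, as in ST 2110.

```python
# A hypothetical sketch of a CDI-style 'Audio Video Metadata' model.
# Names are illustrative only, not the real AWS CDI SDK.
from dataclasses import dataclass, field

@dataclass
class EssenceFlow:
    kind: str        # "video", "audio" or "metadata"
    payload: bytes
    timestamp: int   # common media-clock timestamp aligning the essences

@dataclass
class AvmFrame:
    """One frame's worth of separately-addressable essences."""
    flows: list = field(default_factory=list)

    def add(self, kind: str, payload: bytes, timestamp: int) -> None:
        self.flows.append(EssenceFlow(kind, payload, timestamp))

    def essence(self, kind: str) -> list:
        """Each essence can be routed or processed on its own."""
        return [f for f in self.flows if f.kind == kind]

frame = AvmFrame()
frame.add("video", b"\x00" * 8, timestamp=1000)
frame.add("audio", b"\x00" * 4, timestamp=1000)
print(len(frame.essence("video")))  # the video essence, independent of audio
```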

The final piece of the puzzle is keeping the journey into and out of the cloud low-latency. This is done using JPEG XS, which has an encoding time of a few milliseconds. Rather than using RIST, for instance, to protect this on the way into the cloud, Fox is testing ST 2022-7. 2022-7 takes in two identical streams, typically on two network interfaces, so the receiver ends up with two copies of each packet; where one gets lost, the other is still available. This gives path redundancy which a single stream can never offer. Overall, the test with Fox’s Arizona-based Technology Center is shown in the video to have only 6 frames of latency for the round trip. Assuming they used a California-based AWS data centre, the ping time may have been as low as two frames, leaving four frames for 2022-7 buffers, XS encoding and uncompressed processing in the cloud.
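The core of 2022-7’s seamless protection is simple to sketch: keep the first copy of each RTP sequence number to arrive, on whichever path, and discard the duplicate. This is a minimal sketch only; a real implementation also needs a bounded reordering buffer and 16-bit sequence-number wraparound handling.

```python
# A minimal sketch of ST 2022-7 style protection: two identical streams
# arrive over two paths, and the receiver forwards the first copy of each
# sequence number, so a packet lost on one path survives via the other.

class SeamlessMerger:
    def __init__(self):
        self.seen = set()  # sequence numbers already forwarded

    def receive(self, seq: int, payload: bytes):
        """Return the payload the first time a sequence number is seen."""
        if seq in self.seen:
            return None    # duplicate from the other path, discard
        self.seen.add(seq)
        return payload

merger = SeamlessMerger()
path_a = {1: b"p1", 3: b"p3"}             # packet 2 lost on path A
path_b = {1: b"p1", 2: b"p2", 3: b"p3"}   # complete on path B
delivered = []
for seq in (1, 3, 1, 2, 3):  # interleaved arrivals from both paths
    out = merger.receive(seq, path_a.get(seq) or path_b[seq])
    if out is not None:
        delivered.append(seq)
print(sorted(delivered))  # all three packets recovered -> [1, 2, 3]
```

Note the trade-off against RIST mentioned above: retransmission adds latency waiting for a resend, whereas 2022-7 pays in bandwidth (everything is sent twice) but recovers losses with no extra delay.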

Watch now!
Speakers

Joel Williams
VP of Architecture & Engineering,
Fox Corporation
Evan Statton
Principal Architect, Media & Entertainment,
AWS

Video: Decentralised Production Tips and Best Practices

Live sports production has seen a massive change during COVID. We recently looked at how this changed in the MCR on The Broadcast Knowledge, hearing how Sky Sports and Arsenal TV had radically changed their workflows. This time we look at how life in the truck has changed. The headline is that most people are staying at home, so how do you keep people at home and still mix a multi-camera event?

Ken Kerschbaumer from Sports Video Group talks to VidOvation’s Jim Jachetta and James Japhet from Hawk-Eye to understand the role they’ve been playing in bringing live sports to screen where the REMI/outside broadcast has been pared down to the minimum and most staff are at home. The conversation starts with the backdrop of The Players Championship, part of the PGA Tour, which was produced by 28 operators in the UK who mixed 120+ camera angles and the audio to produce 25 live streams, including graphics, for broadcasters around the world.

Lip-sync and genlock aren’t optional when it comes to live sports. Jim explains that his equipment can genlock up to fifty cameras over bonded cellular, and this is how The Players worked, with a bonded cellular unit on each camera. Jim also discusses how audio has to be frame-accurate, as they had many, many mics always open going back to the sound mixer at home.

James from Hawk-Eye explained that part of their decision to leave equipment on-site was due to lip-sync concerns. Their system worked differently from VidOvation’s, allowing people to ‘remote desktop’ in, using a Hawk-Eye-specific low-latency technology dedicated to video transport. This also works well for events where there isn’t enough connectivity at the venue to support streaming 10, 20 or 50+ feeds to different locations.

The production has to change to take account of two factors: latency and the chance a camera’s connectivity might go down. It’s important to plan shots ahead of time to account for these factors, outlining what the backup plan is, say going to a wide shot on camera 3 if camera 1 can’t be used. When working with bonded cellular, latency is an unavoidable factor and can be as high as 3 seconds. In this scenario, Jim explains, it’s important to tell the camera operators what you’re looking for in a shot and let them work more autonomously than you might traditionally do.

Latency is also very noticeable for the camera shaders, who usually rack cameras with milliseconds of latency. CCUs are not used to waiting a long time for responses, so a lot of faked messages need to be sent to keep the CCU and controller happy. The shader operator then needs to get used to the latency, which won’t be as high as the video latency, and take things a little slower in order to get the job done.
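The ‘faked messages’ idea resembles a classic local-proxy pattern: answer the controller’s frequent polls immediately from cached state while the real command crosses the high-latency link in the background. The sketch below is entirely hypothetical; the class, message shapes and parameter names are invented for illustration and do not describe any real CCU protocol.

```python
# A hypothetical sketch of a local proxy keeping a CCU controller happy
# over a high-latency link. All names and fields are invented.
import threading
import time

class CcuProxy:
    def __init__(self, link_delay_s: float):
        self.link_delay_s = link_delay_s
        self.last_state = {"iris": 0.0}  # cache used for instant replies

    def poll(self) -> dict:
        """Controller polls frequently; answer at once from the cache."""
        return dict(self.last_state)

    def send_command(self, param: str, value: float) -> None:
        """The real command takes a link round trip before it applies."""
        threading.Timer(self.link_delay_s, self._apply, (param, value)).start()

    def _apply(self, param: str, value: float) -> None:
        self.last_state[param] = value  # remote CCU finally confirms

proxy = CcuProxy(link_delay_s=0.2)
proxy.send_command("iris", 2.8)
print(proxy.poll()["iris"])  # instant reply with the stale value: 0.0
time.sleep(0.3)
print(proxy.poll()["iris"])  # after the link delay, the real value: 2.8
```

The gap between the two polls is exactly the latency the shader has to ‘get used to’: the control surface feels responsive, but the effect of each adjustment arrives a beat later.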

Not travelling everywhere has been received fairly well by freelancers, who can now book in more jobs and don’t need to suffer reduced pay for travel days. There are still people travelling to site, Jim says, but usually people who can drive there and who then sit in the control room with shields between them. For the PGA Tour, the savings are racking up. Whilst there are a lot of other costs and losses at the moment for so many industries, it’s clear that the reduced travel and hosting will continue to be beneficial after restrictions are lifted.

Watch now!
Speakers

Jim Jachetta
EVP & CTO: Wireless Video & Cellular Uplinks
VidOvation
James Japhet
Managing Director
Hawk-Eye North America
Ken Kerschbaumer
Editorial Director,
Sports Video Group

Video: How to build two large Full-IP OB trucks (during COVID-19)

It’s never been easy building a large OB van. Keeping within axle weight limits, getting enough technology in and working within a tight project timeline, not to mention keeping the expanding sections cool and water-tight, is no easy task. Add in social distancing thanks to SARS-CoV-2 and life gets particularly tricky.

This project was intriguing even before Covid-19 because it called for two identical SMPTE ST 2110 IP trucks to be built, explains Geert Thoelen from NEP Belgium. Both are 16-camera trucks with 3 EVS units each. The idea is that crews could walk into truck A on Saturday and do a show, then walk into truck B on Sunday and work on exactly the same setup but on a different match. Being identical, when these trucks are delivered to Belgian public broadcaster RTBF, production crews won’t need to worry about getting a better or worse truck than other programmes. An added benefit is that weight is reduced compared to SDI baseband. The trucks come loaded with Sony cameras, Arista switches, Lawo audio, EVS replays and Riedel intercoms. Each is ready to take a software upgrade for UHD and offers 32 frame-synced and colour-corrected inputs plus 32 outputs.

Broadcast Solutions have worked with NEP Belgium for many years, a close relationship which became a key asset in this project, which had to be completed under social distancing rules. Working open book, with existing trust between the parties, we hear, was important in completing this project on time. Broadcast Solutions set up separate internet access to the truck as it was being built, giving vendors 24/7 remote access.

Axel Kühlem from Broadcast Solutions addresses a question from the audience on the benefits of 2110. He confirms that weight is reduced by about half compared to SDI, comparing like-for-like equipment, and says power consumption is also reduced. The aim of having two identical trucks is to allow them to be occasionally joined for large events, or even connected into RTBF’s studio infrastructure for those times when you just don’t have enough facilities. Geert points out that IP on its own is still more expensive than baseband, but you are paying for the ability to scale in the future. Once you count the flexibility it affords both the productions and the broadcaster, it may well turn out cheaper over its lifetime.

Watch now!
Speakers

Axel Kühlem
Senior System Architect
Broadcast Solutions
Geert Thoelen
Technical Director,
NEP Belgium

Video: Live production: Delivering a richer viewing experience

How can large sports events keep an increasingly sophisticated audience entertained and fully engaged? The technology of sports coverage has pushed broadcasting forwards for many years, and that hasn’t changed. More than ever, there is a convergence of technologies both at the event and in delivery to the customer, which is explored in this video.

First up is Michael Cole, a veteran of live sports coverage, now working for the PGA European Tour and Ryder Cup Europe. As the event organisers, hosting 42 golfing events throughout the year, they are responsible not just for the coverage of the golf, but also a whole host of supporting services. Michael explains that they have to deliver live stats and scores to on-air, online and on-course screens, produce a whole TV service for event-goers, deliver an event app and, of course, run a TV compound.

One important aspect of golfing coverage is the sheer distances that video needs to cover. Formerly that was done primarily with microwave links, and whilst RF still plays an important part in coverage with wireless cameras, the long distances are now covered by fibre. However, as fibre takes time to deploy at each event and is hard to conceal on otherwise impeccably presented courses, there is a lot of interest in validating 5G’s ability to cut rigging time and costs, as well as making the place look tidier in front of the spectators.

Michael also talks about the role of remote production. Many would see this as an obvious way to go, but remote production has taken many years to be adopted. Each broadcaster has different needs, so getting the right level of technology in place to meet everyone’s needs is still a work in progress. For golfing events with tens of trucks and cameras, Michael confirms that remote production and the cloud are a clear way forward, at the right time.

Next to talk is Remo Ziegler from Vizrt, who explains how Vizrt serves the live sports community. Looking more at the delivery aspect, they allow branding to be delivered to multiple platforms with different aspect ratios whilst maintaining a consistent look. Whilst branding, when done well, isn’t noticed by viewers, more obvious examples are real-time, photo-realistic, in-studio 3D graphics. Remo then talks about ‘Augmented Reality’, AR, which places moving 3D objects into a video so that they look part of the picture, annotating the footage to help explain what’s happening and tell a story. This can be done in real time with camera tracking technology, which takes into account telemetry from the camera, such as the angle of tilt and the zoom level, to render the objects realistically.
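The camera-tracking idea can be illustrated with simplified pinhole-camera maths: given pan, tilt and zoom (expressed as a focal length in pixels), a 3D point in the scene maps to a pixel position, so a rendered object can be drawn where it would appear through the lens. This is a sketch of the general principle only; a real tracker such as Vizrt’s also models lens distortion, nodal offsets and more.

```python
# Simplified camera-tracking projection: world point -> pixel coordinates
# from pan/tilt telemetry and focal length. Illustrative maths only.
import math

def project(point_xyz, pan_deg, tilt_deg, focal_px, cx=960.0, cy=540.0):
    x, y, z = point_xyz
    # Rotate the world into camera coordinates: pan about the vertical axis...
    p = math.radians(pan_deg)
    x, z = x * math.cos(p) - z * math.sin(p), x * math.sin(p) + z * math.cos(p)
    # ...then tilt about the horizontal axis.
    t = math.radians(tilt_deg)
    y, z = y * math.cos(t) - z * math.sin(t), y * math.sin(t) + z * math.cos(t)
    if z <= 0:
        return None  # point is behind the camera, nothing to draw
    # Pinhole projection: a longer focal length (more zoom) magnifies motion.
    return (cx + focal_px * x / z, cy - focal_px * y / z)

# A point 50m straight down the optical axis lands at the image centre.
print(project((0, 0, 50), pan_deg=0, tilt_deg=0, focal_px=2000))
```

This is also why accurate telemetry matters: any error in the reported pan, tilt or zoom shifts every projected pixel, and the virtual object visibly ‘swims’ against the real scene.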

The talk finishes with Chris explaining how viewing habits are changing. Whilst we all have a sense that the younger generation watches less live TV, Chris has the stats showing the change: for people aged 66+, ‘traditional content’ comprises 82% of their viewing, but for 16-18 year olds it’s only 28%, with the majority of the remainder made up of SVOD and ‘YouTube etc.’.

Chris talks about newer cameras which have improved coverage, both by improving the technical quality of ‘lower tier’ productions and, for top-tier content, by adding cameras in locations that would otherwise not have been possible. He then shows there is an increase in HDR-capable cameras being purchased which, even when not being used to broadcast HDR, are valued for their ability to capture the best image possible. Finally, Chris returns to remote production, explaining broadcasters’ motivations such as reduced cost, improved work-life balance and more environmentally friendly coverage.

The video finishes with questions from the webinar audience.

Watch now!
Speakers

Michael Cole
Chief Technology Officer,
PGA European Tour & Ryder Cup Europe
Remo Ziegler
Vice President, Product Management, Sports,
Vizrt
Chris Evans
Senior Market Analyst,
Futuresource Consulting