Video: Football Production Technology: The Verdict


Football coverage of the main game is always advancing, but this year there have been big changes in production as well as a continued drive to bring second screens into the mainstream. This conversation covers the state of the art in football production, bringing together Mark Dennis of Sunset+Vine, Emili Planas from Mediapro and Tim Achberger from Sportcast, moderated by Sky Germany’s Alessandro Reitano for the SVG Europe Football Summit 2021.

The first topic discussed is the use of automation to drive highlights packages. Mark from S+V feels that for the tier 1 shows they do, human curation is still better, but recognises that the creation of secondary and tertiary video from an event could benefit from AI packages. In fact, Mediapro is doing just this, providing a file-based clips package while the match is still ongoing. This helps broadcasters use clips more quickly and also avoids post-match linear playouts. Tim suggests that AI has a role to play when dealing with 26 cameras, orchestrating the inputs and outputs of social media clips as well as providing specialised feeds. Sportcast are also using file delivery to facilitate secondary video streams during the match.


Answering the question “What’s missing from the industry?”, Mark asks if they can get more data and, in turn, how they can show it all. His point is that there are still many opportunities to use data, like BT Sport’s current ability to show the speed of players. He feels this works best on the second screen, but also sees a place for increasing the data available to fans in the stadium. Emili wants better data-driven content creation tools and ways to identify which data is relevant. Tim agrees that data is important and, in common with Emili, says that the data feeds underpin the AI workflows’ ability to classify and understand clips. He sees this as an important part of filtering through the 26 cameras to find the ones people actually want to see.
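The idea of data feeds helping an AI workflow pick relevant cameras can be sketched very simply: score each feed against a match-data event and sort. The metadata fields, event shape and weights below are invented for illustration; they are not Sportcast’s or Mediapro’s actual schema.

```python
def rank_cameras(cameras, event):
    """Toy sketch: score each camera feed against a match-data event so
    a workflow can surface the most relevant angles first. Fields and
    weights are hypothetical, purely for illustration."""
    def score(cam):
        s = 0
        if event["player"] in cam["players_in_shot"]:
            s += 2                        # the involved player is in shot
        if cam["covers_zone"] == event["zone"]:
            s += 1                        # camera covers where it happened
        return s
    return sorted(cameras, key=score, reverse=True)

cameras = [
    {"id": 14, "players_in_shot": set(), "covers_zone": "midfield"},
    {"id": 3, "players_in_shot": {"Kane"}, "covers_zone": "box"},
]
event = {"type": "goal", "player": "Kane", "zone": "box"}
best = rank_cameras(cameras, event)[0]    # camera 3 scores highest here
```

A real system would, of course, combine many more signals (tracking data, shot framing, audio levels), but the filtering principle is the same.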

Alessandro explains he feels that focus is moving from the main 90 minutes to the surrounding storylines. Not in a way that detracts from the main game, but in a way that shows production is taking the pre- and post-match stories seriously and harnessing technology to exploit the many avenues available to tell those stories and show footage that otherwise wouldn’t have space to be seen.

The discussion turns to drones and other special camera systems, asking how they fit in. Tim says that drones have been seen as a good way to differentiate your product and, without Covid restrictions, could be further exploited. Tim feels that special cameras should be used more in post and secondary footage, wondering if there could be two world feeds: one with a more traditional ‘Camera 1’ approach and another which, much more progressively, includes a lot of newer camera types. Emili follows on by talking about Mediapro’s ‘Cinecam’, which uses a Sony Venice camera to switch from normal Steadicam footage during the match to a shallow depth-of-field, DSLR-style look post-match, giving the celebrations a different, more cinematic feel with the focus leading the viewer to the action.

The panel finishes by discussing the role of 5G. Emili sees it as a benefit to production and a way to increase consumer viewing time. He sees opportunities for 5G to replace satellite and to help move production into the cloud for tier 2 and 3 sports. Viewers at home may be able to watch matches in better quality, and in stadiums the plan is to offer data-enriched services to fans so they can analyse what’s going on and have a better experience than at home. Mark at S+V sees network slicing as the key technology, giving production the confidence that they will have the bandwidth they need on the day. 5G will reduce costs, and he’s hoping it may enhance remote production for staff at home whose internet isn’t great, bringing more control and assuredness to their connectivity.

Watch now!
Speakers

Tim Achberger
Head of Innovation & Technology,
Sportcast
Emili Planas
CTO and Operations Manager,
Mediapro
Mark Dennis
Director of Technical Operations,
Sunset+Vine
Moderator: Alessandro Reitano
SVP of Sports Production,
Sky Germany

Video Case Study: How BT Sport de-centralised its football production

We’ve all changed the way we work during the pandemic, some more than others. There’s nothing better than a real-life case study to learn from and to put your own experience into perspective. In this video, BT Sport and their technology provider Timeline TV take us through what they have and haven’t done to adapt.

Jamie Hindhaugh, COO of BT Sport, explains that they didn’t see working at home as simply a decentralisation, but rather a centralisation of the technology to be used by a decentralised body of staff. This concept is similar to Discovery’s recent Eurosport IP transformation project, which has all participating countries working from equipment in two datacentres. BT Sport managed to move from a model of two to three hundred people in the office daily to producing a live football talk show from presenters’ homes, with broadcast staff also at home, in only 10 days. The workflow continued to be improved over the following six weeks, at which point they felt they had migrated to an effective ‘at home’ workflow.


Speaking to the challenges, Dan McDonnell, CEO of Timeline TV, said that basic acquisition and distribution of equipment like laptops was tricky since everyone else was doing the same. But once the equipment was in staff homes, they soon found the problems that come with moving out of a generator-backed broadcast facility. UPSes were distributed to those that needed them, but Dan notes there was nothing they could do to help with the distraction of working around your children and/or pets.

Jamie comments that connectivity is very important and they are moving forward with a strategy called ‘working smart’ which is about giving the right tools to the right people. It’s about ensuring people are connected wherever they are and with BT Sport’s hubs around the country, they are actively looking to provide for a more diverse workforce.

BT Sport has a long history of using remote production, Dan points out, which has driven BT Sport’s recent decision to move to IP in Stratford. Premiership games have changed from needing a main and backup feed to needing 20 cameras coming into the building, and this density of circuits in both HD and UHD has made SDI less and less practical. Jamie highlights the importance of their remote production heritage but adds that the pandemic pushed them way beyond normal remote productions, since scheduling and media workflows, which would normally have stayed in the building, also had to be remote.

Dan says that the perspective has changed from seeing production as either a ‘studio’ or ‘remote OB’ production to allowing either type of production to pick and choose the best combination of on-site roles and remote roles. Dan quips that they’ve been forced to ‘try them all’ and so have a good sense of which work well and which benefit from on-site team working.

Watch now!
Speakers

Dan McDonnell
CEO,
Timeline TV
Jamie Hindhaugh
COO,
BT Sport
Moderator: Heather McLean
Editor,
SVG Europe

Video: Decentralised Production Tips and Best Practices

Live sports production has seen massive change during COVID. We recently looked at how this changed in the MCR on The Broadcast Knowledge, hearing how Sky Sports had radically changed along with Arsenal TV. This time we look at how life in the truck has changed. The headline is that most people are staying at home, so how do you keep people at home and still mix a multi-camera event?

Ken Kerschbaumer from Sports Video Group talks to VidOvation’s Jim Jachetta and James Japhet from Hawk-Eye to understand the role they’ve been playing in bringing live sports to the screen where the REMI/Outside Broadcast has been pared down to the minimum and most staff are at home. The conversation starts with the backdrop of The Players Championship, part of the PGA Tour, which was produced by 28 operators in the UK who mixed 120+ camera angles and the audio to produce 25 live streams, including graphics, for broadcasters around the world.

Lip-sync and genlock aren’t optional when it comes to live sports. Jim explains that his equipment can handle up to fifty cameras with genlock synchronisation over bonded cellular, and this is how The Players worked, with bonded cellular on each camera. Jim notes that audio also has to be frame-accurate, as they had many, many mics always open going back to the sound mixer at home.
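The core requirement behind frame-accurate sync over bonded cellular is that every feed must be padded out to match the slowest arrival, in whole frames. A minimal sketch of that arithmetic (purely illustrative; not VidOvation’s actual method, and real systems do this continuously against timecode):

```python
import math

def align_delays(latencies_ms, frame_rate=50.0):
    """Frames of delay to add to each feed so all feeds line up with
    the slowest one -- the basic requirement behind frame-accurate
    synchronisation of bonded-cellular cameras. Numbers are illustrative."""
    frame_ms = 1000.0 / frame_rate
    worst = max(latencies_ms)
    return [math.ceil((worst - lat) / frame_ms) for lat in latencies_ms]

# Three cameras arriving with different bonded-cellular delays (ms):
pads = align_delays([400, 480, 520], frame_rate=50)  # frames to add per feed
```

With a 50 fps frame of 20 ms, the fastest feed here gets six frames of padding and the slowest none, so all three cut together cleanly.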

James from Hawk-Eye explained that part of their decision to leave equipment on-site was due to lip-sync concerns. Their system worked differently to VidOvation’s, allowing people to ‘remote desktop’ in, using a Hawk-Eye-specific low-latency technology dedicated to video transport. This also works well for events where there isn’t enough connectivity to stream 10, 20 or 50+ feeds from the venue to different locations.

The production has to change to take account of two factors: the chance a camera’s connectivity might go down, and latency. It’s important to plan shots ahead of time to account for these factors, outlining what the backup plan is, say going to a wide shot on camera 3 if camera 1 can’t be used. When working with bonded cellular, latency is an unavoidable factor and can be as high as 3 seconds. In this scenario, Jim explains, it’s important to tell the camera operators what you’re looking for in a shot and let them work more autonomously than they traditionally would.

Latency is also very noticeable for the camera shaders, who usually rack cameras with milliseconds of latency. CCUs are not used to waiting a long time for responses, so a lot of faked messages need to be sent to keep the CCU and controller happy. The shader then needs to get used to the latency, which won’t be as high as the video latency, and take things a little slower in order to get the job done.
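The ‘faked messages’ trick described here is essentially a local proxy: acknowledge the control panel immediately so it doesn’t time out, while the real command crosses the high-latency link. A toy sketch of that idea, assuming an invented text protocol (real CCU protocols are vendor-specific and typically binary):

```python
import queue
import threading

class CcuProxy:
    """Toy sketch: ack the control panel locally right away, then
    deliver the real command after a simulated link delay. Protocol
    strings and timings here are hypothetical."""

    def __init__(self, link_delay_s=0.25):
        self.link_delay_s = link_delay_s
        self.delivered = queue.Queue()   # commands as they reach the camera
        self.panel_acks = []             # immediate local acknowledgements

    def send(self, command):
        self.panel_acks.append(f"ACK {command}")   # keep the panel happy now
        threading.Timer(self.link_delay_s,         # simulate the slow link
                        self.delivered.put, args=(command,)).start()

proxy = CcuProxy(link_delay_s=0.05)
proxy.send("IRIS +0.5")
# The panel sees an ACK instantly; the camera gets the command ~50 ms later.
```

The same pattern (local acknowledgement, deferred delivery) is common wherever chatty control protocols meet long round-trip times.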

Not travelling everywhere has been received fairly well by freelancers, who can now book in more jobs and don’t need to suffer reduced pay for travel days. There are still people travelling to site, Jim says, but usually people who can drive and who then sit in the control room with shields between them. For the PGA Tour, the savings are racking up. Whilst there are a lot of other costs and losses at the moment for so many industries, it’s clear that the reduced travel and hosting will continue to be beneficial after restrictions are lifted.

Watch now!
Speakers

Jim Jachetta
EVP & CTO, Wireless Video & Cellular Uplinks,
VidOvation
James Japhet
Managing Director,
Hawk-Eye North America
Ken Kerschbaumer
Editorial Director,
Sports Video Group

Video: Next-generation audio in the European market – The state of play

Next-generation audio refers to a range of new technologies which allow for immersive audio like 3D sound, increased accessibility, better personalisation, and anything else which delivers a step-change in the listener experience. NGA technologies can stand on their own but are often part of next-generation broadcast technologies like ATSC 3.0 or UHD/8K transmissions.

This talk from the Sports Video Group and Dolby presents one case study, from a few that happened in 2020, which delivered NGA over the air to homes. First, though, Dolby’s Jason Power brings us up to date on what NGA is and how it has been deployed so far.

Whilst ‘3D sound’ is an easy feature to understand, ‘increased personalisation’ is less so. Jason introduces ideas for personalisation such as choosing which team you’re interested in and getting a different crowd mix dependent on that, or changing which mics you hear, whether on the pitch or in the stands. The possibilities are vast, and we’re only just starting to experiment with what’s possible and determine what people actually want.

What can I do if I want to hear next-generation audio? Jason explains that four out of five TVs are now shipping with NGA audio and all of the five top manufacturers support at least one NGA technology. Such technologies include Dolby’s AC-4 and sADM. AC-4 allows delivery of Dolby Atmos, an object-based audio format which gives the receiver much more freedom to render the sound correctly based on the current speaker setup. Should you change how many speakers you have, the decoder can render the sound differently to ensure the ‘stereo’ image remains correct.
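The reason object-based audio can adapt to any speaker layout is that the decoder receives positions, not fixed channel feeds, and computes speaker gains itself. A minimal sketch of that idea using constant-power panning between a pair of speakers (the real AC-4/Atmos renderer is far more sophisticated; angles and layouts below are just examples):

```python
import math

def pan_gains(azimuth, spk_a, spk_b):
    """Constant-power gains for one audio object at `azimuth` degrees,
    rendered between speakers at angles spk_a < spk_b. A toy
    illustration of object-based rendering, not Dolby's algorithm."""
    f = (azimuth - spk_a) / (spk_b - spk_a)   # 0 at spk_a, 1 at spk_b
    return math.cos(f * math.pi / 2), math.sin(f * math.pi / 2)

# The same object at -10 degrees rendered to two different layouts:
stereo_pair = pan_gains(-10, -30, 30)    # L/R speakers at +/-30 degrees
wide_pair = pan_gains(-10, -110, 30)     # surround-left and front-right
```

Because the gains always satisfy a² + b² = 1, the object’s perceived level stays constant while its position tracks whatever speakers are actually present, which is exactly why rearranging your speakers doesn’t break the image.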

To find out more about the technologies behind NGA, take a look at this talk from the Telos Alliance.

Next, Matthieu Parmentier talks about the Roland Garros event in 2020, which was delivered using sADM plus Dolby AC-4. sADM is an open specification for metadata interchange, the aim of which is to help interoperability between vendors. The sADM metadata is embedded in the SDI and then transported uncompressed as SMPTE 302M.

ATEME’s Mickaël Raulet completes the picture by explaining their approach, which included setting up a full end-to-end system for testing and diagnosis. The event itself, we see, had three transmission paths: an SDR satellite backup and two feeds into the DVB-T2 transmitter at the Eiffel Tower.

The session ends with an extensive Q&A session where they discuss the challenges they faced and how they overcame them as well as how their businesses are changing.

Watch now!
Speakers

Jason Power
Senior Director of Commercial Partnerships & Standards,
Dolby
Mickaël Raulet
Vice President of Innovation,
ATEME
Matthieu Parmentier
Head of Data & Artificial Intelligence,
France Télévisions
Moderator: Roger Charlesworth
Charlesworth Media