Video: The Future of Live HDR Production

HDR has long been hailed as the best way to improve the image delivered to viewers because it packs a punch whatever the resolution. Usually combined with a wider colour gamut, it brings brighter highlights and a wider range of colours that can be more saturated. Whilst the technology has been in TVs for a long time now, it has continued to evolve, and it turns out that a full, top-tier production in HDR isn't trivial, so broadcasters have been working for a number of years to understand the best way to deliver HDR material for live sports.

Leader has brought together a panel of people who have all cut their teeth implementing HDR in their own productions and ‘written the book’ on HDR production. The conversation starts with the feeling that HDR has now ‘arrived’: massive one-off shows, as well as consistent weekly matches, are produced in HDR far more routinely than before.

Pablo Garcia Soriano from CROMORAMA introduces us to light theory, talking about our eyes’ non-linear perception of brightness. This leads to a discussion of what ‘scene-referred’ vs ‘display-referred’ HDR means: whether you interpret the video as describing the brightness your display should generate or the brightness of the light going into the camera. For more on colour theory, check out this detailed video from CVP or this one from SMPTE.

Pablo finishes by explaining that when you have four different deliverables including SDR, S-Log3, HLG and PQ, the only way to make this work, in his opinion, is by using scene-referred video.
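
To make the scene-referred idea concrete, here’s a minimal sketch (Python, standard library only, not from the talk) of the HLG OETF defined in ITU-R BT.2100, which maps normalised scene-linear light from the camera to a signal value; a display-referred format such as PQ instead describes the light the display should emit, which is why Pablo sees scene-referred video as the natural master from which to derive the other deliverables.

```python
import math

def hlg_oetf(E: float) -> float:
    """ITU-R BT.2100 HLG OETF: map normalised scene-linear light E in [0, 1]
    to a non-linear signal value in [0, 1] (a scene-referred encoding)."""
    a = 0.17883277
    b = 1.0 - 4.0 * a
    c = 0.5 - a * math.log(4.0 * a)
    if E <= 1.0 / 12.0:
        return math.sqrt(3.0 * E)
    return a * math.log(12.0 * E - b) + c

# The square-root region at low light and the log region above 1/12 mirror
# the eye's non-linear perception of brightness that Pablo describes.
for E in (0.0, 1.0 / 12.0, 0.5, 1.0):
    print(f"scene light {E:.3f} -> HLG signal {hlg_oetf(E):.3f}")
```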

Next to present is Prin Boon from PHABRIX, who relates his experiences in 2019 working on live football and rugby. These shows had 2160p50 HDR and 1080i25 SDR deliverables for the main BT programme and the world feed, plus feeds for third parties such as the jumbotron, VAR, BT Sport’s studio and the EPL.

2019, Prin explains, was a good year for HDR: TVs and tablets were properly available in the market and, behind the scenes, Steadicam now had compatible HDR rigs, radio links could be 10-bit, and replay servers also ran in 10-bit. In order to produce an HDR programme, it’s important to look at all the elements; if only your main stadium cameras are HDR, you soon find that much of the programme is actually SDR-originated. It’s vital to get HDR into each camera and replay machine.

Prin found that ‘closed-loop SDR shading’ was the only workable approach that allowed them to produce a top-quality SDR product which, as Kevin Salvidge reminds us, is still the one that earns the most money. Prin explains what this looks like in practice; in summary, all monitoring is done in SDR even though it’s derived from the HDR video.

In terms of tips and tricks, Prin warns about being careful with nomenclature, not only in your own operation but also in vendor-specified products. He gives the example of ‘gain’, which can be applied either as a percentage or in dB, in either the light or the code domain, with every permutation giving a different result. Additionally, he cautions that multiple trips to and from HDR/SDR will lead to quantisation artefacts and should be avoided where not necessary.
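
As a rough illustration of Prin’s nomenclature warning (our own sketch, not an example from the talk), the snippet below shows how a nominal ‘+6’ of gain lands in quite different places depending on whether it’s read as dB or as a percentage, and whether it’s applied to linear light or to gamma-encoded code values; the 2.4 power law is assumed purely for illustration.

```python
# Illustration only: how a nominal "+6" gain changes meaning with units and domain.
GAMMA = 2.4  # illustrative power law for encoding, not any specific standard curve

def encode(linear: float) -> float:
    return linear ** (1.0 / GAMMA)

linear_in = 0.18                  # an 18% scene-linear level
code_in = encode(linear_in)

db_gain = 10 ** (6.0 / 20.0)      # +6 dB as a voltage/code-style ratio, ~1.995x
pct_gain = 1.06                   # +6% as a simple multiplier

results = {
    "+6 dB applied to linear light": encode(linear_in * db_gain),
    "+6 dB applied to code values":  code_in * db_gain,
    "+6% applied to linear light":   encode(linear_in * pct_gain),
    "+6% applied to code values":    code_in * pct_gain,
}
for label, value in results.items():
    print(f"{label}: encoded output {value:.3f}")
```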
 

 
The last presentation comes from Chris Seeger and Michael Drazin of NBC Universal, who talk about the upcoming Tokyo Olympics, where they’re taking the view that SDR should look the ‘same’ as HDR. To this end, they’ve done a lot of work creating LUTs (Look-Up Tables) which allow conversion between formats. Created in collaboration with the BBC and other organisations, these LUTs are now being made available to the industry at large.

They use HLG as their interchange format, with camera inputs being scene-referred but delivery to the home being display-referred PQ. They explain that this actually allows them to maintain more than 1,000 nits of HDR detail. Their shaders work in HDR, unlike the UK-based workflow discussed earlier. NBC found that the HDR and SDR outputs of the CCU didn’t match, so the HDR is converted to SDR using the NBC LUTs. They caution to watch out for the different primaries of BT.709 and BT.2020: some software doesn’t convert the primaries, and the colours are therefore shifted.
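
The primaries point is easy to see in numbers. Below is a small sketch (Python with NumPy, not NBC’s actual LUTs) applying the linear-light RGB conversion matrix commonly quoted from ITU-R BT.2087 for carrying BT.709 material into a BT.2020 container; skip that 3×3 step and a saturated BT.709 red is simply re-interpreted against the wider BT.2020 primaries, which is exactly the colour shift the NBC team warns about.

```python
import numpy as np

# Linear-light RGB-to-RGB matrix for BT.709 -> BT.2020, values as quoted in ITU-R BT.2087.
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

pure_709_red = np.array([1.0, 0.0, 0.0])   # saturated red, linear light, BT.709 primaries

converted = M_709_TO_2020 @ pure_709_red    # correct: re-expressed in BT.2020 primaries
mislabelled = pure_709_red                  # incorrect: same values, primaries ignored

print("BT.709 red correctly converted to BT.2020:", converted)    # ~[0.627, 0.069, 0.016]
print("Same values mislabelled as BT.2020:      ", mislabelled)   # appears over-saturated
```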

NBC Universal put a lot of time into creating their own objective visualisation and measurement system to be able to fully analyse the colours of the video as part of their goal to preserve colour intent, even going as far as to create their own test card.

The video ends with an extensive Q&A session.

Watch now!
Speakers

Chris Seeger
Office of the CTO, Director, Advanced Content Production Technology
NBC Universal
Michael Drazin
Director, Production Engineering and Technology,
NBC Olympics
Pablo Garcia Soriano
Colour Supervisor, Managing Director
CROMORAMA
Prinyar Boon
Product Manager, SMPTE Fellow
PHABRIX
Moderator: Ken Kerschbaumer
Editorial Director,
Sports Video Group
Kevin Salvidge
European Regional Development Manager,
Leader

Video: ST 2110 The Future of Live Remote Production

Trying to apply the SMPTE ST 2110 hype to the reality of your equipment? This video is here to help. There are many ‘benefits’ of IP which are bandied about, yet it’s almost impossible to realise them all in one company. For the early adopters, there’s usually one benefit that has been the deal-maker, with other benefits helping boost confidence. Smaller broadcast companies, however, can struggle to reach the scale needed for cost savings, don’t require as much flexibility and can’t justify the scalability. But as switches get cheaper and ST 2110 support continues to mature, it’s clear that we’re beyond the early-adopter phase.

This panel gives context to ST 2110 and advises on ways to ‘get started’ and skill up. Moderated by Ken Kerschbaumer from the Sports Video Group, the panel brings together Leader’s Steve Holmes, Prinyar Boon from PHABRIX, Arista colleagues Gerard Phillips and Robert Welch, and Bridge Technologies’ chairman Simen Frostad.

The panel quickly starts giving advice. Under the mantra ‘no packet left behind’, Gerard explains that, to him, COTS (Commercial Off The Shelf) means a move to enterprise-grade switches ‘if you want to sleep at night’. Compared to SDI, the move to IT can bring cost savings, but don’t skimp on your switch infrastructure if you want a good-quality product. Simen was pleased to welcome 2110 as he appreciated the almost instant transmission that analogue gave; the move to digital added a lot of latency, even in the SDI portions of the chain, thanks to frame syncs. ST 2110, he says, allows us to get back, most of the way, to no-latency production. He’s also pleased to bid goodbye to embedded data.

The panel’s next, reassuring message is that it is possible to start small. The trick is to start with an island of 2110 and do your learning there. Prinyar lifts up a tote bag, saying he has a 2110 system that fits inside it and takes just 10 minutes to get up and running: with two switches, a couple of PTP grandmasters and some 2110 sources, you have what you need to start a small system. There is free software that can help you learn: Easy-NMOS is a quick-to-deploy NMOS package that gives you the basics to get a system up and running, and you can test NMOS APIs for free with AMWA’s testing tool. The EBU’s LIST project is a suite of software tools that help to inspect, measure and visualise the state of IP-based networks and the high-bitrate media traffic they carry, and there’s also SDPoker, which lets you test ST 2110 SDP files. So whilst there are some upfront costs, getting the learning, experience and understanding you need to make decisions on your ST 2110 trajectory is cost-effective, and the kit can form part of your staging/test system should you decide to proceed with a project.
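
To give a feel for what these tools work on, here’s an illustrative sketch of the kind of ST 2110-20 SDP description that SDPoker or the EBU LIST tools inspect; the addresses, port and PTP grandmaster ID are made-up example values, and the trivial check at the end is nothing like a full validation.

```python
# Illustration only: a minimal ST 2110-20 style SDP with made-up addresses,
# port and PTP grandmaster ID, of the sort that SDPoker validates.
EXAMPLE_SDP = """\
v=0
o=- 1443716955 1443716955 IN IP4 192.168.1.10
s=Example ST 2110-20 video stream
t=0 0
m=video 50000 RTP/AVP 96
c=IN IP4 239.100.9.10/64
a=rtpmap:96 raw/90000
a=fmtp:96 sampling=YCbCr-4:2:2; width=1920; height=1080; depth=10; exactframerate=50; colorimetry=BT709; PM=2110GPM; SSN=ST2110-20:2017; TP=2110TPN
a=ts-refclk:ptp=IEEE1588-2008:39-A7-94-FF-FE-07-CB-D0:127
a=mediaclk:direct=0
"""

# A trivial presence check, nothing like a full SDPoker run.
required = ("a=rtpmap:96 raw/90000", "sampling=YCbCr-4:2:2", "a=ts-refclk:ptp")
for token in required:
    print(f"{token!r} present:", token in EXAMPLE_SDP)
```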

The key here is to find your island project. For larger broadcasters or OB companies, a great island is to build an IP OB truck. IP has some big benefits for OB Trucks as we heard in this webinar, such as weight reduction, integration with remote production workflows and scalability to ‘any size’ of event. Few other ‘islands’ are able to benefit in so many ways, but a new self-op studio or small control room may be just the project for learning how to design, install, troubleshoot and maintain a 2110 system. Prinyar cautions that 2110 shouldn’t be just about moving an SDI workflow into IP. The justification should be about improving workflows.

Remote control is a big motivator for the move to ST 2110. Well before the pandemic, Discovery chose 2110 for their Eurosport production infrastructure, allowing them to centralise equipment in two European locations while it is controlled from production centres in countries around Europe. During the pandemic, we’ve seen that the ability to create new connections without having to physically install new SDI is incredibly useful. Off the back of remote control of resources, some companies are finding they are able to use operators in locations where the hourly rate is lower.

Before a Q&A, the panel addresses training. From one quarter we hear that ensuring your home networking knowledge is sound (DHCP, basic IP addressing) is a great start, and that you can get across the knowledge needed in very little time. Prinyar says that he took advantage of a SMPTE Virtual Classroom course teaching the CCNA, whilst Robert from Arista says that there’s a lot in the CCNA that isn’t very relevant. The Q&A covers 2110 over the WAN, security, hardware life cycles and reducing the carbon footprint of production.

Watch now!
Speakers

Steve Holmes
Applications Engineer,
Leader
Prinyar Boon
Product Manager,
PHABRIX
Gerard Phillips
Systems Engineer,
Arista
Simen Frostad
Chairman,
Bridge Technologies
Robert Welch
Technical Solutions Lead,
Arista
Moderator: Ken Kerschbaumer
Chair & Editorial Director,
Sports Video Group

Video: Winner takes all: Unlocking the opportunity in video games and esports.

Even without the pandemic, esports was set to continue its growth through 2020. By the end of 2020, esports had received quite a boost while other sports were cancelled. And whilst esports is a large market, it’s still often misunderstood by those unfamiliar with it. This panel recently looked not only at how Covid had changed esports but also at how traditional broadcasters can engage with this popular entertainment segment.

The session starts with an overview of the Asian esports market from Daniel Ahmad of Niko Partners. In 2019 there were 1.3 billion gamers across the whole market. In China, there were 321 million PC gamers who spent around $14.6 billion, plus a mobile gaming population which, by 2024, will have doubled its spending to $32 billion across 737 million gamers.

With esports clearly on the rise, the Sports Video Group’s Jason Dachman has brought together some of the key players in esports: Anna Lockwood from Telstra, Steven Jalicy from ESL, David Harris from Guinevere Capital and Yash Patel from Telstra Ventures. Straight off the bat, they tackle the misconceptions that mainstream media has regarding esports. Steven from ESL says people are quick to dismiss the need for quality in esports; in some ways, he says, the quality demands are higher. David Harris says that people overstate esports’ size today and underestimate how big it will be in the future. Anna Lockwood, on the other hand, sees that people don’t realise how different and powerful the stories told in esports are.

Asked how Covid changed ESL’s plans in 2020, Steven explained that, at the final count, they had actually done more events than the year before. ESL had already switched to remote working for much of the technical operation in 2018, at the time seen as quite a forward-thinking move. Covid forced the rest of the workflows to change as stadium appearances were cancelled and gamers competed remotely. Fortunately, the nature of esports makes it relatively easy to move the players. Post-Covid, Steven says that arenas will be back as they are very popular and an obvious focus for tournaments; seeing players in the flesh is an important part of being a fan. But many of the technical changes are likely to stay, at least in part.

Jason Dachman asks the panel why esports on linear TV hasn’t been very successful. Many of the panellists agree that the core fans simply aren’t that interested in watching on linear TV as they already have a streaming setup which often suits them much better. After a question from the audience, their suggestion for incorporating linear TV into esports is to acknowledge that you’re talking to a group of people who are interested but may know very little, possibly nothing at all. Linear TV is a great place for documentaries and magazine shows which can educate the audience about the different aspects of esports and help them relate; for instance, a FIFA or NBA esports tournament is easier to understand than a Magic: The Gathering or League of Legends tournament. Linear TV can also spend time focussing on the many stories in esports, both in-game and out. Lastly, esports can be a conduit for traditional broadcasters to bring people onto their digital offerings. As an example, the BBC have an online-only channel, BBC Three. By linking esports content across BBC Two and BBC Three, they can encourage interested viewers of their broadcast channel to try their online channel, while also appealing to core esports fans on the digital-only channel.

Other questions from the audience included the panel’s opinions on VR in esports, the use of AI, how to start working in esports, and whether it’s easier to bring esports engineers into broadcast or the other way round. The session finished with a look ahead to the rest of 2021. Thoughts included the introduction of bargaining agreements, salary caps and more APIs for data exchange, and that what we saw in 2020 was a knee-jerk reaction to a new problem; 2021 will see real innovation around staying remote and improving streams for producers and, most importantly, the fans.

Watch now!
Speakers

David Harris
Managing Director,
Guinevere Capital
Steven Jalicy
Global Head of Streaming,
ESL Gaming
Anna Lockwood
Head of Global Sales,
Telstra Broadcast Services
Yash Patel
General Partner,
Telstra Ventures
Moderator: Jason Dachman
Chief Editor,
Sports Video Group

Video: Decentralised Production Tips and Best Practices

Live sports production has seen a massive change during COVID. We recently looked on The Broadcast Knowledge at how this has changed the MCR, hearing how Sky Sports, along with Arsenal TV, had radically changed. This time we look at how life in the truck has changed. The headline is that most people are staying at home, so how do you keep people at home and still mix a multi-camera event?

Ken Kerschbaumer from the Sports Video Group talks to VidOvation’s Jim Jachetta and James Japhet from Hawk-Eye to understand the role they’ve been playing in bringing live sports to the screen where the REMI/outside broadcast has been pared down to the minimum and most staff are at home. The conversation starts with the backdrop of The Players Championship, part of the PGA Tour, which was produced by 28 operators in the UK who mixed 120+ camera angles and the audio to produce 25 live streams, including graphics, for broadcasters around the world.

Lip-sync and genlock aren’t optional when it comes to live sports. Jim explains that his equipment can handle up to fifty cameras with genlock synchronisation over bonded cellular, and this is how The Players worked, with a bonded cellular unit on each camera. Jim discusses how the audio also has to be frame-accurate, as they had many mics open at all times, all going back to the sound mixer at home.

James from Hawk-Eye explained that part of their decision to leave equipment on-site was due to lip-sync concerns. Their system worked differently to VidOvation’s, allowing people to ‘remote desktop’ in using a Hawk-Eye-specific low-latency technology dedicated to video transport. This also works well for events where there isn’t enough connectivity at the venue to support streaming 10, 20 or 50+ feeds out to different locations.

The production has to change to take account of two factors: the chance a camera’s connectivity might go down and latency. It’s important to plan shots ahead of time to account for these factors, outlining what the backup plan is, say going to a wide shot on camera 3, if camera 1 can’t be used. When working with bonded cellular, latency is an unavoidable factor and can be as high as 3 seconds. In this scenario, Jim explains it’s important to explain to the camera operators what you’re looking for in a shot and let them work more autonomously than you might traditionally do.

Latency is also very noticeable for the camera shaders, who usually rack cameras with milliseconds of delay. CCUs are not used to waiting a long time for responses, so a lot of faked messages need to be sent to keep the CCU and controller happy. The shader operator then needs to get used to the latency, which won’t be as high as the video latency, and take things a little slower in order to get the job done.
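
Those ‘faked messages’ amount to a local proxy that answers the control surface’s time-critical handshakes immediately while the real commands ride the high-latency link. The sketch below uses hypothetical class and method names rather than any vendor’s actual CCU protocol, but it shows the shape of the idea.

```python
import queue
import threading
import time

class CCUProxy:
    """Hypothetical sketch: acknowledge control commands locally so the panel
    doesn't time out, while forwarding them over a slow remote link."""

    def __init__(self, remote_send, link_delay_s: float = 0.25):
        self.remote_send = remote_send      # callable that talks to the far-end CCU
        self.link_delay_s = link_delay_s    # one-way latency of the remote link
        self._outbox = queue.Queue()
        threading.Thread(target=self._forward_loop, daemon=True).start()

    def handle_command(self, command: str) -> str:
        """Called by the local control panel; returns an immediate 'ack'
        so the panel believes it is talking to a local CCU."""
        self._outbox.put(command)
        return f"ACK {command}"             # faked, low-latency response

    def _forward_loop(self):
        while True:
            command = self._outbox.get()
            time.sleep(self.link_delay_s)   # stand-in for the real transport delay
            self.remote_send(command)       # real command arrives later at the CCU

# Usage sketch: an iris move is acknowledged instantly, applied ~250 ms later.
proxy = CCUProxy(remote_send=lambda cmd: print("remote CCU applied:", cmd))
print(proxy.handle_command("iris +0.5"))
time.sleep(0.5)                             # give the forwarder time to run
```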

Not travelling everywhere has been received fairly well by freelancers, who can now book more jobs and don’t need to suffer reduced pay for travel days. There are still people travelling to site, Jim says, but usually people who can drive there and who then sit in the control room separated by shields. For the PGA Tour, the savings are racking up. Whilst so many industries are facing plenty of other costs and losses at the moment, it’s clear that the reduced travel and hosting will continue to be beneficial after restrictions are lifted.

Watch now!
Speakers

Jim Jachetta
EVP & CTO: Wireless Video & Cellular Uplinks
VidOvation
James Japhet
Managing Director
Hawk-Eye North America
Ken Kerschbaumer
Editorial Director,
Sports Video Group