Video: The Future of Live HDR Production

HDR has long been hailed as the best way to improve the image delivered to viewers because it packs a punch whatever the resolution. Usually combined with a wider colour gamut, it brings brighter highlights and more colours, with the ability for them to be more saturated. Whilst the technology has been in TVs for a long time now, it has continued to evolve, and it turns out that a full, top-tier production in HDR isn't trivial, so broadcasters have been working for a number of years to understand the best way to deliver HDR material for live sports.

Leader has brought together a panel of people who have all cut their teeth implementing HDR in their own productions and 'writing the book' on HDR production. The conversation starts with the feeling that HDR is 'there' now: massive shows, as well as consistent weekly matches, are being produced in HDR much more routinely than before.

Pablo Garcia Soriano from CROMORAMA introduces us to light theory, talking about our eyes' non-linear perception of brightness. This leads to a discussion of what 'scene-referred' vs 'display-referred' HDR means: whether you interpret the video as describing the brightness your display should generate, or the brightness of the light going into the camera. For more on colour theory, check out this detailed video from CVP or this one from SMPTE.
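To make that non-linear perception concrete, here is a minimal sketch of the HLG opto-electronic transfer function (OETF) as defined in ITU-R BT.2100, which allocates signal values roughly in line with how we perceive brightness: half the signal range is used for just a twelfth of the scene light.

```python
import math

# HLG OETF constants from ITU-R BT.2100
A = 0.17883277
B = 0.28466892   # = 1 - 4*A
C = 0.55991073   # = 0.5 - A*ln(4*A)

def hlg_oetf(e):
    """Map normalised scene-linear light e (0..1) to an HLG signal value (0..1)."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)          # square-root segment for shadows
    return A * math.log(12.0 * e - B) + C  # logarithmic segment for highlights

# Half signal level is reached at only 1/12 of peak scene light,
# mirroring the eye's greater sensitivity in the shadows.
print(hlg_oetf(1.0 / 12.0))  # 0.5
```

Note the input here is scene light, not display light: HLG's OETF is scene-referred, which is exactly the distinction Pablo is drawing.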

Pablo finishes by explaining that when you have four different deliverables including SDR, S-Log3, HLG and PQ, the only way to make this work, in his opinion, is by using scene-referred video.

Next to present is Prin Boon from PHABRIX, who relates his experiences in 2019 working on live football and rugby. These shows had 2160p50 HDR and 1080i25 SDR deliverables for the main BT programme and the world feed, plus feeds for third parties such as the jumbotron, VAR, BT Sport's studio and the EPL.

2019, Prin explains, was a good year for HDR: TVs and tablets were properly available in the market and, behind the scenes, Steadicam now had compatible HDR rigs, radio links could be 10-bit and replay servers also ran in 10-bit. In order to produce an HDR programme, it's important to look at all the elements; if only your main stadium cameras are HDR, you soon find that much of the programme is actually SDR-originated. It's vital to get HDR into each camera and replay machine.

Prin found that 'closed-loop SDR shading' was the only workable approach that allowed them to produce a top-quality SDR product which, as Kevin Salvidge reminds us, is still the one that earns the most money. Prin explains what this looks like but, in summary, all monitoring is done in SDR even though it's based on the HDR video.

In terms of tips and tricks, Prin warns about being careful with nomenclature, not only in your own operation but also in vendor-specified products, giving the example of 'gain', which can be applied either as a percentage or in dB, in either light or code space, with each permutation giving a different result. Additionally, he cautions that multiple round trips between HDR and SDR will lead to quantisation artefacts and should be avoided when not necessary.

The last presentation is from Chris Seeger and Michael Drazin of NBC Universal, who talk about the upcoming Tokyo Olympics, where they're taking the view that SDR should look the 'same' as HDR. To this end, they've done a lot of work creating LUTs (Look-Up Tables) which allow conversion between formats. Created in collaboration with the BBC and other organisations, these LUTs are now being made available to the industry at large.

They use HLG as their interchange format, with camera inputs being scene-referred but delivery to the home being display-referred PQ. They explain that this actually allows them to maintain more than 1,000 nits of HDR detail. Their shaders work in HDR, unlike the UK-based work discussed earlier. NBC found that the HDR and SDR out of the CCU didn't match, so the HDR is converted to SDR using the NBC LUTs. They caution to watch out for the different primaries of BT 709 and BT 2020: some software doesn't convert the primaries, and the colours are therefore shifted.
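A minimal sketch of that primaries conversion, using the standard linear-light BT.709-to-BT.2020 matrix from ITU-R BT.2087 (note it must be applied to linear RGB, not gamma-encoded values; skipping this step is the colour-shift the speakers warn about):

```python
# Linear-light RGB conversion matrix, BT.709 -> BT.2020 (ITU-R BT.2087)
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def convert_709_to_2020(rgb):
    """Convert a linear BT.709 RGB triplet into the BT.2020 container."""
    return [sum(row[i] * rgb[i] for i in range(3)) for row in M_709_TO_2020]

# White maps to white (each row sums to ~1.0)...
print(convert_709_to_2020([1.0, 1.0, 1.0]))
# ...but pure 709 red sits well inside the 2020 gamut, so software that
# skips this matrix will display visibly shifted, over-saturated colours.
print(convert_709_to_2020([1.0, 0.0, 0.0]))
```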

NBC Universal put a lot of time into creating their own objective visualisation and measurement system to be able to fully analyse the colours of the video as part of their goal to preserve colour intent even going as far as to create their own test card.

The video ends with an extensive Q&A session.

Watch now!
Speakers

Chris Seeger
Office of the CTO, Director, Advanced Content Production Technology
NBC Universal
Michael Drazin
Director Production Engineering and Technology,
NBC Olympics
Pablo Garcia Soriano
Colour Supervisor, Managing Director
CROMORAMA
Prinyar Boon
Product Manager, SMPTE Fellow
PHABRIX
Moderator: Ken Kerschbaumer
Editorial Director,
Sports Video Group
Kevin Salvidge
European Regional Development Manager,
Leader

Video: In Stadium Production Workflow and COVID 19

As COVID-19 has wrought many changes to society, this webinar looks at how it has changed the way live sports is produced by the in-stadium crews making the TV shows that cover the ups and downs of our beloved teams. We know that crews have had to be creative, but what has that actually looked like, and has it all been negative?

The SMPTE Philadelphia Section first invites Martin Otremsky to talk about how his work within MLB has changed. Before the beginning of the season, Martin and his team weren't allowed in the stadium and were given very little notice of the first game to prepare for.

Crowd noise was a big issue. Not only was there concern that the players would find a silent stadium off-putting, but it was soon realised that swearing and tactics chat could easily be picked up by other players and the mics. Bringing back crowd noise helped mask that and allowed the teams to talk normally.

The crowd noise was run off three iPads: one running a 3-hour loop of general crowd noise, and two with touch-screen buttons to trigger different types and moods of effects, covering the noises of anticipation and reaction, when the crowd would be at ease and when they would be 'pumped'. This was run on the fly by two operators who kept it going throughout the game. The crowd noise required a fair bit of fine-tuning, including getting the in-stadium acoustics right, as the speaker system is set up on the assumption that the stands are lined with absorbent people; without a crowd, the acoustics are much more echoey.


The COVID protections dealt with people in three tiers. Tier 1 was for the players and coaches, all of whom were tested every 48 hours, and covered the areas where they needed to be. Tier 3 was for people who could go anywhere not reserved for Tier 1. Tier 2 was one or two people who were allowed in both, but under extreme restrictions. As such, the crew in Tier 3 found it hard to do a lot of maintenance/configuration in certain places, as they had to leave every time Tier 1 staff needed to be in the area.

The operation itself had been pared down from 28 to 9 people, partly achieved by simplifying the programme itself. The ballpark often had to be flipped to accommodate another team using it as their 'home' stadium, which caused a lot of work: graphics had to be reworked to fit the in-stadium graphics boards, the crowd noise had to cheer for a different team and the video & production graphics had to be swapped out. Martin ends his presentation with a Q&A session.

Next up is Carl Mandell from the Philadelphia Union football/soccer team. They had been lucky enough to be at the end of a stadium technical refresh of their LED displays and their control room, moving up to 1080p60 HDR. Alas, the COVID restrictions hit just before the home opener, robbing them of the chance to use all the new equipment. Their new cameras, for instance, remained unused until right near the end of the season.


Unlike MLB, Carl explains, they chose not to play crowd audio in the stadium itself. They ran virtual graphics to fulfil contractual obligations and to brighten up the relatively empty stadium. Also unlike MLB, they were able to have a limited crowd in the stadium during most matches.

For the crowd noise used in the broadcast programme, they used audio taken from their ten-year archive of previous games and allowed the chants and reactions to be under the control of fans.

One COVID success was moving press conferences to Zoom. Whilst the spokespeople were actually in the stadium, with a PTZ camera in front of them and an 85″ TV to the side, all the media were brought in on Zoom. Carl says they found that this format produced much more engaging conferences, which increased their international appeal and raised the viewership of the press conferences.

Watch now!
Speakers

Carl Mandell
Director, Broadcasting & Video Production
Philadelphia Union
Martin Otremsky
Director of Video Engineering,
Philadelphia Phillies

Video: IP For Broadcast, Colour Theory, AI, VR, Remote Broadcast & More


Today's video has a wide array of salient topics from seven speakers at SMPTE Toronto's meeting in February. Covering uncompressed IP networking, colour theory & practice, real-time virtual studios and AI, those of us outside Toronto can be thankful it was recorded.

Ryan Morris from Arista (starting 22m 20s) is the first guest speaker and kicks off with a thought-provoker: looking at the uncompressed bandwidths of video, we see that even 8K video at 43Gb/s is much lower than the high-end network bandwidths in the 400Gb/s switch ports available today, with 800Gb/s arriving within a couple of years. That said, he gives us an introduction to two of the fundamental technologies enabling uncompressed IP video production: multicast and Software-Defined Networking (SDN).

Multicast, Ryan explains, is the system for efficiently distributing data from one source to many receivers. It allows a sender to send out only one stream even if there are a thousand receivers on the network; the network will split the feed at the nearest common point to each decoder. This is all worked out using the Internet Group Management Protocol (IGMP), which is commonly found in two versions, 2 and 3. IGMP enables routers to find out which devices are interested in which senders and allows devices to register their interest. This is all expressed by the notion of joining or leaving a multicast group. Each multicast group is assigned an IP address reserved by international agreement for this purpose; for instance, 239.100.200.1 is one such address.
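As a sketch of what 'joining a group' looks like at the API level (a hypothetical receiver for illustration, not from the talk; the port number is my assumption), a socket asks the operating system to send an IGMP membership report for the group address:

```python
import socket
import struct

GROUP = "239.100.200.1"  # multicast group address from the example above
PORT = 5004              # hypothetical media port for this illustration

def open_multicast_receiver(group, port):
    """Create a UDP socket subscribed to a multicast group.

    Setting IP_ADD_MEMBERSHIP makes the kernel emit an IGMP join,
    which is what tells upstream switches/routers to deliver the flow.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    req = struct.pack("4s4s", socket.inet_aton(group),
                      socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, req)
    return sock

# The 8-byte ip_mreq structure: group address, then local interface
# (0.0.0.0 lets the OS pick the interface).
mreq = struct.pack("4s4s", socket.inet_aton(GROUP),
                   socket.inet_aton("0.0.0.0"))
```

Leaving the group (IP_DROP_MEMBERSHIP) triggers the corresponding IGMP leave, at which point the network stops delivering the stream to that receiver.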

Ryan then explores some of the pros and cons of IGMP. Like most network protocols, each element of the network makes its own decisions based on standardised rules. Though this works well for autonomy, it means that there is no knowledge of the whole system. It can't take notice of link capacity, it doesn't know the source bandwidth and, while you can guess where media will flow, it's not deterministic. Broadcasters need more assurance of traffic flows for proper capacity planning, planned maintenance and post-incident root-cause analysis.

Reasons to consider SDN over IGMP

SDN is an answer to this problem. Replacing much of IGMP, SDN takes this micro-decision-making away from the switch architecture and replaces it with decisions made looking at the whole picture. It also brings an important abstraction layer back to broadcast networks; engineers are used to seeing X-Y panels and, in an emergency, it's this simplicity which gets things back on air quickly and effectively. With SDN doing the thinking, it's a lot more practical to program a panel with human names like 'Camera 1' and allow a take button to connect it to a destination.

Next is Peter Armstrong from THP, who talks about colour in television (starting 40m 40s). Starting back with NTSC, Peter shows the different colour spaces available from analogue through to SD, then HD with Rec 709, and now three newer spaces: XYZ for archival, which can represent any colour humans can see; DCI-P3 for digital cinema; and BT 2020, which comes with UHD. These latter colour spaces provide for the display of many more colours, adding to the idea of 'better pixels' – improving images by improving the pixels rather than just adding more.

Another 'better pixels' idea is HDR. Whilst BT 2020 is about Wide Colour Gamut (WCG), HDR increases the dynamic range so that each pixel can represent a brightness between, say, 0 and 1,000 nits instead of the current standard of 0 to 100. Peter outlines the HLG and PQ standards for delivering HDR. If you're interested in a deeper dive, check out our library of articles and videos, such as this talk from Amazon Prime Video or this one from Sarnoff's Norm Hurst.
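For a sense of how PQ allocates its signal range across that extended brightness, here is a sketch of the PQ inverse EOTF from SMPTE ST 2084 / ITU-R BT.2100, which maps absolute display luminance (up to 10,000 nits) to a 0–1 signal value:

```python
# PQ (SMPTE ST 2084 / ITU-R BT.2100) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_inverse_eotf(nits):
    """Map display luminance in nits (0..10000) to a PQ signal value (0..1)."""
    y = nits / 10000.0
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

# SDR peak white (100 nits) uses only about half the PQ signal range,
# leaving the rest for highlights all the way up to 10,000 nits.
print(round(pq_inverse_eotf(100), 3))  # ~0.508
print(pq_inverse_eotf(10000))          # 1.0
```

Because PQ describes the light the display should emit, it is display-referred, in contrast to the scene-referred HLG approach discussed earlier.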

ScreenAlign device from DSC Labs

SMPTE Fellow and founder of DSC Laboratories, David Corley (56m 50s), continues the colour theme, taking us on an enjoyable history of colour charting over the past 60 years up to the modern day. David explains how he created a colour chart in the beginning, when labs were struggling to get colours correct for their non-black-and-white film stock. We see how that has developed over the years, being standardised in SMPTE. Recently, he explains, they have a new test card for digital workflows: the camera shoots a special test card which you also have in a digital format. In your editing suite, if you overlay that file on the video, you can colour correct the video to match. Furthermore, DSC have developed a self-illuminating physical overlay for your monitor, meaning that when you put it in front of your monitor, you can adjust the display's colour to match what you see on the chart in front.

Gloria Lee (78m 8s) works for Graymeta, a company whose products are based on AI and machine learning. She sets the scene explaining how broadly our lives are already supported by AI; in broadcast, she highlights the benefits as automating repetitive tasks, increasing monetisation possibilities, allowing real-time facial recognition and creating additional marketing opportunities. Gloria concludes by giving examples of each.

Cliff Lavalée talks about 'content creation with gaming tools' (91m 10s), explaining the virtual studio they have created at Groupe Média TFO. He explains the camera tracking and lens telemetry (zoom etc.) needed to ensure that three cameras can be moved around in real time, allowing the graphics to follow with the correct perspective shifts. Cliff talks about the pros and cons of the space. With hardware limiting the software capabilities and the need for everything to stick to 60fps, he finds that the benefits, which include cost, design freedom and real-time rendering, create an overall positive. This section finishes with a talk from one of the 3D interactive set designers, who talks us through the work he's done in the studio.

Mary Ellen Carlyle concludes the evening talking about remote production and esports. She sets the scene pointing to a 'shifting landscape', with people moving away from linear TV to online streaming. Mary discusses the streaming market as a whole, talking about Disney+ and the other competitors currently jostling for position. Reprising Gloria's position on AI, Mary next looks further into the future, floating the idea of AI directing football matches, creating highlights packages, generating stats about the game, spotting ad-insertion opportunities and more.

Famously, Netflix has said that Fortnite is one of its main competitors. And indeed, esports is a major industry unto itself, so whether people are watching or playing games, there is plenty of opportunity to displace Netflix. Deloitte Insights claim 40% of gamers watch esports events at least once a week and, in terms of media rights, these are already worth tens and hundreds of millions and are likely to continue to grow. Mary concludes by looking at the sports rights changing hands over the next few years, the thrust being that there are several high-profile rights auctions coming up and fervent competition is likely to increase prices. Some rights are likely to be taken, at least in part, by tech giants; we have already seen Amazon acquire some major sports rights.

Watch now!
Speakers

Ryan Morris
Systems Engineer,
Arista
Gloria Lee
VP, Business Development
Graymeta Inc.
Mary Ellen Carlyle
SVP & General Manager,
Dome Productions
Cliff Lavalée
Director of LUV studio services,
Groupe Média TFO
Peter Armstrong
Video Production & Post Production Manager,
THP
David Corley
President,
DSC Labs

Video: How to build two large Full-IP OB trucks (during COVID-19)

It's never been easy building a large OB van. Keeping within axle weight, getting enough technology in and working within a tight project timeline, not to mention keeping the expanding sections cool and water-tight, is no easy task. Add on social distancing thanks to SARS-CoV-2 and life gets particularly tricky.

This project was intriguing before COVID-19 because it called for two identical SMPTE ST 2110 IP trucks to be built, explains Geert Thoelen from NEP Belgium. Both are 16-camera trucks with three EVS machines each, the idea being that people could walk into truck A on Saturday and do a show, then walk into truck B on Sunday and work on exactly the same show but on a different match. Being identical, when these trucks are delivered to Belgian public broadcaster RTBF, production crews won't need to worry about getting a better or worse truck than the other programmes. The added benefit is that weight is reduced compared to SDI baseband. The trucks come loaded with Sony cameras, Arista switches, Lawo audio, EVS replays and Riedel intercoms. They are ready to take a software upgrade for UHD and offer 32 frame-synced and colour-corrected inputs plus 32 outputs.

Broadcast Solutions have worked with NEP Belgium for many years, a close relationship which became a key asset in this project, which had to be completed under social-distancing rules. Working open book, with existing trust between the parties, was, we hear, important in completing this project on time. Broadcast Solutions set up separate internet access to the truck as it was being built, giving vendors 24/7 remote access.

Axel Kühlem from Broadcast Solutions addresses a question from the audience on the benefits of 2110. He confirms that weight is reduced by about half compared to SDI, comparing like-for-like equipment. Furthermore, he says, power consumption is reduced. The aim of having two identical trucks is to allow them to be occasionally joined for large events, or even connected into RTBF's studio infrastructure for those times when you just don't have enough facilities. Geert points out that IP on its own is still more expensive than baseband, but you are paying for the ability to scale in the future. Once you count the flexibility it affords both the productions and the broadcaster, it may well turn out cheaper over its lifetime.

Watch now!
Speakers

Axel Kühlem
Senior System Architect
Broadcast Solutions
Geert Thoelen
Technical Director,
NEP Belgium