Webinar: Crafting quality: Skills for successful UHD and HDR production

Webinar date: Thursday May 30th 2019
Time: 16:00 BST / 11 am EST / 8 am PDT

Experienced advice is on hand in this webinar for those producing in HDR and UHD. Productions are always trying to raise the quality of acquisition in order to deliver better quality to viewers, to enhance creative possibilities and to maximise financial gain by future-proofing their archives. But this push always brings challenges, and the move to UHD and HDR is no different.

HDR and UHD are not synonymous, but they often do go hand-in-hand. This is partly because the move to UHD is a move to improve quality, but time and again we hear that increasing resolution in and of itself is not always an improvement. Rather, the ‘better pixels’ mantra seeks to improve the video through a combination of resolution, frame rate, HDR and Wide Colour Gamut (WCG). So, where possible, HDR and WCG are often combined with UHD.

In this webinar, we hear how director/producer Pamela Ann Berry and The Farm Group met the challenges on the way to success. Register to hear them share their tips and tricks for better UHD and HDR production.

Register now!

Speakers

Pamela Ann Berry
Director/producer
Aidan Farrell
Senior Colourist,
The Farm Group, UK
Pete Collins
Head of Scripted Pipeline,
The Farm Group, UK

Video: The ST 2094 Standards Suite For Dynamic Metadata

Lars Borg explains to us what problems the SMPTE ST 2094 standard sets out to solve. Looking at the different types of HDR and Wide Colour Gamut (WCG), we quickly see how many permutations there are and how many ways there are to get it wrong.

ST 2094 carries the metadata needed to manage the colour, dynamic range and related data. In order to understand what’s needed, Lars takes us through the details of the HDR implementations, touching on workflows and explaining how your display’s capabilities affect the video.
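To make the idea concrete, here is a toy sketch of the kind of per-scene statistics that dynamic metadata conveys, in contrast to a single static value for a whole programme. To be clear, this is not the ST 2094 bitstream format: the function names, structure and values are ours purely for illustration.

```python
# Toy illustration of dynamic metadata: statistics such as peak and
# average luminance are computed per scene (or per frame) rather than
# once per programme, so a display can adapt its tone mapping shot by
# shot. NOT the ST 2094 format; names and structure are illustrative.
import numpy as np

def scene_stats(frames):
    """frames: list of HxWx3 arrays of linear-light RGB in cd/m^2 (nits)."""
    # Per-pixel luminance using the BT.2020 luma coefficients.
    lum = [f @ np.array([0.2627, 0.6780, 0.0593]) for f in frames]
    return {
        "peak_nits": max(float(l.max()) for l in lum),
        "avg_nits": float(np.mean([l.mean() for l in lum])),
    }

# One metadata record per scene, not one for the whole programme:
scenes = [[np.random.uniform(0, 1000, (4, 4, 3)) for _ in range(3)]
          for _ in range(2)]
dynamic_metadata = [scene_stats(s) for s in scenes]
print(dynamic_metadata)
```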

We then look at midtones and dynamic metadata before a Q&A.

This talk is as valuable for understanding the whole HDR and WCG ecosystem as it is for understanding ST 2094 itself.

Watch now!

Speaker

Lars Borg
Principal Scientist,
Adobe

Video: Content Production Technology on Hybrid Log-Gamma

‘Better Pixels’ is the continuing refrain from the large number of people who are dissatisfied with simply increasing resolution to 4K or even 8K. Why can’t we have a higher frame rate instead? Why not give us a wider colour gamut (WCG)? And why not give us a higher dynamic range (HDR)? Often, they would prefer any of these three options over higher resolution.

Watch the video to find out more.

Dynamic range is the term used to describe how much of a difference there is between the smallest possible signal and the strongest possible signal. In audio, it’s the quietest thing that can be heard versus the loudest thing that can be heard (without distortion). In video, it’s the difference between black and white – after all, can your TV fully simulate the brightness and power of our sun? No. What about your car’s headlights? Probably not. Can your TV go as bright as your phone’s flashlight? Well, now that’s realistic.
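To put numbers on that, here is a minimal Python sketch of how dynamic range is usually quantified: in decibels for audio and in stops (doublings of light) for video. The display figures used are illustrative assumptions, not measurements.

```python
# Dynamic range quantified: the ratio of the strongest to the weakest
# signal, in dB for audio and in stops (doublings of light) for video.
import math

def audio_dr_db(loudest, quietest):
    return 20 * math.log10(loudest / quietest)

def video_dr_stops(peak_nits, black_nits):
    return math.log2(peak_nits / black_nits)

# An assumed 1000-nit HDR TV with a 0.01-nit black level...
print(f"HDR: {video_dr_stops(1000, 0.01):.1f} stops")  # ~16.6 stops
# ...versus an assumed 100-nit SDR display with a 0.1-nit black level.
print(f"SDR: {video_dr_stops(100, 0.1):.1f} stops")    # ~10.0 stops
```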

So let’s say your TV can go from a very dark black to being as bright as a medium-power flashlight – what about the video that you send your TV? When there’s a white frame, do you want your TV blasting as bright as it can? HDR allows producers to control the brightness of your display device so that something that is genuinely very bright, like a star, a bright light or an explosion, can be represented very brightly, whereas something which is simply white can have the right colour but only a medium brightness. With video which is Standard Dynamic Range (SDR), there isn’t this level of control.
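One concrete mechanism for this control (used by the PQ system mentioned below) is the PQ transfer function, SMPTE ST 2084, which maps each code value to an absolute luminance in nits – so ‘white’ can be sent at a sensible 200 nits while a highlight is sent at 2000. Here is a minimal sketch of the PQ EOTF using the constants from the standard.

```python
# Minimal PQ (SMPTE ST 2084) EOTF: code value in [0, 1] -> nits.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(e):
    """Map a non-linear PQ signal e in [0, 1] to absolute luminance in nits."""
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

for code in (0.25, 0.5, 0.75, 1.0):
    print(f"PQ {code:.2f} -> {pq_eotf(code):7.1f} nits")  # 0.5 is ~92 nits
```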

HDR is extremely useful for films, but for sports too – who hasn’t seen a football game where the sun leaves half the pitch in shadow and half in bright sunlight? With SDR, there’s no choice but to have one half either very dark or very bright (mostly white), so you can’t actually see the game there. HDR enables the production crew to let HDR TVs show detail in both areas of the pitch.

HLG, which stands for Hybrid Log-Gamma, is a way of delivering HDR video. It has famously been pioneered by Japan’s NHK with the UK’s BBC and has been standardised as ARIB STD-B67. In this talk, NHK’s Yuji Nagata helps us navigate working with multiple formats: converting HLG HDR to SDR, plus converting from HLG to Dolby’s HDR format called PQ.
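For reference, the HLG curve itself is compact enough to sketch. Below is the HLG OETF as defined in ARIB STD-B67, mapping normalised scene light to a signal; its lower half is close to a conventional gamma curve, which is what lets HLG degrade gracefully on SDR displays.

```python
# HLG OETF (ARIB STD-B67): normalised scene light e in [0, 1] -> signal.
import math

a = 0.17883277
b = 1 - 4 * a                  # 0.28466892
c = 0.5 - a * math.log(4 * a)  # 0.55991073

def hlg_oetf(e):
    if e <= 1 / 12:
        return math.sqrt(3 * e)              # gamma-like lower segment
    return a * math.log(12 * e - b) + c      # logarithmic highlight segment

print(hlg_oetf(1 / 12))  # 0.5 - the crossover between the two segments
print(hlg_oetf(1.0))     # ~1.0 - nominal peak
```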

The reality of broadcasting is that anyone who is producing a programme in HDR will have to create an SDR version at some point. The question is how to do that and when. For live, some broadcasters may need to fully automate this. In this talk, we look at a semi-automated way of doing this.

HDR is usually delivered in a Wide Colour Gamut signal such as the ITU’s BT.2020. Converting between this colour space and the more common BT.709 colour space, which is part of the HD video standards, is also needed on top of the dynamic range conversion. So listen to Yuji Nagata’s talk to find out NHK’s approach to this.
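The colour-space leg of that conversion is, at its core, a 3x3 matrix applied to linear-light RGB; the matrix below is the commonly derived BT.2020-to-BT.709 conversion (ITU-R BT.2407 covers the topic in depth). This is not NHK’s method from the talk, just a sketch of the step – note the crude clip, since real converters need proper gamut mapping for out-of-gamut colours.

```python
# Linear-light BT.2020 RGB -> linear-light BT.709 RGB via a 3x3 matrix.
import numpy as np

M_2020_TO_709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

def bt2020_to_bt709(rgb_linear):
    """rgb_linear: ...x3 array of linear-light BT.2020 RGB in [0, 1]."""
    out = rgb_linear @ M_2020_TO_709.T
    # Saturated BT.2020 colours fall outside BT.709 and go negative;
    # clipping is the crudest possible gamut mapping.
    return np.clip(out, 0.0, 1.0)

print(bt2020_to_bt709(np.array([0.0, 1.0, 0.0])))  # saturated green, clipped
```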

NHK has pushed very hard for many years to make 8K broadcasts feasible and has in recent times focussed on tooling up in time for the 2020 Olympics. This talk was given at the SMPTE 2017 technical conference, but it is all the more relevant now as NHK ups the number of 8K broadcasts approaching the opening ceremony. This work on HDR and WCG is part of making sure that the 8K format really delivers an impressive and immersive experience for those lucky enough to experience it. It goes hand in hand with NHK’s tireless work on audio, which can deliver 22.2 multichannel surround sound.

Watch now!

Speaker

Yuji Nagata
Broadcast Engineer,
NHK

On-Demand Webinar: Human Perception Fundamentals – Colour, Contrast & Motion

Originally broadcast: Thursday February 7th, 10am PST / 1pm EST / 18:00 GMT
Now available on-demand!

There is so much talk about HDR, wide colour gamut (WCG) and ‘Better Pixels’, and all the TVs seem to interpolate motion up to 100Hz or above, so it’s good to stop and check we know why all of this matters – and, crucially, when it doesn’t.

SMPTE’s new ‘Essential Technology Concepts Webcasts’ are here to help, and in the first webcast, David Long will look at the fundamentals of colour, contrast and motion in terms of what we actually see.

This promises to be a great talk and, the chances are, even people who ‘know it already’ will be reminded of a thing or two!

Watch now!

Speaker

David Long
Director
RIT Center for Media, Arts, Games, Interaction & Creativity
& MAGIC Spell Studios

Video: IP For Media Webcast Part II

Following on from last week’s post, part II is here. Wes Simpson looks at the use of IP in Remote Production/Remote Integration (REMI) and finishes with a panel discussion including NewTek and Grass Valley, a Belden brand.

This video talks about:

  • Why broadcasters need networking
  • Typical Live remote sports broadcast roles
  • Overview of video & audio Signal types
  • HDR & Wide Colour Gamut (WCG)
  • Data (metadata, scripts etc)
  • REMI – Remote Integration, AKA ‘Remote Production’ in Europe.
  • Overview of what tasks can be done at base, what still needs to be done ‘on-site’
  • Uncompressed formats summary (SDI, 2022-6, 2110)
  • Slice-based compression
  • Mezzanine compression
  • TR-01 for carrying JPEG 2000 & audio
  • Bonded Cellular
  • Packet loss & FEC (Forward Error Correction) – see the sketch after this list
  • 2022-7 – route diversity
  • Typical delays
  • Plus a panel discussion
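
As a taster for the FEC item above, here is a minimal sketch of the XOR-parity idea behind schemes such as SMPTE ST 2022-5: one parity packet per group lets the receiver rebuild any single lost packet. Real schemes add 2D (row/column) parity and interleaving; this is only the 1D core, with made-up payloads.

```python
# 1D XOR parity: one parity packet recovers any single loss in a group.
from functools import reduce

def xor_packets(packets):
    """XOR equal-length payloads together into a single packet."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), packets)

group = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
parity = xor_packets(group)                # sent alongside the media packets

lost = group.pop(2)                        # packet 2 is lost in transit...
recovered = xor_packets(group + [parity])  # ...and rebuilt at the receiver
assert recovered == lost
```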

Watch now!

Speakers

Wes Simpson
President,
Telecom Product Consulting
Tom Butts
Content Director,
TV Technology

Video: An overview of 10-bit video — UHD, HDR and Coding Efficiency

In the past few years, the industry has been trying to improve the end-user experience with higher spatial (pixels), temporal (frame rate) and spectral (bit depth) resolution.

This talk from Vimeo’s Vittorio Giovara and Ronald Bultje from Two Orioles explores the high-bit-depth element of this improved user experience. Technically, this is usually referred to as 10-bit video since, historically, the video user experience has been largely based on an 8-bit world. They explain marketing terms like HDR and UHDTV, explore high-bit-depth support in commonly used video coding software, and showcase how these work together to improve your video coding efficiency and end-user experience.
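To see why those extra two bits matter, here is a small illustration, with made-up values, of quantising a shallow gradient: at 8 bits only a handful of levels are available across it (visible banding), while 10 bits gives four times as many steps.

```python
# Quantising a subtle gradient: 8 bits leaves coarse, visible bands
# where 10 bits still has plenty of distinct levels.
import numpy as np

ramp = np.linspace(0.40, 0.45, 1920)  # a shallow ramp across one video line

levels_8 = np.unique(np.round(ramp * 255)).size
levels_10 = np.unique(np.round(ramp * 1023)).size

print(f"8-bit:  {levels_8} distinct levels")    # ~14 coarse bands
print(f"10-bit: {levels_10} distinct levels")   # ~52 finer steps
```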

Introduction: Josie Keller (JWPlayer)
Presenters: Vittorio Giovara (Vimeo), Ronald Bultje (Two Orioles)

Meeting: Are Existing Broadcast Formats Suitable for HDR WCG Content?

Date: Thursday November 30, 2017 – ample refreshments from 18:15 GMT for a 19:00 start.
Location: Ericsson Television, Strategic Park, Comines Way, Hedge End, Southampton, SO30 4DA. Google Maps

With higher resolution, wider colour gamut and extended dynamic range, the new Ultra High Definition TV (UHD) standards define a container which allows content creators to offer the consumer a much more immersive visual experience. However, there are some artefacts noted within the container, particularly around HDR material. Olie Bauman outlines why YCbCr is used, the human visual system’s response to changes in chroma/luminance, and the correlation between R, G and B.

As HDR and WCG expand the colour volume, he will show how it has increased from SD (BT.601) to HD (BT.709) to UHD (BT.2020), and show the difference between PQ (display-referred) and HLG (scene-referred) workflows.

From this background, he will show examples of artefacts due to chroma down-sampling and how their characteristics differ depending on the workflow.
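To show where such artefacts come from, here is a small sketch – our own illustration, not material from the talk – of 4:2:0-style chroma down-sampling: averaging the chroma plane in 2x2 blocks and scaling it back up smears hard chroma edges, and with HDR/WCG content those edges can carry much bigger differences.

```python
# 4:2:0-style chroma handling: average 2x2 blocks, then upsample by
# repetition. A hard chroma edge picks up an error of half its height.
import numpy as np

def subsample_and_restore(chroma):
    h, w = chroma.shape
    small = chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)

# A hard vertical chroma edge, deliberately placed off the 2x2 grid:
cb = np.zeros((4, 8))
cb[:, 3:] = 1.0
print(np.abs(subsample_and_restore(cb) - cb).max())  # 0.5 error at the edge
```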

He highlights that the problems will become greater as more content exploiting the full UHD container becomes available, requiring additional care and processing in content production and delivery.

Register now!