Brightness, luminance, luma, nits and candela: what are the differences between these similar-sounding terms? If you haven't been working closely with displays and video, you may not know, but as HDR grows in adoption, it pays to have at least a passing understanding of the terms in use.
Date: Thursday January 23rd – 11am ET / 16:00 GMT
Last week, The Broadcast Knowledge covered the difference between luma and luminance in this video from YouTube channel Displaced Gamers. It's a wide-ranging video which explains many of the related fundamentals of human vision and analogue video, much of which is relevant to this webinar.
To explain not only what these terms mean, but also how we use them to set up our displays, the IABM have asked Norm Hurst from SRI, often known as Sarnoff, to come in and discuss his work researching test patterns. SRI makes many test patterns which reveal how your display is, or isn't, working and also expose some of the processing the signal has gone through on its journey before it even reached the display. In many cases these test patterns tell their story without electronic meters or analysers, but where brightness is concerned, there is still a place for photometers, colour analysers and other associated meters.
HDR and its associated Wide Colour Gamut (WCG) bring extra complexity in ensuring your monitor is set up correctly, particularly as many monitors can't show some brightness levels and have to do their best to accommodate what the incoming signal requests. Being able to assess and understand, both operationally and academically, how the display is performing and affecting the video is of prime importance. Similarly, colours, as ever, are prone to shifting as they are processed, attenuated and/or clipped.
This free webinar from the IABM is led by CTO Stan Moote.
There are many video fundamentals in today's video looking at how we see light and how we can represent it in a video signal. Following on from last week's look at analogue 525-line video, we take a deeper dive into light and colour.
The video starts by examining how white light can be split into colours, primaries, and how these can be re-combined in different amounts to create different colours. It then moves on to examining how the proportions of colours which create 'white' light aren't as even as you might imagine. This allows us to understand how to create brighter and dimmer light, a quantity called luminance. We're introduced to the CIE 2D and 3D colour graphs, helping us to understand colour space and colour volume.
Modern video, even if analogue, is acquired with red, green and blue as separate signals. This means that if we want a grey-scale video signal, i.e. luminance only, we need to combine them using the proportions discussed earlier. This weighted version of luminance is what's called 'luma', explains the video from the Displaced Gamers YouTube channel.
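As a minimal sketch of that weighting, the snippet below uses the Rec. 601 coefficients, the standard set for SD video; the exact figures are an assumption here, as the video itself may quote slightly different rounding.

```python
# Rec. 601 luma weights (the SD-video standard; assumed, not quoted
# from the video itself). Note how uneven the contributions are.
KR, KG, KB = 0.299, 0.587, 0.114

def luma(r, g, b):
    """Weighted sum of R', G', B' components, each in the range 0.0-1.0."""
    return KR * r + KG * g + KB * b

print(luma(1.0, 1.0, 1.0))  # white: the weights sum to ~1.0
print(luma(0.0, 1.0, 0.0))  # pure green carries most of the luma
print(luma(0.0, 0.0, 1.0))  # pure blue carries very little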
On top of human perception, much of the 20th century was dominated by CRT (Cathode Ray Tube) TVs, which don't respond linearly to electrical voltage, meaning that if you double the voltage, the brightness doesn't necessarily double. To compensate for that, 'gamma correction' is applied at acquisition so that playback on a CRT produces a linear response.
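A minimal sketch of that compensation, assuming a pure power law with gamma 2.2 (real standards such as Rec. 709 use a piecewise curve near black):

```python
# Simplified gamma model: the camera pre-distorts the signal so the
# CRT's natural power-law response cancels it out.
GAMMA = 2.2

def encode(linear):
    """Camera side: compress linear light into the transmitted signal."""
    return linear ** (1 / GAMMA)

def decode(signal):
    """CRT side: the tube's power-law brightness response."""
    return signal ** GAMMA

half = encode(0.5)             # mid-grey light becomes a ~0.73 signal
print(round(decode(half), 3))  # the display's gamma restores 0.5
```

The round trip is the point: encode and decode are inverses, so the viewer sees linear light even though the signal on the wire is non-linear.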
Pleasantly, an oscilloscope is wheeled out next to look at a real analogue video waveform, demonstrating the shape not only of the sync pulses but of the luminance waveform itself, and how the on-screen rendition would appear on a TV. The video then finishes with a brief look at how colour is added in NTSC, PAL and SECAM signals. A prelude, perhaps, to a future video.
With an enjoyable retro feel, this accessible video on understanding how analogue video works is useful for those who have to work with SDI rasters, interlaced video, black and burst, subtitles and more. It'll remind those of us who once knew of a few things since forgotten, and is an enjoyable primer on the topic for anyone coming in fresh.
Displaced Gamers is a YouTube channel whose focus on video games adds an enjoyable angle to this video, which starts by explaining why analogue 525-line video is the same as 480i. Using a slow-motion video of a CRT (Cathode Ray Tube) TV, it explains the interlacing technique and why consoles and computers would often use 240p.
We then move on to timing, looking at the time spent drawing a line of video, 52.7 microseconds, and the need for horizontal and vertical blanking. Blanking periods, the video explains, are there to cover the time the CRT TV would spend moving the electron beam from one side of the screen to the other. As this was achieved by electromagnets, the beam would need to be turned off, blanked, while these were changing their magnetic level and hence the position of the beam.
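The 52.7-microsecond figure falls out of some simple arithmetic. The sketch below assumes the standard NTSC numbers (525 lines, a 30000/1001 Hz frame rate and a nominal 10.9 µs horizontal blanking interval), which are not taken from the video itself:

```python
# Back-of-envelope NTSC line timing.
LINES_PER_FRAME = 525
FRAME_RATE = 30000 / 1001            # ~29.97 Hz

line_rate = LINES_PER_FRAME * FRAME_RATE  # ~15,734 lines per second
line_time_us = 1e6 / line_rate            # ~63.6 us total per line

h_blank_us = 10.9                         # nominal horizontal blanking
active_us = line_time_us - h_blank_us     # ~52.7 us of visible picture

print(round(line_time_us, 1), round(active_us, 1))
```

Subtracting the blanking from the total line period leaves the roughly 52.7 µs of active picture the video describes.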
The importance of these housekeeping manoeuvres for older computers was that they provided time to perform calculations, free from the task of writing data into the video buffer. But blanking was not just useful for computers: broadcasters could use some of it to insert data, and they still do. In this video we see a VHS tape played back with the blanking clearly visible and the data lines flashing away.
For those who work with this technology still, for those who like history, for those who are intellectually curious and for those who like reminiscing, this is an enjoyable video and ideal for sharing with colleagues.