Video: Extension to 4K resolution of a Parametric Model for Perceptual Video Quality

Measuring video quality automatically is invaluable and, for many uses, essential. But as video evolves with higher frame rates, HDR, a wider colour gamut (WCG) and higher resolutions, we need to make sure the automatic evaluations evolve too. Called ‘objective metrics’, these computer-based assessments go by names such as PSNR, DMOS and VMAF. One use for these metrics is to automatically analyse an encoded video to determine whether it looks good enough or should be re-encoded, allowing the bitrate to be optimised for quality. Rafael Sotelo, from the Universidad de Montevideo, explains how his university helped work on an update to Predicted MOS to do just this.

MOS is the Mean Opinion Score, a result derived from a group of people watching some content in a controlled environment. They each rate how they feel about the content and the data, when combined, gives an indication of the quality of the video. The trick is to enable a computer to predict what people will say, and Rafael explains how this is done, looking at some of the maths behind the predicted score.
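
As a minimal sketch of the arithmetic involved, assuming a 1–5 rating scale and a simple logistic mapping from an objective metric onto the MOS scale, the Python below illustrates the idea. The function names and the parameters a and b are illustrative placeholders, not the model described in the talk; in practice the mapping would be fitted against real subjective scores.

import numpy as np

def mean_opinion_score(ratings):
    # MOS is simply the arithmetic mean of the panel's ratings for one clip.
    return float(np.mean(ratings))

def predicted_mos(metric, a=0.08, b=-4.0):
    # Map an objective metric (here assumed to be a 0-100 quality score) onto
    # the 1-5 MOS scale with a logistic curve; a and b would be fitted
    # against subjective scores gathered under BT.500-style conditions.
    return 1.0 + 4.0 / (1.0 + np.exp(-(a * metric + b)))

panel = [4, 5, 3, 4, 4, 5, 3]            # hypothetical ratings for one clip
print(mean_opinion_score(panel))          # -> 4.0
print(round(predicted_mos(75.0), 2))      # a metric of 75 maps to roughly 4.5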

In order to test any ‘upgrades’ to the objective metric, you need to test it against people’s actual scores. So Rafael explains how he set up his viewing environments in both Uruguay and Italy to be compliant with BT.500. BT.500 is a standard which explains how a room should be set up so that viewing conditions maximise the viewers’ ability to appreciate the pros and cons of the content. For instance, it explains how dim the room should be, how reflective the screens should be and how they should be calibrated. The guidelines don’t apply to HDR, 4K etc., so the team devised an extension to the standard in order to carry out the testing. This is called ‘subjective testing’.

With all of this work done, Rafael shows us the benefits of using this extended metric and the results achieved.

Watch now!
Speakers

Rafael Sotelo
Director, ICT Department
Universidad de Montevideo

Webinar: HDR Dynamic Mapping

HDR broadcast is on the rise, as we saw from the increased number of ways to watch this week’s Super Bowl in HDR, but SDR will be with us for a long time. Not only will services have to move seamlessly between SDR and HDR, but there is also a technique that allows HDR itself to be dynamically adjusted to better match the display it’s on.

Introduced in July 2019, Dynamic Mapping (DM) allows content to be more accurately represented on any specific display, particularly lower-end TVs. It applies to PQ-10, the 10-bit version of Dolby’s Perceptual Quantizer HDR format standardised as SMPTE ST 2084. Because HLG (ARIB STD-B67) works differently, it doesn’t need dynamic mapping. The dynamic metadata to support this function is defined in SMPTE ST 2094-10 and -40, and also as part of ETSI TS 103 433-2.
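
For readers unfamiliar with PQ, the short sketch below shows the SMPTE ST 2084 EOTF, which maps a normalised PQ code value to absolute luminance in nits. It is included only to illustrate why PQ signals describe display light directly, which is what dynamic mapping then adapts to a less capable panel; the constants are those published in ST 2084, while the example values are our own.

# SMPTE ST 2084 (PQ) EOTF constants
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(signal):
    # Map a non-linear PQ signal in [0, 1] to luminance in nits (0-10000).
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(0.0))   # -> 0 nits
print(pq_eotf(1.0))   # -> 10000 nits, the top of the PQ range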

Stitching all of this together and helping us navigate delivering the best HDR is Dolby’s Jason Power and Virginie Drugeon from Panasonic in this webinar organised by DVB.

Register now!
Speakers

Virginie Drugeon
Senior Engineer for Digital TV Standardisation, Panasonic
Chair, DVB TM-AVC Group
Jason Power
Senior Director, Commercial Partnerships and Standards, Dolby Laboratories
Chair, DVB CM-AVC Group

Webinar: How many Nits is Color Bars?

IABM NITS Webinar

Brightness, luminance, luma, nits and candela. What are the differences between these similar terms? If you’ve not been working closely with displays and video, you may not know, but as HDR grows in adoption it pays to have at least a passing understanding of the terms in use.

Date: Thursday January 23rd – 11am ET / 16:00 GMT

Last week, The Broadcast Knowledge covered the difference between luma and luminance in this video from the YouTube channel DisplacedGamers. It’s a wide-ranging video which explains many of the related fundamentals of human vision and analogue video, much of which is relevant to this webinar.
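
As a quick illustration of that distinction, the sketch below applies the BT.709 weights to linear-light values (giving relative luminance) and to gamma-encoded values (giving luma). The 2.4 exponent is a simplification chosen only for this example, not a full BT.1886 display model.

def weighted_sum(r, g, b):
    # BT.709 weights, which sum to 1.0
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def luminance(r_lin, g_lin, b_lin):
    # Relative luminance is the weighted sum of linear-light components.
    return weighted_sum(r_lin, g_lin, b_lin)

def luma(r_prime, g_prime, b_prime):
    # Luma (Y') is the same weighted sum taken on gamma-encoded R'G'B'.
    return weighted_sum(r_prime, g_prime, b_prime)

encoded = 0.5                 # a mid-grey code value
linear = encoded ** 2.4       # simplified display gamma for illustration
print(luma(encoded, encoded, encoded))               # about 0.5
print(round(luminance(linear, linear, linear), 3))   # about 0.189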

To explain the detail of not only what these terms mean, but also how we use them to set up our displays, the IABM have asked Norm Hurst from SRI, often known as Sarnoff, to come in and discuss his work researching test patterns. SRI make many test patterns which show up how your display is, or isn’t, working and also expose some of the processing the signal has gone through on its journey before it even reached the display. In many cases these test patterns tell their story without electronic meters or analysers, but where brightness is concerned there can still be a place for photometers, colour analysers and other associated meters.

HDR and its associated Wide Colour Gamut (WCG) bring extra complexity in ensuring your monitor is set up correctly, particularly as many monitors can’t show some brightness levels and have to do their best to accommodate these requests from the incoming signal. Being able to assess and understand, both operationally and academically, how the display is performing and affecting the video is of prime importance. Similarly, colours, as ever, are prone to shifting as they are processed, attenuated and/or clipped.
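
As a rough, purely illustrative sketch of that accommodation, and not any standard’s tone-mapping algorithm, the snippet below contrasts hard clipping with a simple roll-off when a signal mastered up to 1000 nits meets a panel that peaks at 600 nits. All of the numbers here are assumptions chosen for the example.

def hard_clip(nits, panel_peak=600.0):
    # Simply clamp anything above the panel's peak luminance.
    return min(nits, panel_peak)

def soft_rolloff(nits, panel_peak=600.0, knee=450.0, signal_peak=1000.0):
    # Leave values below the knee untouched, then compress the remainder of
    # the signal range into the headroom between the knee and the panel peak.
    if nits <= knee:
        return nits
    headroom = panel_peak - knee
    excess = (nits - knee) / (signal_peak - knee)   # 0..1 over the top end
    return knee + headroom * min(excess, 1.0)

for target in (300, 500, 800, 1000):
    print(target, hard_clip(target), round(soft_rolloff(target), 1))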

This free webinar from the IABM is led by CTO Stan Moote.

Register now!
Speaker

Norm Hurst
Senior Principal Research Engineer,
SRI International SARNOFF
Stan Moote
CTO,
IABM

Video: 2019 What did I miss? HDR Formats and Trends

The second most popular video of 2019 looked at HDR. A long-promised format which routinely wows spectators at conferences and in shops alike is increasingly seen, albeit tentatively, in the wild. For instance, this Christmas UK viewers were able to watch Premiership football in HDR with Amazon Prime, but only a third of the matches benefitted from the format. Whilst there are many reasons for this, most of them commercial and practical rather than technical, this is an important part of the story.

Brian Alvarez from Amazon Prime Video goes into detail on the background and practicalities of HDR in this talk given at the Video Tech Seattle meetup in August, part of the worldwide movement of streaming video engineers who meet to openly swap ideas and experiences in making streaming work. We are left not only understanding HDR better, but also with a great insight into the state of the consumer market: who can watch HDR and in what format, as well as who’s transmitting HDR.

Read more about the video or just hit play below!

If you want to start from the beginning on HDR, check out the other videos on the topic. HDR relies on an understanding of how people see, the way we describe colour and light, how we implement it and how the workflows are modified to suit. Fortunately, you’re already at the one place that brings all this together! Explore, learn and enjoy.

Speaker

Brian Alvarez
Principal Product Manager,
Amazon Prime Video