Video: The ST 2094 Standards Suite For Dynamic Metadata

Lars Borg explains to us what problems the SMPTE ST 2094 standard sets out to solve. Looking at the different types of HDR and Wide Colour Gamut (WCG) we quickly see how many permutations there are and how many ways there are to get it wrong.

ST 2094 carries the metadata needed to manage the colour, dynamic range and related data. In order to understand what’s needed, Lars takes us through the details of the HDR implementations, touching on workflows and explaining how the capabilities of your display affect the video.

We then look at midtones and dynamic metadata before a Q&A.

This talk is as valuable for understanding the whole HDR and WCG ecosystem as it is for ST 2094 itself.

Watch now!

Speaker

Lars Borg
Principal Scientist,
Adobe

Video: Content Production Technology on Hybrid Log-Gamma


‘Better Pixels’ is the continuing refrain from the large number of people who are dissatisfied with simply increasing resolution to 4K or even 8K. Why can’t we have a higher frame-rate instead? Why not give us a wider colour gamut (WCG)? And why not give us a higher dynamic range (HDR)? Often, they would prefer any of these 3 options over higher resolution.

Watch the video to find out more.

Dynamic Range is the term used to describe the difference between the smallest possible signal and the strongest possible signal. In audio, it’s the quietest thing that can be heard versus the loudest thing that can be heard (without distortion). In video, it’s the difference between black and white – after all, can your TV fully simulate the brightness and power of our sun? No. What about your car’s headlights? Probably not. Can your TV go as bright as your phone’s flashlight? Well, now that’s realistic.
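As a rough back-of-the-envelope illustration of the idea (the luminance figures below are illustrative examples, not from the talk), dynamic range is often expressed in photographic “stops”, each stop being a doubling of light:

```python
import math

def dynamic_range_stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in photographic stops (each stop doubles the light)."""
    return math.log2(peak_nits / black_nits)

# Illustrative figures only: a typical SDR display vs. a brighter HDR display.
sdr = dynamic_range_stops(peak_nits=100, black_nits=0.1)     # ~10 stops
hdr = dynamic_range_stops(peak_nits=1000, black_nits=0.005)  # ~17.6 stops
```

The jump of several extra stops is what lets an HDR display keep detail in both the dark and bright parts of a scene at the same time.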

So let’s say your TV can go from a very dark black to being as bright as a medium-power flashlight. What about the video that you send to your TV? When there’s a white frame, do you want your TV blasting as bright as it can? HDR allows producers to control the brightness of your display device so that something that is genuinely very bright, like a star, a bright light or an explosion, can be represented very brightly, whereas something which is simply white can have the right colour but be medium brightness. With Standard Dynamic Range (SDR) video, there isn’t this level of control.

For films, HDR is extremely useful, but for sports too – who hasn’t seen a football game where the sun leaves half the pitch in shadow and half in bright sunlight? With SDR, there’s no choice but to have one half either very dark or very bright (mostly white), so you can’t actually see the game there. HDR enables the production crew to let HDR TVs show detail in both areas of the pitch.

HLG, which stands for Hybrid Log-Gamma, is the name of a way of delivering HDR video. It was pioneered, famously, by Japan’s NHK with the UK’s BBC and has been standardised as ARIB STD-B67. In this talk, NHK’s Yuji Nagata helps us navigate working with multiple formats: converting HDR HLG to SDR, plus converting from HLG to the PQ HDR format developed by Dolby.
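The hybrid nature of HLG shows up directly in its opto-electrical transfer function (OETF): a square-root (gamma-like) segment for dark and mid tones joined to a logarithmic segment for highlights. A minimal sketch, using the constants published in ARIB STD-B67 / ITU-R BT.2100:

```python
import math

# HLG OETF constants from ARIB STD-B67 / ITU-R BT.2100.
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalised scene-linear light E in [0, 1] to an HLG signal value."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)              # square-root segment (SDR-like)
    return A * math.log(12.0 * e - B) + C      # logarithmic segment (highlights)

# The two segments meet at E = 1/12, where the signal value is 0.5.
```

The gamma-like lower segment is what gives HLG its degree of backwards compatibility with SDR displays, while the log segment compresses the extra highlight range.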

The reality of broadcasting is that anyone who is producing a programme in HDR will have to create an SDR version at some point. The question is how to do that and when. For live, some broadcasters may need to fully automate this. In this talk, we look at a semi-automated way of doing this.

HDR is usually delivered in a Wide Colour Gamut signal such as the ITU’s BT.2020. Converting between this colour space and the more common BT.709 colour space, which is part of the HD video standards, is also needed on top of the dynamic range conversion. So listen to Yuji Nagata’s talk to find out NHK’s approach to this.
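At its simplest, the gamut part of that conversion is a 3×3 matrix applied to linear-light RGB. The sketch below uses the commonly published 4-decimal BT.2020-to-BT.709 coefficients and a hard clip; a real broadcast pipeline (like the one discussed in the talk) also handles transfer functions and proper gamut mapping rather than simply clipping:

```python
# Commonly published linear-light BT.2020 RGB -> BT.709 RGB matrix.
M = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def bt2020_to_bt709(rgb):
    """Convert one linear-light RGB triple, hard-clipping out-of-gamut values."""
    out = []
    for row in M:
        v = sum(c * x for c, x in zip(row, rgb))
        out.append(min(1.0, max(0.0, v)))  # clip: colours outside 709 are lost
    return out
```

Note that each matrix row sums to (approximately) 1, so white maps to white; the clipping step is exactly where naive conversion destroys the extra colours that BT.2020 can carry.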

NHK has pushed very hard for many years to make 8K broadcasts feasible and has in recent times focussed on tooling up in time for the 2020 Olympics. This talk was given at the SMPTE 2017 technical conference, but is all the more relevant now as NHK ups the number of 8K broadcasts approaching the opening ceremony. This work on HDR and WCG is part of making sure that the 8K format really delivers an impressive and immersive experience for those who are lucky enough to experience it. It goes hand in hand with NHK’s tireless work on audio, which can deliver 22.2 multichannel surround.

Watch now!

Speaker

Yuji Nagata
Broadcast Engineer,
NHK

Video: High Dynamic Range in AV1

Google and the University of Warwick explain AV1, its current status and how it can support HDR with WMG’s trueDR, at the IABM’s Future Trends Theatre at IBC 2018.

Matt Frost from Chrome Media takes the stage first, giving an overview of AV1’s goals, objective quality tests and where we are in AV1’s timeline, as well as answering many questions from the audience.

Next, Alan Chalmers from the University of Warwick explains how they added HLG and HDR10 (PQ) support to AV1. Also added are new, scene-referred HDR methods, whose workings and rationale Alan explains.

Watch now!
Speakers

Alan Chalmers
Professor
WMG & trueDR, University of Warwick
Matt Frost
Director, Product Management,
Google Chrome Media

Video: x265 – An Update

From VideoLAN’s Video Dev Days event 2018, this talk discusses the latest updates to x265, a free software library and application for encoding video streams into the H.265/MPEG-H HEVC compression format, released under GNU GPL.

Pradeep Ramachandran, Principal Engineer at MulticoreWare, takes us through:

  • The highlights of the last year
  • HDR Encoding
  • AVX-512 optimisation
  • ABR Streaming Improvements
  • Chunked Encoding Support
  • Improving the footprint of x265

Watch Now!

Video: HDR & Wide Color Gamut

A recorded webinar from Axon CTO Peter Schut about HDR and Wide Colour Gamut (WCG), covering:

  • The basics
  • What’s the difference between HDR and WCG?
  • CIE colour charts
  • Puzzling things
  • Surprising things
  • Standards
  • Curves and Look Up Tables (LUTs)

Watch now!

Video: Lights, Colors, Artwork


Mark Watson from Netflix examines how to combine SDR and HDR video from many sources into one seamless experience. How do you manage the different colour spaces, the difference in dynamic range, the types of HDR? Mark talks about how the different delivery formats differ and presents ways in which they can be unified, representing the work that Netflix is putting in to create a rich, seamless and dynamic auto-playing user experience.

Watch now!

Webinar: HDR – Bright prospects ahead, but where are we now?


Time: 15:00 BST, Wednesday June 13th 2018

A review of current technology and real-world deployments

  • The WOW factor: Why HDR?
  • HDR standards: HLG, PQ or HDR10 variants?
  • Content availability: HD or UHD?
  • Consumer displays: Mobile phones or 4K/8K TV?
  • HDR distribution: Broadcast, OTT or 4G/5G?
  • Real world deployments

The competition for viewers’ eyeballs and their disposable income has never been fiercer. Great picture quality is one weapon that service providers – especially broadcasters – can deploy to attract and retain viewers.

It’s true that millions of 4K-ready TVs have been sold, but in practice most TVs sold before 2017 don’t have any support for HDR at all. Many different variants of HDR have also emerged in an attempt to offer higher quality coupled with some backwards compatibility with those early TVs, but broadcasters have been perhaps understandably reluctant to commit to producing 4K or HDR content with the costs of the ill-fated 3DTV still on their books.

This webinar looks at HDR in general and the different variants that have emerged. The drive for 4K, or even 8K, content and displays is contrasted with consumers’ willingness to watch full HD with HDR on the latest mobile phone displays…
Register Now!

Speakers:

David Smith
Technology Manager
Rohde & Schwarz

Andy Quested
Technology Strategy & Architecture
BBC Design + Engineering

Paul Clennell
Chief Technology Officer
dock10

Meeting: Are Existing Broadcast Formats Suitable for HDR WCG Content?


Date: Thursday November 30, 2017 – Ample Refreshments from 18:15 GMT for 19:00 start.
Location: Ericsson Television, Strategic Park, Comines Way, Hedge End, Southampton, SO30 4DA. Google Maps

With higher resolution, wider colour gamut and extended dynamic range, the new Ultra High Definition TV (UHD) standards define a container which allows content creators to offer the consumer a much more immersive visual experience. However, there are some artefacts noted within the container, particularly around HDR material. Olie Bauman outlines why YCbCr is used, the human visual system’s response to changes in chroma/luminance, and the correlation between R, G and B.

As HDR and WCG expand the colour volume, he will show why these increased from SD (BT.601) to HD (BT.709) to UHD (BT.2020), and show the difference between PQ (display-referred) and HLG (scene-referred) workflows.

From this background he will show examples of artefacts due to chroma down-sampling and show their different characteristics, depending on the workflow.

He highlights that the problems will become greater as more content exploiting the full UHD container becomes available, requiring additional care and processing in content production and delivery.
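To make the chroma down-sampling idea concrete, here is a toy sketch (not from the talk) of the averaging step behind 4:2:0 subsampling, where each 2×2 block of chroma samples is reduced to one. With HDR/WCG content, the non-linear encoding and larger colour volume make the averaging error across sharp colour edges far more visible:

```python
def downsample_420(chroma):
    """Halve a chroma plane in both dimensions by averaging each 2x2 block.

    chroma: 2D list of floats with even width and height.
    """
    h, w = len(chroma), len(chroma[0])
    return [
        [
            (chroma[y][x] + chroma[y][x + 1] +
             chroma[y + 1][x] + chroma[y + 1][x + 1]) / 4.0
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]
```

Averaging very different chroma values, as happens at a saturated colour edge, produces an in-between value that belongs to neither side, which is one source of the artefacts the talk demonstrates.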

Register Now

On-Demand: DVB UHD HDR Webinar


On-Demand Webinar

DVB recently updated its audio-visual coding specification, adding support for High Dynamic Range (HDR), Higher Frame Rates (HFR) and Next Generation Audio (NGA). You can now learn all about the new features in a webinar by the editor of this impressive specification, Virginie Drugeon (Panasonic), recorded on January 18th, 2017. The webinar, including Q&A, runs to around 1 hour, with questions submitted via the WebEx chat function answered in a few blocks during the session.

The specification update has been published as BlueBook A157 and will be passed to ETSI for formal publication as TS 101 154 v2.3.1.

High Dynamic Range (HDR) significantly increases the contrast ratio and results in pictures with more ‘sparkle’. The DVB HDR solution supports Hybrid Log Gamma (HLG) and Perceptual Quantizer (PQ) transfer functions. Furthermore, the new specification defines Higher Frame Rates (HFR), offering sharper images of moving objects by going beyond the current 50/60 frames per second. When it comes to audio, DVB has added the latest Next Generation Audio (NGA) schemes to provide immersive and personalized audio content using object- or scene-based coding.
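Of the two transfer functions the updated specification supports, PQ is the absolute, display-referred one: a signal value maps to a specific luminance. A minimal sketch of the SMPTE ST 2084 (PQ) EOTF, using the constants from the published standard:

```python
# SMPTE ST 2084 (PQ) EOTF constants, as published in the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Map a non-linear PQ signal in [0, 1] to display luminance in cd/m^2 (nits)."""
    p = signal ** (1.0 / M2)
    y = (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)
    return 10000.0 * y  # PQ is defined up to a 10,000-nit peak
```

A full-scale signal decodes to the 10,000-nit peak and zero decodes to black; unlike HLG’s relative, scene-referred curve, these luminance targets are fixed regardless of the display.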

Watch Now