Video: ST 2110 – From Theory to Reality

Delivering an all-IP truck is no mean feat. tpc explains what they learnt, what went well and how they succeeded in delivering a truck which takes no longer to fire up than a traditional SDI truck.

Common questions among people considering a move to IP are ‘do I need to?’ and ‘how can I get ready?’. Here at The Broadcast Knowledge we always say ‘find a small project, get it working, learn what goes wrong and then plan the one you really wanted to do.’ The Swiss broadcasting service provider ‘Technology and Production Centre’, known as ‘tpc’, has done just that.

tpc is currently working on the Metechno project – a large all-IP news, sports and technology centre for Swiss radio and television. In order to acquire the necessary experience with the SMPTE ST 2110 standard, tpc designed the UHD1 OB van ahead of time; it has now been in TV production use for six months. In this video, Andreas Lattmann shares the vision of the Metechno project and, critically, his experiences from the design and use of the truck.

The UHD1 is a 24-camera OB van with an all-IP core based on Arista switches with a non-blocking architecture. It is the equivalent of a 184×184 UHD SDI system; however, it can be expanded by adding additional line cards to the network switches. The truck is format agnostic, supporting both HD and UHD formats in HDR and SDR, and IP gateways are incorporated for SDI equipment.

The SMPTE ST 2110 specification separates video and audio into discrete essence streams, which boosts efficiency and flexibility, but we hear in this talk that more attention to latency (lip sync) is required than in SDI systems. Andreas talks about the flexibility this truck provides, with up-/down-conversion and colour correction for any video, plus how IP has enabled full flexibility in what can be routed to the multiviewer screens.
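Because each essence travels as its own stream, audio and video must be re-aligned at the receiver using their RTP timestamps, which in ST 2110 are all locked to the same PTP timeline. Below is a minimal sketch of that alignment check, assuming the common clock rates of 90 kHz for video and 48 kHz for audio (it ignores 32-bit timestamp wrap-around for brevity):

```python
# Sketch: estimating the audio/video offset from PTP-locked RTP
# timestamps. In ST 2110 every essence stream is timestamped against
# the same PTP epoch; video commonly uses a 90 kHz RTP clock and
# audio 48 kHz. (Simplified: real receivers must handle the 32-bit
# RTP timestamp wrapping.)

VIDEO_CLOCK_HZ = 90_000
AUDIO_CLOCK_HZ = 48_000

def rtp_to_seconds(rtp_timestamp: int, clock_hz: int) -> float:
    """Convert an RTP timestamp to seconds on the common PTP timeline."""
    return rtp_timestamp / clock_hz

def lip_sync_offset_ms(video_ts: int, audio_ts: int) -> float:
    """Positive result: audio is ahead of video and needs delaying."""
    video_s = rtp_to_seconds(video_ts, VIDEO_CLOCK_HZ)
    audio_s = rtp_to_seconds(audio_ts, AUDIO_CLOCK_HZ)
    return (video_s - audio_s) * 1000.0

# A video frame stamped at 90,000 ticks (1 s) against audio stamped at
# 47,040 ticks (0.98 s) shows the audio roughly 20 ms early.
print(lip_sync_offset_ms(90_000, 47_040))
```

In SDI the essences are embedded in one signal so this bookkeeping is implicit; over IP it is the receiver's job, which is why lip sync gets more attention in 2110 systems.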

Andreas spends some time discussing redundancy, how IP enables full redundancy – an improvement over many SDI infrastructures – and how SMPTE’s ST 2022-7 standard makes this possible.

The main GUI is based on a Lawo VSM control system which aims to deliver a familiar experience for operators used to working in the SDI domain. Network training has been provided for all operators because troubleshooting has changed significantly with the introduction of essences over IP. This is not least because the NMOS IS-04 and IS-05 standards were not mature enough during the design of the truck, so all IP connections had to be managed manually. With more than 50,000 IP addresses in this system, AMWA’s NMOS IS-04, which manages discovery and registration, and IS-05, which manages the setup and take-down of connections, would have helped significantly in the lean management of the truck.
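To give a flavour of what IS-05 automates, here is a sketch of the kind of JSON body a controller PATCHes to a receiver’s `staged` endpoint to connect it to a sender, replacing manual per-flow IP patching. The device IDs and multicast address below are hypothetical, and the transport parameters shown are a subset of what the IS-05 schema allows:

```python
# Sketch of an AMWA IS-05 connection request body. A controller would
# PATCH this JSON to something like:
#   http://<device>/x-nmos/connection/v1.0/single/receivers/<id>/staged
# The sender ID and multicast group below are made up for illustration.
import json

def build_connect_patch(sender_id: str, multicast_ip: str, port: int) -> str:
    """Build the JSON body that stages and immediately activates a
    connection from the given sender to this receiver."""
    body = {
        "sender_id": sender_id,
        "master_enable": True,
        "activation": {"mode": "activate_immediate"},
        "transport_params": [
            {
                "multicast_ip": multicast_ip,
                "destination_port": port,
                "rtp_enabled": True,
            }
        ],
    }
    return json.dumps(body, indent=2)

# Hypothetical sender UUID and multicast group:
print(build_connect_patch(
    "5ed0a5a2-0000-4000-8000-000000000001", "239.10.20.30", 5004))
```

Multiply a request like this by tens of thousands of flows and the appeal of automated connection management over manual spreadsheets becomes obvious.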

Lattmann emphasises the importance of using open standards like SMPTE ST 2110 instead of proprietary solutions, allowing you to choose the best components rather than relying on a single manufacturer.

The learnings Andreas presents involve difficulties with PTP, IP training and the benefits of flexibility. From a video point of view, Andreas presents his experiences with HDR-to-SDR workflows, focussing on HDR and UHD.

Watch now!

Speaker

Andreas Lattmann
CTO, Head of Planning & Projects
tpc Switzerland AG

Webinar: Crafting quality: Skills for successful UHD and HDR production

Webinar date: Thursday May 30th 2019
Time: 16:00 BST / 11am EDT / 8am PDT

Experienced advice is on hand in this webinar for those producing in HDR and UHD. Productions are always trying to raise the quality of acquisition in order to deliver better quality to the viewers, to enhance creative possibilities and to maximise financial gain by future proofing their archives. But this push always brings challenges in production and the move to UHD and HDR is no different.

HDR and UHD are not synonymous, but often do go hand-in-hand. This is partly because the move to UHD is a move to improve quality, but time and again we hear the reasons that increasing resolution in and of itself is not always an improvement. Rather the ‘better pixels’ mantra seeks to improve quality through improving the video using a combination of resolution, frame-rate, HDR and Wide Colour Gamut (WCG). So when it’s possible, HDR and WCG are often combined with UHD.

In this webinar, we hear about the challenges met on the way to success by director and producer Pamela Ann Berry and The Farm Group. Register to hear them share their tips and tricks for better UHD and HDR production.

Register now!

Speakers

Pamela Ann Berry
Director/producer
Aidan Farrell
Senior Colourist,
The Farm Group, UK
Pete Collins
Head of Scripted Pipeline,
The Farm Group, UK

Video: The ST 2094 Standards Suite For Dynamic Metadata

Lars Borg explains to us what problems the SMPTE ST 2094 standard sets out to solve. Looking at the different types of HDR and Wide Colour Gamut (WCG) we quickly see how many permutations there are and how many ways there are to get it wrong.

ST 2094 carries the metadata needed to manage the colour, dynamic range and related data. In order to understand what’s needed, Lars takes us through the details of the HDR implementations, touching on workflows and explaining how the ability of your display affects the video.

We then look at midtones and dynamic metadata before a Q&A.

This talk is as valuable for understanding the whole HDR and WCG ecosystem as it is for ST 2094 itself.

Watch now!

Speaker

Lars Borg
Principal Scientist,
Adobe

Video: Colour

With the advent of digital video, the people in the middle of the broadcast chain have had little to do with colour for the most part. Yet those in post production, acquisition and decoding/display are finding life more and more difficult as we continue to expand colour gamuts and deliver to new displays.

Google’s Steven Robertson takes us comprehensively through the challenges of colour, from the fundamentals of sight to the intricacies of dealing with Rec. 601, Rec. 709, BT.2020, HDR and YUV transforms, and all the mistakes people make in between.

An approachable talk which gives a great overview, raises good points and goes into detail where necessary.

An interesting point of view is that colour subsampling should die. After all, we’re now at a point where we could feed an encoder with 4:4:4 video and get it to compress the colour channels more than the luminance channel. Steven says that this would give more accurate colour than stripping out a fixed amount of data as 4:2:2 subsampling does.
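To make the “fixed amount of data” point concrete, here is a minimal sketch of 4:2:2 subsampling: every second chroma sample in each row is simply thrown away, regardless of what the image contains, which is the content-blind behaviour being contrasted with an encoder’s adaptive bit allocation:

```python
# Minimal sketch of 4:2:2 chroma subsampling on one row of Y'CbCr
# samples: all luma (Y) samples are kept, but only every second
# chroma (Cb, Cr) sample survives. This discards a fixed third of the
# data whatever the picture content, unlike an encoder deciding per
# scene how many bits the colour channels deserve.

def subsample_422(y_row, cb_row, cr_row):
    """Return a 4:2:2 version of one row of 4:4:4 samples."""
    return y_row, cb_row[::2], cr_row[::2]

y  = [16, 32, 48, 64, 80, 96, 112, 128]
cb = [100, 101, 102, 103, 104, 105, 106, 107]
cr = [200, 201, 202, 203, 204, 205, 206, 207]

y2, cb2, cr2 = subsample_422(y, cb, cr)
print(len(y2), len(cb2), len(cr2))  # 8 4 4
```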

Given at Brightcove HQ as part of the San Francisco Video Tech meet-ups.

Watch now!

Speaker

Steven Robertson
Software Engineer,
Google

Video: Content Production Technology on Hybrid Log-Gamma


‘Better Pixels’ is the continuing refrain from the large number of people who are dissatisfied with simply increasing resolution to 4K or even 8K. Why can’t we have a higher frame-rate instead? Why not give us a wider colour gamut (WCG)? And why not give us a higher dynamic range (HDR)? Often, they would prefer any of these 3 options over higher resolution.

Watch this video to find out more.

Dynamic range is the term used to describe how much of a difference there is between the smallest possible signal and the strongest possible signal. In audio, it’s the quietest thing that can be heard versus the loudest thing that can be heard (without distortion). In video, it’s the difference between black and white – after all, can your TV fully simulate the brightness and power of our sun? No. What about your car’s headlights? Probably not. Can your TV go as bright as your phone’s flashlight? Well, now that’s realistic.
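Dynamic range is often quantified as a contrast ratio or in photographic ‘stops’, where each stop is a doubling of light. As a rough worked example (the peak and black-level figures below are illustrative, not measurements of any particular display):

```python
# Dynamic range expressed in "stops": each stop doubles the light, so
# the stop count is log2(peak / black level). Figures are illustrative.
import math

def dynamic_range_stops(peak_nits: float, black_nits: float) -> float:
    return math.log2(peak_nits / black_nits)

# A hypothetical SDR display: 100 nits peak, 0.1 nit black level.
sdr = dynamic_range_stops(100, 0.1)      # ~10 stops (1,000:1)
# A hypothetical HDR display: 1,000 nits peak, 0.005 nit black level.
hdr = dynamic_range_stops(1000, 0.005)   # ~17.6 stops (200,000:1)
print(round(sdr, 1), round(hdr, 1))
```

The extra stops are exactly the headroom HDR grading uses to keep a bright sky and a shadowed stand both visible at once.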

So let’s say your TV can go from a very dark black to being as bright as a medium-power flashlight; what about the video that you send your TV? When there’s a white frame, do you want your TV blasting as bright as it can? HDR allows producers to control the brightness of your display device so that something that is genuinely very bright, like a star, a bright light or an explosion, can be represented very brightly, whereas something which is simply white can have the right colour but a medium brightness. With video which is Standard Dynamic Range (SDR), there isn’t this level of control.

For films, HDR is extremely useful, but for sports too – who hasn’t seen a football game where the sun leaves half the pitch in shadow and half in bright sunlight? With SDR, there’s no choice but to have one half either very dark or very bright (mostly white), so you can’t actually see the game there. HDR enables the production crew to let HDR TVs show detail in both areas of the pitch.

HLG, which stands for Hybrid Log-Gamma, is the name of a way of delivering HDR video. It’s been pioneered, famously, by Japan’s NHK with the UK’s BBC and has been standardised as ARIB STD-B67. In this talk, NHK’s Yuji Nagata helps us navigate working with multiple formats: HDR HLG to SDR, plus converting from HLG to Dolby’s HDR format called PQ.
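The ‘hybrid’ in the name refers to HLG’s transfer curve: it follows a square root (gamma-like, keeping backwards compatibility with SDR displays) for the lower part of the signal and switches to a logarithmic curve for the bright, HDR part. A sketch of that OETF, using the constants from ARIB STD-B67 / ITU-R BT.2100:

```python
# The HLG opto-electrical transfer function (OETF) per ARIB STD-B67 /
# ITU-R BT.2100: square-root below 1/12 of peak scene light (the
# SDR-compatible part), logarithmic above it (the HDR part).
import math

A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalised scene light e in [0, 1] to an HLG signal value."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

print(hlg_oetf(0.0))            # 0.0
print(hlg_oetf(1 / 12))         # 0.5 -- the two segments meet here
print(round(hlg_oetf(1.0), 3))  # ~1.0
```

Because the lower segment behaves like a conventional gamma curve, an HLG signal shown on an SDR display still looks reasonable, which is a large part of its appeal to broadcasters.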

The reality of broadcasting is that anyone who is producing a programme in HDR will have to create an SDR version at some point. The question is how to do that and when. For live, some broadcasters may need to fully automate this. In this talk, we look at a semi-automated way of doing this.

HDR is usually delivered in a Wide Colour Gamut signal such as the ITU’s BT.2020. Converting between this colour space and the more common BT.709 colour space, which is part of the HD video standards, is also needed on top of the dynamic range conversion. So listen to Yuji Nagata’s talk to find out NHK’s approach to this.

NHK has pushed very hard for many years to make 8K broadcasts feasible and has in recent times focussed on tooling up in time for the 2020 Olympics. This talk was given at the SMPTE 2017 technical conference, but is all the more relevant now as NHK ups the number of 8K broadcasts approaching the opening ceremony. This work on HDR and WCG is part of making sure that the 8K format really delivers an impressive and immersive experience for those lucky enough to experience it. The work on the video goes hand in hand with their tireless work on audio, which can deliver 22.2 multichannel surround.

Watch now!

Speaker

Yuji Nagata
Broadcast Engineer,
NHK

Webinar: Video Delivery Trends


Date: Thursday February 28th 2019, 10am PT / 1pm ET / 18:00 GMT

Streaming continues to grow, in the amount streamed, in the people consuming it and in importance within this and other industries. One thing which has always been an enabler, yet has made streaming harder to deploy, is its rapid evolution. Whilst this has been a boon for smaller, nimbler companies – both content producers and service providers – streaming has now arrived at most companies in one way or another, and this breadth of use-cases has kept streaming tech moving forward with no signs of abatement.

Some aspects are changing. For instance we are seeing the first patent-free MPEG standard proposals (EVC, which has basic patent-free functionality and a better performing patent-controlled profile) on the heels of AV1. We’re seeing low-latency efforts such as CMAF taking hold as an alternative to WebRTC. With CMAF being much closer to the ever popular HLS, this may well beat out WebRTC in deployments at the cost of a slightly higher, but much improved latency.

To bring all of this into focus for 2019, Jason Thibeault from the Streaming Video Alliance is bringing together a panel of experts to look at the coming trends, to give us an idea of what to look out for and to help make sense of 2019’s year of video delivery.

Register now!

Speakers

Guillaume Bichot
Head of Exploration,
Broadpeak
Joshua Pressnell
Chief Technology Officer,
Penthera
Pierre-Louis Theron
CEO & Co-founder
Streamroot
Johan Bolin
Chief Product & Technology Officer,
Edgeware AB
Steve Miller-Jones
Vice President of Product Strategy
Limelight Networks
Moderator: Jason Thibeault

Executive Director
Streaming Video Alliance

On-Demand Webinar: Human Perception Fundamentals – Colour, Contrast & Motion


Thursday February 7th, 10am PST / 1pm EST / 18:00 GMT
Now available on-demand!

There is so much talk about HDR, wide colour gamut (WCG) and ‘Better Pixels’, and so many TVs interpolating motion up to 100Hz or above, that it’s good to stop and check we know why all of this matters – and crucially when it doesn’t.

SMPTE’s new ‘Essential Technology Concepts Webcasts’ are here to help and for the first webcast, David Long will look at the fundamentals of colour, contrast and motion in terms of what we actually see.

This promises to be a great talk and, the chances are, even people who ‘know it already’ will be reminded of a thing or two!

Watch now.

Speakers

David Long
Director
RIT Center for Media, Arts, Games, Interaction & Creativity
& MAGIC Spell Studios

Video: IP For Media Webcast Part II

Following on from last week’s post, part II is here. Wes Simpson looks at the use of IP in Remote Production/Remote Integration (REMI) and finishes with a panel discussion including NewTek and Grass Valley, a Belden brand.

This video talks about:

  • Why broadcasters need networking
  • Typical Live remote sports broadcast roles
  • Overview of video & audio Signal types
  • HDR & Wide Colour Gamut (WCG)
  • Data (metadata, scripts etc)
  • REMI – Remote Integration, AKA ‘Remote Production’ in Europe.
  • Overview of what tasks can be done at base, what still needs to be done ‘on-site’
  • Uncompressed formats summary (SDI, 2022-6, 2110)
  • Slice-based compression
  • Mezzanine compression
  • TR-01 for carrying JPEG 2000 & audio
  • Bonded Cellular
  • Packetloss & FEC (Forward Error Correction)
  • 2022-7 – route diversity
  • Typical delays
  • Plus a panel discussion
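Of the topics above, the 2022-7 route diversity idea is easy to demonstrate: the same RTP packets are sent over two diverse paths, and the receiver keeps the first copy of each sequence number it sees, so a loss on one path is invisible as long as the other path delivers. A minimal sketch (a real receiver merges as packets arrive and handles sequence-number wrap, omitted here):

```python
# Sketch of ST 2022-7 style seamless protection. Identical RTP streams
# travel over two diverse paths; per sequence number, the receiver
# keeps whichever copy arrives first, reconstructing a loss-free
# stream if no packet is lost on both paths at once.

def merge_2022_7(path_a, path_b):
    """Merge two lists of (seq, payload) packets into one stream."""
    seen = {}
    # Simplified: treat the two capture lists as one arrival order.
    for seq, payload in path_a + path_b:
        if seq not in seen:
            seen[seq] = payload
    return [seen[seq] for seq in sorted(seen)]

# Path A drops packet 2 and path B drops packet 4, yet the merged
# output is complete.
path_a = [(1, "p1"), (3, "p3"), (4, "p4")]
path_b = [(1, "p1"), (2, "p2"), (3, "p3")]
print(merge_2022_7(path_a, path_b))  # ['p1', 'p2', 'p3', 'p4']
```

This is also why 2022-7 pairs naturally with FEC: FEC repairs random single-path losses, while route diversity survives a whole path failing.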

 
Watch now!

Speakers

Wes Simpson
President,
Telecom Product Consulting
Tom Butts
Content Director,
TV Technology

Video: Visual Excellence in Production

In this Tech Talk we hear from researchers and vision scientists how they are ensuring the precision of HDR and colour in image capture.

Today’s imaging technology strives to produce a viewing experience which is, as far as possible, identical to that perceived by the human visual system. Strangely, one limiting factor in high dynamic range (HDR) design has been that existing measurements of human vision have not been sufficiently accurate. Another of these issues is skin tone: humans are particularly sensitive to skin colour, regarding it as an indicator of well-being, so the accurate portrayal of this subtle parameter is particularly important. A further interesting image quality issue is slow motion – here we explore the development of an 8K UHD 240fps camera and a slow motion capture and replay server.

Watch now!

Speakers

Lucien Lenzen
Research Assistant
Hochschule RheinMain
Simon Thompson
Project R&D Engineer
BBC
Patrick Morvan
Senior R&D Engineer
Technicolor
Simon Gauntlett
Director of Imaging Standards and Technology
Dolby Laboratories

Video: x265 – An Update

From VideoLAN’s Video Dev Days event 2018, this talk discusses the latest updates to x265, a free software library and application for encoding video streams into the H.265/MPEG-H HEVC compression format, released under GNU GPL.

Pradeep Ramachandran, Principal Engineer at MulticoreWare, takes us through:

  • The highlights of the last year
  • HDR Encoding
  • AVX-512 optimisation
  • ABR Streaming Improvements
  • Chunked Encoding Support
  • Improving the footprint of x265

Watch Now!