Video: Building Large SMPTE ST 2110 Systems Using JT-NM TR-1001-1


With the SMPTE ST 2110 suite of standards largely published and the related AMWA IS-04 and IS-05 specifications stable, people’s minds are turning to how to implement all these standards, bringing them together into a complete working system.

JT-NM TR-1001-1 is a technical recommendation which describes how such a system should work in practice – for instance, how do new devices on the network start up? How do they find out which PTP domain is in use on the network?

John Mailhot starts by giving an overview of the standards and documents available, showing which ones are published and which are still in progress. He then looks at each of them in turn to summarise its use on the network and how it fits into the system as a whole.

Once the groundwork is laid, we see how the JT-NM working group has looked at five major behaviours and what it recommends for making them work in a scalable way. These cover things like DNS-based discovery, automated multicast address allocation and other considerations.
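To give a flavour of what DNS-based discovery looks like in practice, here is a minimal sketch of a unicast DNS-SD lookup for an NMOS registry. It assumes the dnspython package and a hypothetical search domain; the exact service types, TXT parameters and fallback behaviour are defined in AMWA IS-04 and TR-1001-1 themselves.

```python
# A minimal sketch of unicast DNS-SD registry discovery, assuming the
# dnspython package. "_nmos-register._tcp" is the IS-04 v1.3 registration
# service type; earlier IS-04 versions used "_nmos-registration._tcp".
import dns.resolver

DOMAIN = "example.com"  # hypothetical search domain, e.g. handed out via DHCP
SERVICE = "_nmos-register._tcp." + DOMAIN

# Step 1: a PTR query lists the advertised registry instances.
for ptr in dns.resolver.resolve(SERVICE, "PTR"):
    instance = str(ptr.target)
    # Step 2: an SRV query gives the host, port and priority of each one.
    for srv in dns.resolver.resolve(instance, "SRV"):
        print(f"registry: {srv.target}:{srv.port} (priority {srv.priority})")
```

A device that boots, gets its domain from DHCP and runs a lookup like this needs no hand-configured registry address – which is precisely the kind of scalable behaviour the recommendation is after.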

Watch now

Speaker

John Mailhot
CTO Networking & Infrastructure
Imagine Communications

Video: Blockchain & the Hollywood Supply Chain

At The Broadcast Knowledge, we’re continuing to cut through the hype and get to the bottom of blockchain. Now part of the NAB drinking game along with words like AI and 5G, it’s similarly not going away. The principle of blockchain is useful – just not useful everywhere.

So what can broadcasters do with blockchain, and – given this is a SMPTE talk – what can film studios do with it? Blockchain undoubtedly makes secure, trusted systems possible, so the mind immediately jumps to using it to ensure all the files needed to create films are distributed securely and with an audit trail.

Here, Steve Wong looks at this but also explores the new possibilities it creates. He starts with the basics of what blockchain is and how it works, but soon moves on to how this could work for Hollywood, explaining what could exist and what already does.
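To make the ‘how it works’ part concrete, here is a toy sketch in Python of the hash-linking that underpins a blockchain-style audit trail: each record embeds the hash of its predecessor, so tampering with any earlier entry breaks everything after it. The event names are invented, and a real system layers signatures, consensus and distribution on top.

```python
# Toy hash chain: each block commits to the previous one via SHA-256.
import hashlib
import json

def make_block(payload: dict, prev_hash: str) -> dict:
    """Create a record cryptographically chained to its predecessor."""
    body = {"payload": payload, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def verify(chain: list) -> bool:
    """Re-derive every hash; any upstream tampering breaks the chain."""
    prev_hash = "0" * 64
    for block in chain:
        body = {"payload": block["payload"], "prev_hash": block["prev_hash"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev_hash or block["hash"] != expected:
            return False
        prev_hash = block["hash"]
    return True

# A tiny (hypothetical) supply-chain audit trail.
chain = [make_block({"event": "master delivered to studio"}, "0" * 64)]
chain.append(make_block({"event": "VFX plates sent to vendor"}, chain[-1]["hash"]))
print(verify(chain))  # True until any earlier block is altered
```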

Speaker

Steve Wong
Cloud & Platform Services General Manager, Telecom, Media & Technology
DXC Technology

Video: Content Production Technology on Hybrid Log-Gamma


‘Better Pixels’ is the continuing refrain from the large number of people who are dissatisfied with simply increasing resolution to 4K or even 8K. Why can’t we have a higher frame-rate instead? Why not give us a wider colour gamut (WCG)? And why not give us a higher dynamic range (HDR)? Often, they would prefer any of these 3 options over higher resolution.

Watch the video to find out more.

Dynamic range is the term used to describe how much of a difference there is between the smallest possible signal and the strongest possible signal. In audio, it’s the quietest thing that can be heard versus the loudest thing that can be heard (without distortion). In video, it’s the difference between black and white – after all, can your TV fully simulate the brightness and power of our sun? No. What about your car’s headlights? Probably not. Can your TV go as bright as your phone’s flashlight? Well, now that’s more realistic.
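To put rough numbers on that: dynamic range is simply the ratio between the strongest and weakest signal, usually quoted in decibels for audio and in stops (doublings of light) for video.

```latex
\[
\mathrm{DR}_{\text{audio}} = 20\log_{10}\frac{A_{\max}}{A_{\min}}\ \text{dB}
\qquad
\mathrm{DR}_{\text{video}} = \log_{2}\frac{L_{\max}}{L_{\min}}\ \text{stops}
\]
```

So, for example, a display spanning 0.01 to 1,000 nits covers log₂(100,000) ≈ 16.6 stops.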

So let’s say your TV can go from a very dark black to being as bright as a medium-power flashlight – what about the video that you send to your TV? When there’s a white frame, do you want your TV blasting as bright as it can? HDR allows producers to control the brightness of your display so that something which is genuinely very bright, like a star, a bright light or an explosion, can be represented very brightly, whereas something which is simply white can have the right colour but a medium brightness. With Standard Dynamic Range (SDR) video, there isn’t this level of control.

For films, HDR is extremely useful, but for sport too – who hasn’t seen a football game where the sun leaves half the pitch in shadow and half in bright sunlight? With SDR, there’s no choice but to have one half either very dark or very bright (mostly white), so you can’t actually see the game there. HDR enables the production crew to let HDR TVs show detail in both areas of the pitch.

HLG, which stands for Hybrid Log-Gamma, is the name of a way of delivering HDR video. It has famously been pioneered by Japan’s NHK with the UK’s BBC and has been standardised as ARIB STD-B67. In this talk, NHK’s Yuji Nagata helps us navigate working with multiple formats: converting HDR HLG to SDR, plus converting from HLG to Dolby’s HDR format, PQ.
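For the curious, the HLG curve itself is compact enough to sketch in a few lines of Python. This is the OETF as standardised in ARIB STD-B67 and ITU-R BT.2100: a square root near black and a logarithm in the highlights, which is exactly what ‘hybrid log-gamma’ means.

```python
# HLG OETF per ARIB STD-B67 / ITU-R BT.2100.
# e is normalised linear scene light in [0, 1].
import math

A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    if e <= 1 / 12:
        return math.sqrt(3 * e)        # square-root segment near black
    return A * math.log(12 * e - B) + C  # log segment in the highlights

print(hlg_oetf(1 / 12))  # 0.5: the crossover between the two segments
print(hlg_oetf(1.0))     # ~1.0: peak scene light maps to peak signal
```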

The reality of broadcasting is that anyone producing a programme in HDR will have to create an SDR version at some point. The question is how to do that, and when. For live production, some broadcasters may need to fully automate this. In this talk, we look at a semi-automated way of doing it.

HDR is usually delivered in a wide colour gamut signal such as the ITU’s BT.2020. Converting between this colour space and the more common BT.709 colour space, which is part of the HD video standards, is also needed on top of the dynamic range conversion. So listen to Yuji Nagata’s talk to find out NHK’s approach to this.
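The gamut half of that conversion boils down to a 3×3 matrix applied to linear-light RGB; here is a sketch using the BT.2020-to-BT.709 matrix from ITU-R BT.2087. It is only the skeleton: a real converter also handles the transfer functions and does something smarter with out-of-gamut colours than the simple clipping shown here.

```python
# Linear-light RGB conversion from BT.2020 primaries to BT.709 primaries,
# coefficients per ITU-R BT.2087.
import numpy as np

M_2020_TO_709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

def bt2020_to_bt709_linear(rgb: np.ndarray) -> np.ndarray:
    out = M_2020_TO_709 @ rgb
    # Colours outside the smaller BT.709 gamut come out negative;
    # clipping is the crudest possible gamut-mapping strategy.
    return np.clip(out, 0.0, 1.0)

print(bt2020_to_bt709_linear(np.array([0.2, 0.5, 0.1])))
```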

NHK has pushed very hard for many years to make 8K broadcasts feasible and has in recent times focussed on tooling up in time for the 2020 Olympics. This talk was given at the SMPTE 2017 technical conference, but is all the more relevant now as NHK ups the number of 8K broadcasts approaching the opening ceremony. This work on HDR and WCG is part of making sure that the 8K format really delivers an impressive and immersive experience for those lucky enough to experience it. The work on video goes hand in hand with NHK’s tireless work on audio, which can deliver 22.2 multichannel surround.

Watch now!

Speaker

Yuji Nagata
Broadcast Engineer,
NHK

Video: Beyond SMPTE Time Code – the TLX Project

SMPTE Time Code started off in the 1970s and has evolved, yet in some ways remained unchanged. It is key to electronic video editing and has found application in many other fields and industries, including live events. This is Armin van Buuren explaining how he uses SMPTE timecode in his live DJ sets.

The more we push technology, the more we demand from timecode, so now there is the TLX project (Time Label, eXtensible), which seeks to define a new labelling system.

The webinar provides an overview of the emerging design; it is intended as a preview for potential users and to encourage participation by those with expertise to offer.

Peter Symes, the host, covers:

  • The history of timing
  • What SMPTE ST 12 is and its evolution
  • The concept of TLX
  • Use of PTP & provision for extensibility
  • What types of data can TLX convey
  • Q&A!

Watch now to hone your knowledge of the SMPTE timecode that already exists and to get ready to understand TLX.
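As a quick refresher on what an ST 12 label actually carries, here is a minimal sketch deriving hours, minutes, seconds and frames from a running frame count. It shows the non-drop-frame case; the 29.97fps drop-frame rules in ST 12 add skip logic on top.

```python
# Derive an HH:MM:SS:FF time code label from a running frame count
# (non-drop-frame; drop-frame at 29.97 fps needs extra skip rules).
def frames_to_timecode(frame_count: int, fps: int = 25) -> str:
    frames = frame_count % fps
    seconds = (frame_count // fps) % 60
    minutes = (frame_count // (fps * 60)) % 60
    hours = (frame_count // (fps * 3600)) % 24
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

print(frames_to_timecode(90125))  # 01:00:05:00 at 25 fps
```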

Speakers

Peter Symes
Consultant,
SMPTE Fellow
Joel E. Welch
Director of Education,
SMPTE