Video: What is NMOS? with a Secure Control Case Study

Once you’ve implemented SMPTE ST 2110‘s suite of standards on your network, you’ve still got plenty of work ahead of you to build large-scale workflows. How are you going to discover new devices? How will you make or change connections between devices? How will you associate audios with the video? Creating a functioning system requires a whole ecosystem of control protocols and information exchange, which is exactly what AMWA, the Advanced Media Workflow Association, has been working on for many years now.

Jed Deame from Nextera introduces the main specifications that have been developed to work hand-in-hand with uncompressed workflows. All prefixed with IS-, which stands for ‘Interface Specification’, they are IS-04, IS-05, IS-08, IS-09 and IS-10. Between them they allow you to discover new devices, create connections between them, manage the association of audio with video as well as manage system-wide information. Jed goes through each of these in turn. The only relevant ones which are skipped are IS-06, which allows devices to communicate northbound to an SDN controller, and IS-07, which manages GPI and tally information.

Jed sets the scene by describing an example ST 2110 setup with devices able to join a network, register their presence and be quickly involved in routing events. He then looks at the first specification in today’s talk, NMOS IS-04. IS-04’s job is to provide an API which nodes (cameras, monitors etc.) use when they start up to talk to a central registry and lodge their details for further communication. The registry contains a GUID for every resource, which covers nodes, devices, sources, flows, senders and receivers. IS-04 also provides a query API for controllers (for instance a control panel).
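To make this concrete, here’s a minimal sketch of how a node might register itself using the IS-04 Registration API. It assumes a registry at a hypothetical address, uses the /x-nmos/registration/v1.3/ base path, and trims the node resource down to a handful of fields; a real node also advertises its API endpoints, clocks and interfaces.

```python
import time
import uuid
import requests

REGISTRY = "http://registry.example.com:8080"  # hypothetical; normally discovered via DNS-SD

node_id = str(uuid.uuid4())  # every IS-04 resource carries a GUID
now = time.time()
node = {
    "id": node_id,
    "version": f"{int(now)}:{int((now % 1) * 1e9)}",  # <seconds>:<nanoseconds> timestamp
    "label": "Camera 1",
    "description": "Example camera node (illustrative only)",
    "tags": {},
    "href": "http://camera1.example.com:8080/",
    "caps": {},
    "api": {"versions": ["v1.3"], "endpoints": []},
    "services": [],
    "clocks": [],
    "interfaces": [],
}

# Resources are POSTed one at a time, wrapped with their resource type
requests.post(
    f"{REGISTRY}/x-nmos/registration/v1.3/resource",
    json={"type": "node", "data": node},
).raise_for_status()

# The node then sends a regular heartbeat so the registry knows it is still alive
requests.post(f"{REGISTRY}/x-nmos/registration/v1.3/health/nodes/{node_id}")
```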

While IS-04 started off very basic, as it’s moved to version 1.4, it’s added HTTPS transport, paged queries and support for connection management with IS-05 and IS-06. IS-04 is a foundational part of the system, allowing each element to have an identity, tracking when entities change and updating clients accordingly.
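On the controller side, the Query API is just as approachable. The sketch below, assuming the same hypothetical registry, fetches a page of senders and then creates a subscription so the controller is notified over WebSocket whenever something changes; the paths follow IS-04 but the address is made up.

```python
import requests

REGISTRY = "https://registry.example.com"  # HTTPS transport, as supported in later IS-04 versions

# A paged query: the registry returns paging cursors in its Link headers
senders = requests.get(
    f"{REGISTRY}/x-nmos/query/v1.3/senders",
    params={"paging.limit": 10},
).json()
for sender in senders:
    print(sender["id"], sender["label"])

# Subscribe to changes on receivers; the registry hands back a WebSocket URL
sub = requests.post(
    f"{REGISTRY}/x-nmos/query/v1.3/subscriptions",
    json={
        "max_update_rate_ms": 100,
        "resource_path": "/receivers",
        "params": {},
        "persist": False,
        "secure": True,
    },
).json()
print("connect a WebSocket to", sub["ws_href"], "to receive change events")
```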

IS-05 manages connections between senders and receivers, allowing changes to be immediate or scheduled for the future. It allows, for example, a sender to be queried for its multicast settings, which can then be sent to a receiver. Naturally, when a change has been made, it will update the IS-04 registry.
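As a rough illustration of that flow, the sketch below fetches the SDP transport file from a sender and PATCHes it to a receiver’s staged endpoint with an immediate activation; the host names and IDs are hypothetical, and a future-dated switch would simply use a scheduled activation mode instead.

```python
import requests

SENDER_HOST = "http://sender.example.com:8080"      # hypothetical
RECEIVER_HOST = "http://receiver.example.com:8080"  # hypothetical
sender_id = "11111111-1111-1111-1111-111111111111"
receiver_id = "22222222-2222-2222-2222-222222222222"

# 1. Query the sender for its transport file (an SDP carrying the multicast settings)
sdp = requests.get(
    f"{SENDER_HOST}/x-nmos/connection/v1.1/single/senders/{sender_id}/transportfile"
).text

# 2. Stage the connection on the receiver and activate it immediately
patch = {
    "sender_id": sender_id,
    "master_enable": True,
    "activation": {"mode": "activate_immediate"},
    "transport_file": {"type": "application/sdp", "data": sdp},
}
requests.patch(
    f"{RECEIVER_HOST}/x-nmos/connection/v1.1/single/receivers/{receiver_id}/staged",
    json=patch,
).raise_for_status()
```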

IS-08 helps manage the complexity which comes from allowing all audios to flow separately from the video. Whilst this is a boon for flexibility and removes much unnecessary processing (extracting and recombining audio), it also adds the burden of tracking which audios should be used where. IS-08 is AMWA’s answer to managing this complexity. It can be used in association with BCP-002 (Best Current Practice), which allows essences in the IS-04 registry to be tagged to show how they were grouped when they were created.
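A hedged illustration of how that grouping might look: under BCP-002-01, senders carry a group-hint tag in their IS-04 tags field, so a controller can match audio essences to the video they were created alongside. The labels and group names below are hypothetical.

```python
from collections import defaultdict

GROUP_HINT = "urn:x-nmos:tag:grouphint/v1.0"  # BCP-002-01 natural grouping tag

senders = [
    {"label": "cam1-video", "tags": {GROUP_HINT: ["Camera 1:Video"]}},
    {"label": "cam1-audio-1", "tags": {GROUP_HINT: ["Camera 1:Audio 1"]}},
    {"label": "cam1-audio-2", "tags": {GROUP_HINT: ["Camera 1:Audio 2"]}},
]

# Group essences by the shared group name ("Camera 1") rather than guessing from labels
groups = defaultdict(list)
for s in senders:
    group_name, role = s["tags"][GROUP_HINT][0].split(":", 1)
    groups[group_name].append((role, s["label"]))

print(dict(groups))  # {'Camera 1': [('Video', 'cam1-video'), ('Audio 1', ...), ...]}
```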

Jed looks next at IS-09 which he explains provides a way for global facts of the system to be distributed to all devices. Examples of this would be whether HTTPS is in use in the facility, syslog servers, the registration server address and NMOS versions supported.
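In practice this is a single read-only resource. The sketch below shows a device fetching the global settings from a System API instance at a hypothetical address, using the /x-nmos/system/v1.0/global path defined by IS-09; the fields printed are examples of what the resource typically carries.

```python
import requests

SYSTEM_API = "http://system.example.com:8080"  # hypothetical; normally found via DNS-SD

config = requests.get(f"{SYSTEM_API}/x-nmos/system/v1.0/global").json()

# Facility-wide settings a device can apply on boot, e.g. syslog servers,
# registration heartbeat interval and PTP domain
print(config.get("syslog"))
print(config.get("is04"))
print(config.get("ptp"))
```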

Security is the topic of the last part of the talk. As we’ve seen, IS-04 already allows for encrypted API traffic, and this is mandated in the EBU’s TR-1001. However, BCP-003 and IS-10 have also been created to improve this further. IS-10 deals with authorisation to make sure that only intended controllers, senders and receivers are allowed access to the system. And it’s the difference between encryption (confidentiality) and authorisation which Jed looks at next.
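The shape of that authorisation step looks much like any other OAuth 2.0 deployment: the controller obtains a signed access token from the authorisation server and presents it as a Bearer token on every NMOS API call. The sketch below uses the client credentials grant for brevity; the addresses, credentials and token endpoint path are all hypothetical.

```python
import requests

AUTH_SERVER = "https://auth.example.com"  # hypothetical authorisation server

token = requests.post(
    f"{AUTH_SERVER}/token",  # assumed token endpoint; real servers advertise theirs via metadata
    data={"grant_type": "client_credentials", "scope": "query connection"},
    auth=("controller-client-id", "controller-client-secret"),
).json()["access_token"]

# TLS keeps the request confidential; the signed token proves the caller is authorised
requests.get(
    "https://registry.example.com/x-nmos/query/v1.3/receivers",
    headers={"Authorization": f"Bearer {token}"},
)
```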

It’s no accident that the security implementations in the AMWA specifications share a lot in common with widely deployed security practices already in use elsewhere. In fact, in security, if you can avoid developing your own system, you should. In use here are the PKI system and TLS encryption we use on every secure website. Jed steps through how this works and the importance of the cipher suite which sits under TLS.
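As a small illustration of those building blocks, the snippet below configures a TLS client context that trusts a facility CA and insists on TLS 1.2 or later with forward-secret AEAD cipher suites, in the spirit of BCP-003-01’s recommendations; the CA file name and cipher string are assumptions, not quotes from the BCP.

```python
import ssl

# Trust the facility's own PKI rather than the public web CAs (file name is hypothetical)
ctx = ssl.create_default_context(cafile="facility-ca.pem")
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse older, weaker protocol versions
ctx.set_ciphers("ECDHE+AESGCM")               # forward-secret key exchange, AEAD ciphers
```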

The final part of this talk is a case study where a customer required encrypted control, an authorisation server, 4K video over 1GbE, essence encryption, a unified routing interface and KVM capabilities. Jed explains how this can all be achieved with the existing specifications or an extension on top of them. Extending the encryption methods from the APIs to the essences allowed them to meet the encryption requirements, and adding some other calls on top of the existing NMOS APIs provided a unified routing interface which allowed setting modes on equipment.

Watch now!
For more information, download these slides from a SMPTE UK Section meeting on NMOS
Speakers

Jed Deame
CEO,
Nextera Video

Video: Using AMWA IS-06 for Flow Control on Professional Media Networks

In IP networks multicast flow subscription is usually based on a combination of IGMP (Internet Group Management Protocol) and PIM (Protocol Independent Multicast) protocols. While PIM allows for very efficient delivery of IP multicast data, it doesn’t provide bandwidth control or device authorisation.

To solve these issues on SMPTE ST 2110 professional media networks, the NMOS IS-06 specification has been developed. It relies on Software-Defined Networking, where the traffic management application embedded in each individual switch or router is replaced by a centralised Network Controller. This controller manages and monitors the whole network environment, making it bandwidth aware.

The NMOS IS-06 specification provides a vendor-agnostic northbound interface from the Network Controller to the Broadcast Controller. IS-06, in conjunction with IS-04 (Discovery and Registration) and IS-05 (NMOS Device Connection Management), allows the Broadcast Controller to automatically set up media flows between endpoints on the network, reserve bandwidth for flows and enforce network security. The Broadcast Controller is also able to request network topology information from the Network Controller, which can be used to create a user-friendly graphical representation of the flows in the network.
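Conceptually, the northbound exchange looks something like the sketch below: the Broadcast Controller reads the topology from the Network Controller and then asks for a flow with a bandwidth reservation. The host name, endpoint paths and field names are illustrative assumptions rather than quotes from the IS-06 specification.

```python
import requests

NETWORK_CONTROLLER = "https://network-controller.example.com"  # hypothetical
BASE = f"{NETWORK_CONTROLLER}/x-nmos/netctrl/v1.0"             # assumed base path

# 1. Request topology so the Broadcast Controller can draw the network graph
devices = requests.get(f"{BASE}/network-devices").json()
links = requests.get(f"{BASE}/network-links").json()

# 2. Ask for a flow between two endpoints with a bandwidth reservation;
#    the Network Controller programs the switches and polices the rate
flow = {
    "sender_endpoint_id": "11111111-1111-1111-1111-111111111111",
    "receiver_endpoint_ids": ["22222222-2222-2222-2222-222222222222"],
    "bandwidth_bps": 1_500_000_000,  # roughly an HD ST 2110-20 flow (assumption)
}
requests.post(f"{BASE}/flows", json=flow).raise_for_status()
```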

In this presentation, Rob Porter from Sony Europe explains the basics of NMOS IS-06, showing in detail how setting up media flows with this specification fits into the IS-04 / IS-05 workflow. Rob emphasises that all AMWA NMOS specifications are completely open and available to anyone, allowing for interoperability between broadcast and network devices from different manufacturers.

The next speaker, Sachin Vishwarupe from Cisco Systems, focuses on future work on IS-06, including provisioning feedback (such as insufficient bandwidth, no route available from sender to receiver or no management connectivity), flow statistics, security and grouping (similar to a ‘salvo’ in the SDI world).

There is also a discussion on extending the IS-06 specification for Network Address Translation (NAT), which would help resolve problems caused by address conflicts, e.g. when sharing resources between facilities.

You can find the slides here.

Watch now!

Speakers

Rob Porter
Project Manager – Advanced Technology Team
Sony Europe
Sachin Vishwarupe
Principal Engineer
Cisco Systems

Video: Wide Area Facilities Interconnect with SMPTE ST 2110

Adoption of SMPTE’s ST 2110 suite of standards for transport of professional media is increasing, with more broadcasters choosing it for use within their broadcast facilities. Andy Rayner takes the stage at SMPTE 2019 to discuss the work being undertaken to enable use of ST 2110 between facilities. In order to do this, he looks at how to manage the data out of the facility, the potential use of JPEG XS, timing and control.

Long-established practices of path protection and FEC are already catered for, with ST 2022-7 for seamless path protection and ST 2022-5 for FEC. New to 2110 is the ability to send the separate essences bundled together in a virtual trunk. This has the benefit of avoiding streams being split up during transport and hence potentially suffering different delays. It also helps with FEC efficiency and allows transport of other types of traffic.

Timing is key for ST 2110, which is why it natively uses the Precision Time Protocol (PTP), formalised for use in broadcast under ST 2059. Andy highlights the problem of reconciling timing at the far end, but also the ‘missed opportunity’ that the timing will usually get regenerated, meaning the time of media ingest is lost. This may change over the next year.

The creation of ST 2110-22 brings compressed media into ST 2110 for the first time. Andy mentions that JPEG XS can be used – and is already being deployed. Control is the next topic, with Andy focussing on the secure sharing of NMOS IS-04 & IS-05 between facilities, covering registration, control and the security needed.

The talk ends with questions on FEC Latency, RIST and potential downsides of GRE trunking.

Watch now!
Speaker

Andy Rayner
Chief Technologist,
Nevion

Video: SMPTE Technical Primers

The Broadcast Knowledge exists to help individuals up-skill whatever their starting point. Videos like this, which give an introduction to a large number of topics, are far too rare. For those starting out or who need to revise a topic, this really hits the mark, particularly as there are many new topics.

John Mailhot takes the lead on SMPTE 2110 explaining that it’s built on separate media (essence) flows. He covers how synchronisation is maintained and also gives an overview of the many parts of the SMPTE ST 2110 suite. He talks in more detail about the audio and metadata parts of the standard suite.

Eric Gsell discusses digital archiving and the considerations which come with deciding what formats to use. He explains colour space, the CIE model and the colour spaces we use such as 709, 2100 and P3 before turning to file formats. With the advent of HDR video and displays which can show bright video, Eric takes some time to explain why this could represent a problem for visual health as we don’t fully understand how the displays and the eye interact with this type of material. He finishes off by explaining the different ways of measuring the light output of displays and their standardisation.

Yvonne Thomas talks about the cloud, starting by explaining the difference between platform as a service (PaaS), infrastructure as a service (IaaS) and similar cloud terms. As cloud migrations are forecast to grow significantly, Yvonne looks at the drivers behind this and the benefits it can bring when used in the right way. Using the cloud, Yvonne shows, can be an opportunity for improving workflows and adding more feedback and iterative refinement into your products and infrastructure.

Looking at video deployments in the cloud, Yvonne introduces the video codecs AV1 and VVC, both, in their own way, successors to HEVC/H.265, as well as the two transport protocols SRT and RIST which exist to reliably send video with low latency over lossy networks such as the internet. To learn more about these protocols, check out this popular talk on RIST by Merrick Ackermans and this SRT Overview.

Rounding off the primer is Linda Gedemer from Source Sound VR, who introduces immersive audio, measuring sound output (SPL) from speakers and looking at the interesting problem of forward speakers in cinemas. They have long been behind the screen, which has meant the screens have to be perforated to let the sound through, which interferes with the sound itself. Now that cinema screens are changing to be solid, not completely dissimilar to large outdoor video displays, the speakers are having to move, but with them now out of the line of sight, how can we keep the sound in the right place for the audience?

This video is a great summary of many of the key challenges in the industry and works well for beginners and those who just need to keep up.

Watch now!
Speakers

John Mailhot
Systems Architect for IP Convergence,
Imagine Communications
Eric Gsell
Staff Engineer,
Dolby Laboratories
Linda Gedemer, PhD
Technical Director, VR Audio Evangelist
Source Sound VR
Yvonne Thomas
Strategic Technologist
Digital TV Group