GPI was not without its complexities, but the simplicity of its function, putting a short or a voltage on a wire, is unmatched by any other system we use in broadcasting. So the question here is: how do we do ‘GPI’ with IP, given all the complexity, and perceived delay, of networked communication? Miroslav Jeras, CTO of Pebble Beach, is here to explain.
The key to understanding the power of IS-07, the new NMOS specification for GPI, is to realise that it’s not trying to emulate DC electronics. Rather, by adding the timing information available from the PTP clock, a GPI trigger can now become extremely accurate, down to the audio sample, meaning you can use GPI to signal much more detailed situations. On top of that, the GPI messages can carry a number of different data types, which expands what these messages can express and also helps interoperability between systems.
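As a flavour of what such a message can look like, the sketch below builds an IS-07 ‘boolean’ state event in the style of the specification’s JSON grammar. The source and flow UUIDs are placeholders, and the system clock stands in for a PTP-disciplined clock, so treat this as indicative rather than definitive.

```python
import json
import time

def make_gpi_event(source_id: str, flow_id: str, state: bool) -> str:
    """Build an IS-07-style 'boolean' state message (a GPI-like trigger).

    The creation timestamp is expressed as seconds:nanoseconds; a real
    implementation would take this from a PTP-disciplined clock rather
    than the system clock used here.
    """
    now = time.time()
    secs, nanos = int(now), int((now - int(now)) * 1e9)
    message = {
        "message_type": "state",
        "identity": {"source_id": source_id, "flow_id": flow_id},
        "event_type": "boolean",
        "timing": {"creation_timestamp": f"{secs}:{nanos}"},
        "payload": {"value": state},
    }
    return json.dumps(message)

# Placeholder UUIDs for illustration only
msg = make_gpi_event("9f463872-9621-4939-aa3a-dc3c82d8578b",
                     "68eab84f-4a63-4429-a8c9-7932be63ef12", True)
```

Because the payload carries a typed value rather than just a contact closure, the same mechanism extends naturally to strings and numbers, which is where the interoperability benefit comes from.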
Miroslav explains how these messages are passed over the network and how IS-07 interacts with other specifications such as IS-05 and BCP-002-01. He explains how IS-07 was used in the Techno Project at tpc, Zurich, and then takes us through a range of examples of how IS-07 can be used, including synchronisation of the GUI and monitoring, as well as routing based on GPI.
Adoption of SMPTE’s ST 2110 suite of standards for transport of professional media is growing, with more and more broadcasters choosing it for use within their facilities. Andy Rayner takes the stage at SMPTE 2019 to discuss the work being undertaken to manage using ST 2110 between facilities. He looks at how to manage the data out of the facility, the potential use of JPEG XS, timing and control.
Long-established practices of path protection and FEC are already catered for, with ST 2022-7 for seamless path protection and ST 2022-5 for FEC. New to ST 2110 is the ability to send the separate essences bundled together in a virtual trunk. This has the benefit of preventing the streams being split up during transport and hence potentially suffering different delays. It also helps with FEC efficiency and allows transport of other types of traffic.
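To make the seamless protection idea concrete, here is a simplified sketch of ST 2022-7 style hitless merging: two identical RTP streams arrive over different paths, and the receiver keeps the first copy of each sequence number, so a packet lost on one path is covered by the other. This ignores sequence-number wraparound and the real alignment buffers a receiver needs.

```python
def seamless_merge(arrivals):
    """Merge two redundant RTP streams, ST 2022-7 style (simplified).

    `arrivals` is an iterable of (path, seq, payload) tuples in arrival
    order across both paths. The first copy of each sequence number wins;
    the duplicate from the other path is silently dropped.
    """
    seen = set()
    for path, seq, payload in arrivals:
        if seq not in seen:
            seen.add(seq)
            yield seq, payload

# Each path loses a different packet, yet the merged output is complete.
arrivals = [("A", 1, "p1"), ("B", 1, "p1"),
            ("B", 2, "p2"),              # A's copy of 2 was lost
            ("A", 3, "p3"),              # B's copy of 3 was lost
            ("A", 4, "p4"), ("B", 4, "p4")]
merged = list(seamless_merge(arrivals))
```

The same first-copy-wins logic is why the switch is ‘seamless’: no retransmission or correction step is needed at the moment of loss.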
Timing is key for ST 2110, which is why it natively uses the Precision Time Protocol (PTP), formalised for use in broadcast under ST 2059. Andy highlights the problem of reconciling timing at the far end, but also the ‘missed opportunity’ that the timing will usually be regenerated, so the time of media ingest is lost. This may change over the next year.
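The reconciliation Andy describes rests on ST 2059-1’s rule that media clocks are phase-aligned to the SMPTE/PTP epoch, so any PTP-synchronised device can compute the expected RTP timestamp independently. A minimal sketch of that calculation, ignoring TAI/UTC offset handling:

```python
def rtp_timestamp(ptp_seconds: float, media_clock_hz: int) -> int:
    """Expected RTP timestamp for a given PTP time (ST 2059-1 style).

    Media clocks are modelled as having run since the epoch
    (1970-01-01T00:00:00 TAI), so the timestamp is simply the elapsed
    media-clock ticks modulo 2^32, the width of the RTP timestamp field.
    """
    return int(ptp_seconds * media_clock_hz) % (1 << 32)

# 10 s after the epoch at the 90 kHz video clock: 900,000 ticks.
ts_video = rtp_timestamp(10.0, 90_000)
# A 48 kHz audio clock wraps roughly once a day (2^32 / 48,000 s).
ts_audio = rtp_timestamp(89_479.0, 48_000)
```

Because sender and receiver derive the same value from PTP alone, a receiver can detect exactly how much delay a stream has accumulated, which is what makes the lost ingest time such a missed opportunity.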
The creation of ST 2110-22 brings compressed media into ST 2110 for the first time. Andy mentions that JPEG XS can be used, and is already being deployed. Control is the next topic, with Andy focussing on the secure sharing of NMOS IS-04 & IS-05 between facilities, covering registration, control and the security needed.
The talk ends with questions on FEC latency, RIST and the potential downsides of GRE trunking.
The Broadcast Knowledge exists to help individuals up-skill, whatever their starting point. Videos like this, which introduce a large number of topics, are far too rare. For those starting out or who need to revise a topic, this really hits the mark, particularly as it covers many new topics.
John Mailhot takes the lead on SMPTE 2110 explaining that it’s built on separate media (essence) flows. He covers how synchronisation is maintained and also gives an overview of the many parts of the SMPTE ST 2110 suite. He talks in more detail about the audio and metadata parts of the standard suite.
Eric Gsell discusses digital archiving and the considerations which come with deciding what formats to use. He explains colour space, the CIE model and the colour spaces we use such as 709, 2100 and P3 before turning to file formats. With the advent of HDR video and displays which can show bright video, Eric takes some time to explain why this could represent a problem for visual health as we don’t fully understand how the displays and the eye interact with this type of material. He finishes off by explaining the different ways of measuring the light output of displays and their standardisation.
Yvonne Thomas talks about the cloud, starting by explaining the difference between platform as a service (PaaS), infrastructure as a service (IaaS) and similar cloud terms. As cloud migrations are forecast to grow significantly, Yvonne looks at the drivers behind this and the benefits it can bring when used in the right way. Using the cloud, Yvonne shows, can be an opportunity for improving workflows and adding more feedback and iterative refinement into your products and infrastructure.
Looking at video deployments in the cloud, Yvonne introduces the video codecs AV1 and VVC, both, in their own way, successors to HEVC/H.265, as well as the two transport protocols SRT and RIST, which exist to reliably send video with low latency over lossy networks such as the internet. To learn more about these protocols, check out this popular talk on RIST by Merrick Ackermans and this SRT Overview.
Rounding off the primer is Linda Gedemer from Source Sound VR, who introduces immersive audio, measuring sound output (SPL) from speakers, and the interesting problem of forward speakers in cinemas. These have long sat behind the screen, which has meant the screens have to be perforated to let the sound through, and the perforation itself interferes with the sound. Now that cinema screens are changing to solid screens, not completely dissimilar to large outdoor video displays, the speakers are having to move. With them out of the line of sight, how can we keep the sound in the right place for the audience?
This video is a great summary of many of the key challenges in the industry and works well for beginners and those who just need to keep up.
French broadcast company M6 Group has recently moved to an all-IP workflow, employing the SMPTE ST 2110 suite of standards for professional media delivery over IP networks. The two main playout channels and the MCR have already been upgraded, and the next few channels will be transitioned to the new core soon.
The M6 system comprises equipment from five different vendors (Evertz, Tektronix, Harmonic, Ross and TSL), all managed and controlled using the AMWA NMOS IS-04 and IS-05 specifications. Such interoperability is an inherent feature of the SMPTE ST 2110 suite of standards, allowing customers to focus on the operational workflows and flexibility that IP brings them. Centralised management and configuration of the system is provided through web interfaces, which also allows for easy and automated addition of new equipment.
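As a flavour of what IS-05 control looks like on the wire, the sketch below builds the JSON body of a staged PATCH that connects a receiver to a sender and activates the route immediately. The endpoint path follows the IS-05 URL scheme, but the host, receiver ID, sender UUID and SDP content are deployment-specific placeholders.

```python
import json

def build_is05_patch(sender_id: str, sdp: str) -> str:
    """Build an IS-05 staged PATCH body for a receiver.

    Sent as, for example:
      PATCH /x-nmos/connection/v1.1/single/receivers/<receiver-id>/staged
    with Content-Type: application/json; the host and receiver id come
    from IS-04 discovery.
    """
    body = {
        "sender_id": sender_id,
        "master_enable": True,
        "activation": {"mode": "activate_immediate"},
        "transport_file": {"type": "application/sdp", "data": sdp},
    }
    return json.dumps(body)

# Placeholder sender UUID and a stand-in fragment of the sender's SDP file
patch = build_is05_patch("c72cca5b-01db-47aa-bb00-03893defbfae",
                         "v=0\r\no=- 0 0 IN IP4 192.0.2.1\r\n")
```

It is this small, vendor-neutral API surface that lets a controller from one vendor drive endpoints from four others in the same system.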
Thanks to software-defined orchestration and intuitive touch-screen interfaces, information such as source paths, link bandwidth and status, and device details can be quickly accessed via a web GUI. As the system is based on an IP network, it is possible to come in and out of the fabric numerous times without the cost implications you would have in the SDI world. Every point of the signal chain can be easily visualised, which enables broadcast engineers to maintain and configure the system with ease.