A lot of our time on this website is devoted to understanding the changes we are going through now, but we don’t adopt technology for the sake of it. Where’s this leading, and what work is underway now to forge our path? Whilst SMPTE ST 2110 and its associated specifications aren’t yet a mature technology in the sense that SDI is, we’re past the early-adopter phase and we can see which of the industry’s needs aren’t yet met.
Andy Rayner from Nevion is here to help us navigate the current technology space and understand the future he and Nevion envision. The beginning of the video shows the big change in process from the workflows of the 90s, when the TV station moved to sports events, to today, when we bring the event to the broadcaster: a light connectivity truck turns up and deploys cameras at the venue, leaving most people either at home or back at base doing the production there. Andy has been involved in a number of implementations enabling this, such as at Discovery’s Eurosport, where the media processing is done in two locations separate from the production rooms around Europe.
Generalising from the Discovery case study, Andy shows a vision of how many companies will evolve their workflows, which includes using 5G, public and private clouds as appropriate, and even control surfaces at home. To get there, Andy lays out the work within AMWA and SMPTE creating the specifications and standards that we need. He then shows how, with the increasing use of IT in live production, the already IT-based NLE workflows are able to integrate much better.
Looking to the future, Andy explains the work ongoing to specify a standard way of getting video into and out of the cloud, including specifying a way of carrying 2110 on the WAN, helping RIST and formalising the use of JPEG XS. Andy anticipates a more standardised future in which a best-of-breed system is possible down to individual logical components, so that functions like ‘video keyer’ and ‘logo insertion’ could be provided by separate software yet seamlessly integrate. Lastly, Andy promises us that work is underway to improve timing within 2110 and 2110-associated workflows.
Timing is both everything and nothing. Although much fuss is made of timing, often it’s not important. But when it is important, it can be absolutely critical. Helping us navigate the broadcast chain’s varying dependence on a central, co-ordinated time source is Nevion’s Andy Rayner in this talk at the VSF’s VidTrans21. When it comes down to it, you need time for coordination: in the 1840s, the UK introduced ‘railway time’, bringing each station’s clock into line with GMT to coordinate people and trains.
For broadcast, working with multiple signals in a low-latency workflow, such as in a vision or audio mixer, is when we’re most likely to need synchronisation. Andy shows us some of the original television technology, where the camera had to be directly synchronised to the display. This is the era timing came from, built on by analogue video and RF transmission systems whose components’ timing relied on those earlier in the chain. Andy brings us into the digital world, reminding us of the ever-useful blanking areas of the video raster, which we packed with non-video data. Now, as many people move to SMPTE’s ST 2110, there is still a timing legacy: some devices still generate data with gaps where the blanking of the video would be, even though 2110 has no blanking. This means we have to have timing modes for both gapped and linear delivery of video.
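To make the gapped-versus-linear distinction concrete, here is a minimal sketch of the two packet-pacing schedules: linear pacing spreads a frame’s packets evenly across the whole frame period, while gapped pacing squeezes them into the time the active lines would occupy, leaving silence where blanking used to be. The packet count below is an illustrative figure, not taken from the specification.

```python
# Sketch: per-packet spacing under the two ST 2110-21 pacing models
# for a 1080p59.94 stream. PACKETS_PER_FRAME is an assumed round number
# for illustration only.
FRAME_RATE = 60000 / 1001          # 59.94 Hz
PACKETS_PER_FRAME = 4000           # illustrative, not from the spec
ACTIVE_FRACTION = 1080 / 1125      # active lines / total lines for 1080p

frame_period_s = 1 / FRAME_RATE    # ~16.68 ms

# Linear: packets evenly spaced over the whole frame period.
linear_spacing_us = frame_period_s / PACKETS_PER_FRAME * 1e6

# Gapped: the same packets compressed into the 'active' portion,
# so each packet follows the previous one slightly sooner.
gapped_spacing_us = frame_period_s * ACTIVE_FRACTION / PACKETS_PER_FRAME * 1e6
```

The gapped sender is burstier (tighter spacing during active time, then a pause), which is exactly the legacy of the raster that the talk describes.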
In ST 2110 every packet is marked with a reduced-resolution timestamp derived from PTP, the Precision Time Protocol. This allows highly accurate alignment of essences when bringing them together, as even a slight offset between audio channels can create comb filtering and destroy the sound. The idea of the PTP timestamp is to record the time the source was acquired. But Andy laments that in ST 2110 it’s hard to keep this timestamp, since interim functions (e.g. graphics generators) may restamp the PTP, breaking the association.
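The ‘reduced resolution’ comes from mapping the full PTP time onto a 32-bit RTP timestamp at the media clock rate. A minimal sketch, assuming the usual 90 kHz video and 48 kHz audio media clocks; the sample time value is arbitrary:

```python
# Sketch: deriving a 32-bit RTP timestamp from full-resolution PTP time,
# i.e. timestamp = (PTP seconds x media clock rate) mod 2^32.
VIDEO_CLOCK_HZ = 90_000   # RTP media clock for ST 2110 video
AUDIO_CLOCK_HZ = 48_000   # common RTP media clock for ST 2110 audio

def rtp_timestamp(ptp_seconds: float, clock_hz: int) -> int:
    """Reduce a PTP capture time to a 32-bit RTP timestamp."""
    return int(ptp_seconds * clock_hz) % 2**32

# Two essences stamped at the same acquisition instant carry timestamps
# that map back to the same moment, which is what makes alignment possible.
capture_time = 1_000.5          # illustrative PTP (TAI) seconds
video_ts = rtp_timestamp(capture_time, VIDEO_CLOCK_HZ)
audio_ts = rtp_timestamp(capture_time, AUDIO_CLOCK_HZ)
```

Because the counter wraps every 2^32 ticks, a receiver only recovers relative time from it; if an interim device restamps with its own output time, the link back to the original acquisition instant is lost, which is the problem Andy describes.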
Taking a step back, though, content can now be delivered to the home up to a minute late, which underlines that relative timing is what’s most important. This is a lesson learnt many years back, when VR/AR was first being used in studios and whole sections of the gallery ran several frames behind the rest of the facility to account for the processing delay. Today this is more common, as is remote production, which takes this fixed time offset to the next level. Andy highlights NMOS IS-07, which allows you to timestamp button presses and other tally info, allowing this type of time-offset working to succeed.
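IS-07 carries such events as small JSON messages whose creation timestamp is expressed in TAI seconds and nanoseconds. A hedged sketch of what a timestamped button press might look like, loosely following the IS-07 data model; the ids are made up, and system time stands in for a PTP-disciplined clock:

```python
import json
import time

def tai_timestamp() -> str:
    """Format a timestamp in the '<seconds>:<nanoseconds>' style used by
    IS-07. Real deployments use PTP-derived TAI; system time stands in here."""
    ns = time.time_ns()
    return f"{ns // 1_000_000_000}:{ns % 1_000_000_000}"

def button_press_event(source_id: str, flow_id: str, pressed: bool) -> str:
    # Illustrative shape of an IS-07 boolean state message. Field names
    # follow the AMWA IS-07 data model; the ids are hypothetical.
    event = {
        "message_type": "state",
        "identity": {"source_id": source_id, "flow_id": flow_id},
        "event_type": "boolean",
        "timing": {"creation_timestamp": tai_timestamp()},
        "payload": {"value": pressed},
    }
    return json.dumps(event)
```

Because the press is stamped at creation rather than at receipt, a downstream system running several frames behind can apply it at the correct point in its own delayed timeline, which is what makes the time-offset working described above succeed.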
The talk finishes with the work of the GCCG (Ground-Cloud-Cloud-Ground) Activity Group at the VSF, of which Andy is the co-chair. This group is looking at how to get essences into and out of the cloud. Andy spends some time talking about the tests done to date and the fact that PTP doesn’t exist in the cloud (though it may be available for select customers). In fact, you may have to live with NTP-derived time. Dealing with this is still a lively discussion in progress, and Andy welcomes participants.
Co-Chair, Ground-Cloud-Cloud-Ground Activity Group, VSF
Chief Technologist, Nevion
Views and opinions expressed on this website are those of the author(s) and do not necessarily reflect those of SMPTE or SMPTE Members.
This website is presented for informational purposes only. Any reference to specific companies, products or services does not represent promotion, recommendation, or endorsement by SMPTE.