For a long time now, broadcasters have been using dark fibre and CWDM (Coarse Wavelength Division Multiplexing) to transmit multiple SDI feeds to and from remote sites. As an analogue process, WDM is based on the concept of Frequency Division Multiplexing (FDM): the bandwidth of a fibre is divided into multiple channels, each occupying a part of the overall frequency spectrum. Each channel operates at a different frequency and therefore a different optical wavelength. All these wavelengths (i.e. colours) of laser light are combined and separated using passive prisms and optical filters.
In this presentation, Roy Folkman from Embrionix shows the advantages of moving from CWDM technology to a real-time media-over-IP system, using a recent project for CPAC (Cable Public Affairs Channel) in Canada as an example. The scope of this project was to replace an aging CWDM system connecting government buildings and the CPAC studios, which could carry 8 SDI signals in each direction over a single dark fibre pair. The first idea was to use a newer CWDM system allowing up to 18 SDI signals, but it quickly became apparent that an IP system could be implemented at a similar cost.
As this was an SDI replacement, SMPTE ST 2022-6 was used in this project, with an upgrade path to ST 2110 possible. Roy explains that, from CPAC's point of view, using ST 2022-6 was a comfortable first step into real-time media-over-IP which allowed for cost reduction and simplification (no PTP generation and distribution required, and re-use of existing SDI frame syncs and routing with audio breakaway capability). The benefits of using IP were increased capacity, integrated routing (in-band control) and ease of future expansion.
A single 1RU 48-port switch on each side and a single dark fibre pair gave the system a capacity of 48 HD SDI signals in each direction. SFP gateways in small Embrionix enclosures were used to convert the SDI outputs of cameras to IP over fibre, which also allowed the distance between the cameras and the switch to be extended beyond the SDI cabling limit of 100 metres. SFP gateway modules converting IP back to SDI were installed directly in the switches at both sites.
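A back-of-the-envelope check makes the capacity gain concrete. The sketch below is illustrative only: the link speeds, headroom factor and encapsulation overhead are assumptions, not figures quoted in the presentation.

```python
# Rough capacity check: how many ST 2022-6 HD flows fit on one link.
# Link rates and the ~2% encapsulation overhead are assumptions.

HD_SDI_GBPS = 1.485      # nominal HD-SDI line rate
ENCAP_OVERHEAD = 0.02    # assumed RTP/UDP/IP/Ethernet header overhead

def flows_per_link(link_gbps, headroom=0.9):
    """Whole number of HD flows per link, keeping some headroom free."""
    per_flow = HD_SDI_GBPS * (1 + ENCAP_OVERHEAD)
    return int((link_gbps * headroom) // per_flow)

print(flows_per_link(100))   # a 100 GbE uplink comfortably carries 48 flows
print(flows_per_link(10))    # a 10 GbE link carries only a handful
```

This also shows why a single fast uplink replaces many CWDM wavelengths: the whole payload becomes ordinary Ethernet traffic that one link can aggregate.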
Roy finishes his presentation with possible future expansions of the system, such as migration to ST 2110 (a firmware upgrade for the SFP modules), increased capacity (by adding dark fibres and switches), SDI and IP routing integration under a unified control system (NMOS), remote camera control, and the addition of processing functions to the SFP modules (multiviewers, up/down/cross-conversion, compression).
The SMPTE ST 2110-40 standard specifies the real-time RTP transport of SMPTE ST 291-1 Ancillary Data packets. It allows the creation of IP essence flows carrying the VANC data familiar to us from SDI (such as AFD, closed captions or ad triggering), complementing the existing video and audio portions of the SMPTE ST 2110 suite.
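Every ST 2110-40 packet begins with the standard 12-byte RTP fixed header (RFC 3550). A minimal sketch of packing and unpacking that header is below; the ancillary-data payload itself (defined in RFC 8331) is bit-packed and omitted here for brevity, and the field values used are arbitrary examples.

```python
import struct

def pack_rtp_header(pt, seq, timestamp, ssrc, marker=0):
    """Build the 12-byte RTP fixed header (RFC 3550)."""
    b0 = 2 << 6                        # version 2, no padding/extension/CSRC
    b1 = (marker << 7) | (pt & 0x7F)   # marker bit + 7-bit payload type
    return struct.pack("!BBHII", b0, b1, seq, timestamp, ssrc)

def unpack_rtp_header(data):
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", data[:12])
    return {"version": b0 >> 6, "marker": b1 >> 7,
            "payload_type": b1 & 0x7F, "seq": seq,
            "timestamp": ts, "ssrc": ssrc}

hdr = pack_rtp_header(pt=100, seq=42, timestamp=90000, ssrc=0xDEADBEEF)
print(unpack_rtp_header(hdr))
```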
This presentation, by Bill McLaughlin from EEG, is an updated tutorial on subtitling, closed captioning, and other ancillary data workflows using the ST 2110-40 standard. Topics include synchronization, merging of data from different sources and standards conversion.
Building on Bill’s previous presentation at the IP Showcase, this talk from NAB 2019 demonstrates a big increase in the number of vendors supporting the ST 2110-40 standard. Previously, a generic packet analyser such as Wireshark with a suitable dissector was recommended for troubleshooting IP ancillary data, but now most leading multiviewer and analyser products can display captioning, subtitling and timecode from 2110-40 streams. At the recent “JT-NM Tested Program” event, 29 products passed 2110-40 reception validation. Moreover, 27 products passed 2110-40 transmitter validation, which means that their output can be reconstructed into SDI video signals with appropriate timing and then decoded correctly.
Bill points out that ST 2110-40 is not really a new standard at this point; it only defines how to carry the traditional ancillary data payloads over IP. Special care needs to be taken when different VANC data packets are concatenated in the IP domain. Many existing devices are simple ST 2110-40 receivers, which would require a kind of “VANC funnel” to create a combined stream of all the relevant ancillary data, making sure that line numbers and packet types don’t conflict, especially when signals need to be converted back to SDI.
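The "VANC funnel" idea can be sketched as a merge that rejects conflicting placements. This is a toy model with simplified packet fields, not any vendor's implementation; the DID/SDID values shown are the registered ones for CEA-708 captions and ancillary timecode.

```python
# Toy "VANC funnel": merge ancillary packets from several ST 2110-40
# senders, rejecting packets whose (line, DID, SDID) placement collides.

def merge_anc(streams):
    merged, seen = [], set()
    for stream in streams:
        for pkt in stream:
            key = (pkt["line"], pkt["did"], pkt["sdid"])
            if key in seen:
                raise ValueError(f"conflicting ANC placement: {key}")
            seen.add(key)
            merged.append(pkt)
    # order by line number so an SDI embedder can place packets predictably
    return sorted(merged, key=lambda p: p["line"])

captions = [{"line": 9,  "did": 0x61, "sdid": 0x01, "data": b"..."}]
timecode = [{"line": 10, "did": 0x60, "sdid": 0x60, "data": b"..."}]
print(merge_anc([captions, timecode]))
```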
There is a new ST 2110-41 standard being developed for additional ancillary data which does not match up with the ancillary data standardised in ST 291-1. Another idea discussed is to move away from the SDI VANC data format and use a TTML track (Timed Text Markup Language – textual information associated with timing information) to carry ancillary information.
With all the talk of IP, you’d be wrong to think SDI is dead. 12G for 4K is alive and well in many places, so there’s plenty of appetite to understand how it works and how to diagnose problems.
In this double-header, Steve Holmes from Tektronix takes us through the ins and outs of SDI for UHD and also HDR at the SMPTE SF section.
Steve starts with his eye on the SMPTE standards for UHD SDI video, looking at video resolutions and noting that a UHD picture can be made up of 4 HD pictures, which gives rise to two well-known formats: ‘Quad split’ and ‘2SI’ (2-Sample Interleave).
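The difference between the two mappings can be shown in a few lines. This is an illustrative sketch of which of the four HD sub-images each UHD pixel lands in; the exact link numbering and sample phases in SMPTE ST 425-5 differ in detail.

```python
# Which of the four HD sub-images (0-3) does a UHD pixel belong to?

def quad_split_link(x, y, width=3840, height=2160):
    """Quad split: each sub-image is one quadrant of the picture."""
    return (2 if y >= height // 2 else 0) + (1 if x >= width // 2 else 0)

def two_si_link(x, y):
    """2SI: each sub-image takes every other sample in both dimensions."""
    return (y % 2) * 2 + (x % 2)

# The top-left pixel goes to sub-image 0 in both schemes...
print(quad_split_link(0, 0), two_si_link(0, 0))
# ...but a pixel in the bottom-right quadrant differs:
print(quad_split_link(3000, 2000), two_si_link(3000, 2000))
```

The practical consequence is visible on a monitor: losing one link of a quad split blanks a quarter of the screen, whereas losing one 2SI link degrades the whole picture evenly.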
Colour is the next focus, with a discussion of the different colour spaces that UHD is delivered in (spoiler: they’re all in use), what these look like on the vectorscope, and a look at the different primaries. Finishing up with a roundup and a look at inter-link timing, there’s a short break before hitting the next topic: HDR.
High Dynamic Range is an important technology which is still gaining adoption and is often provided with 4K programmes. Steve defines the two places HDR is important – the acquisition and the display of the video – then provides a handy lookup table of terms such as HDR, WCG, PQ, HDR10, DMCVT and more.
Steve gives us a primer on what HDR is in terms of brightness, measured in ‘nits’, how these levels relate to real life and how we talk about them with respect to displays. We then look at HDR on the waveform monitor and at features, such as false colour, which allow engineers to visualise and check HDR.
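The PQ transfer function mentioned above ties code values to absolute brightness in nits. A minimal sketch, using the constants published in SMPTE ST 2084:

```python
# PQ (SMPTE ST 2084): normalised code value in [0, 1] <-> luminance in nits.

m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(n):
    """Code value n in [0, 1] -> display luminance in nits (0..10000)."""
    p = n ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def pq_inverse_eotf(nits):
    """Display luminance in nits -> code value in [0, 1]."""
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

print(round(pq_eotf(1.0)))             # full code value = 10000 nits
print(round(pq_inverse_eotf(100), 2))  # 100-nit SDR white sits near mid-scale
```

This is why PQ is so efficient for HDR: roughly half the code range covers the 0–100 nit region where most picture content lives.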
The topic of gamma, EOTFs and colour spaces comes up next and is well explained, building on what came earlier. Before the final demo and Q&A, Steve talks about the different ways to grade pictures when working in HDR.
To the uninitiated, it’s not obvious how to send video over IP, which things are important to think about, or how close it is to an analogue/SDI signal. Fortunately, Ed Calverly has produced this excellent tutorial on the basics needed to understand uncompressed video across the board.
This presentation from the IBC 2018 IP Showcase examines the need for timing, gives a reminder of what ‘blanking’ is, and shows how it is treated in the over-IP world. A discussion of blanking wouldn’t be complete without a discussion of ancillary data (VANC, HANC, DPI, embedded audio etc.). Whilst blanking was essential in analogue video and is filled with data in SDI, there is a benefit in breaking the signal up into its component parts – video, audio and ancillary data – not least removing up to 30% of dead space: blanking takes bitrate!
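The "up to 30%" figure can be checked with simple arithmetic, comparing the active picture against the full SDI raster (total sample and line counts from SMPTE ST 274):

```python
# Fraction of an HD-SDI raster that is blanking rather than active picture.

def blanking_fraction(total_w, total_h, active_w=1920, active_h=1080):
    return 1 - (active_w * active_h) / (total_w * total_h)

# 1080p50 raster: 2640 samples per line x 1125 lines
print(f"{blanking_fraction(2640, 1125):.1%}")   # ~30% blanking
# 1080p59.94 raster: 2200 samples per line x 1125 lines
print(f"{blanking_fraction(2200, 1125):.1%}")   # ~16% blanking
```

So the 30% saving applies to the 50 Hz rasters, whose lines carry more blanking samples; 59.94 Hz formats save somewhat less.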
Now that Ed has established the key points of the video signal which need to be transported, and how and where they exist, it’s time to look at how to actually get the data onto the network. To do this, Ed presents a very accessible explanation of IP, discussing how we can split any message up into packets and how we add headers to those packets to ensure they go to the right place. This leads on to a discussion of UDP and TCP, both ways of launching traffic onto a network, each with its own pros and cons.
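The split-into-packets idea Ed describes can be sketched in a few lines. This is a toy model – the "header" is just a sequence number – but it shows why headers matter: they let the receiver rebuild the message even when the network reorders packets.

```python
# Chop a message into packets, prepend a sequence number, and reassemble
# at the far end even if the network delivers the packets out of order.

def packetise(message: bytes, size: int):
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [(seq, chunk) for seq, chunk in enumerate(chunks)]

def reassemble(packets):
    return b"".join(chunk for _, chunk in sorted(packets))

pkts = packetise(b"uncompressed video over IP", 5)
pkts.reverse()              # simulate out-of-order arrival
print(reassemble(pkts))     # the original message is restored
```

TCP builds this reordering and retransmission into the protocol itself; with UDP, as used for real-time media, any such recovery is left to the application.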
This builds into an examination of subnets, routing and multicast. Whilst these sound fairly academic – and to be clear, they can be – they are also essential to a well-founded understanding of the topic and are useful day-to-day when working with SMPTE ST 2110 and SMPTE ST 2022-6 systems. Both of these standards are also explained by Ed, along with a comparison of SDI timing (usually black and burst or tri-level sync) and the PTP timing used for IP systems. For more detail on PTP, have a look at this talk, or this one, also from the IP Showcase.
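One concrete detail behind multicast is how an IPv4 group address maps onto an Ethernet MAC address – the fixed prefix 01:00:5e followed by the low 23 bits of the IP – which is how switches and NICs filter multicast frames at layer 2. A small sketch:

```python
# Map an IPv4 multicast group address to its Ethernet MAC address.

def multicast_mac(group_ip: str) -> str:
    octets = [int(o) for o in group_ip.split(".")]
    assert 224 <= octets[0] <= 239, "not an IPv4 multicast address"
    # 01:00:5e prefix, then the low 23 bits of the IP address
    mac = [0x01, 0x00, 0x5E, octets[1] & 0x7F, octets[2], octets[3]]
    return ":".join(f"{b:02x}" for b in mac)

print(multicast_mac("239.1.1.1"))     # 01:00:5e:01:01:01
# Only 23 bits carry over, so 32 different groups share each MAC:
print(multicast_mac("239.129.1.1"))   # same MAC as 239.1.1.1
```

That 32-to-1 overlap is one reason media networks plan their multicast addressing carefully.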
Wrapping up with the important topic of packet timing, known as ‘traffic shaping’, we see how important it is to ensure that packets are equally spaced, to avoid problems with buffers on receiving equipment or even within the network itself.
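To see the scale involved, here is the inter-packet gap for a single ST 2022-6 HD flow. The 1376-byte media payload per datagram comes from ST 2022-6; treat the figures as a rough sketch since header overhead is ignored.

```python
# Inter-packet gap for one ST 2022-6 HD-SDI flow.

HD_SDI_BPS = 1.485e9     # nominal HD-SDI line rate
PAYLOAD_BYTES = 1376     # media payload per ST 2022-6 datagram

packets_per_second = HD_SDI_BPS / (PAYLOAD_BYTES * 8)
gap_us = 1e6 / packets_per_second

print(f"{packets_per_second:,.0f} packets per second")
print(f"{gap_us:.2f} microseconds between packets")
```

With only a handful of microseconds between packets, even small bursts from an unshaped sender can overrun a shallow receive buffer, which is exactly why the spacing matters.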