Why would you decode 12Gbps of data in order to monitor for an occasional SCTE switching signal or closed captions? SMPTE ST 2110 separates video, audio and metadata, freeing up workflows to be more flexible. But so far, SMPTE 2110 only defines how to carry the metadata used in SDI, as defined by SMPTE ST 291. The point of freeing up workflows is to allow innovation, so this talk looks at SMPTE ST 2110-41, which allows more types of data to be carried, time-synchronised with the other essences.
Paul Briscoe joins us at the podium to explain why we need to extend SMPTE ST 2110-40 beyond SDI-compatible metadata. He starts by explaining how ST 2110 was intended to work well with SDI. Since SDI already has a robust system for Ancillary Data (ANC Data), it made sense for 2110 to support that first in a 100% compatible way. One of the key characteristics of metadata is that it’s synchronised to the other media, and we expect decoders to be selective over what they act on, explains Paul. The problem he sees, however, is that if you wanted to send some XML, JSON or other data which has never been included in the SDI signal, there is no way to send it within 2110 and keep the synchronisation. To prove his point, Paul puts up the structure of ST 291 data as carried in 2110-40, which still has references to lines and horizontal offsets. Furthermore, he points out that future 2110 workflows aren’t going to need to be tied to SDI-compatible metadata.
The Fast Metadata (FMX) proposal is to allow arbitrary metadata streams which can be time-aligned with a media stream. It would provide a standardised encoding method, be extensible, minimise IP address use and retain the option of being interoperable with ST 291 if needed.
Having chosen the KLV (Key-Length-Value) data structure, which is well known and scalable, the proposal provides for this data to be delivered with a timestamp, and even delivered early so that processing can be done ahead of time. This opens the door to carrying XML, JSON and other data structures.
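To make the KLV idea concrete, here is a minimal sketch of wrapping an arbitrary payload as a Key-Length-Value triplet in the SMPTE style (16-byte key, BER-encoded length). The key value and the JSON payload below are purely illustrative, not taken from the talk or the standard:

```python
def ber_length(n: int) -> bytes:
    """Encode a length in BER: short form for lengths under 128,
    long form (0x80 | byte-count, then big-endian bytes) otherwise."""
    if n < 128:
        return bytes([n])
    body = n.to_bytes((n.bit_length() + 7) // 8, "big")
    return bytes([0x80 | len(body)]) + body

def klv_encode(key: bytes, value: bytes) -> bytes:
    """Wrap a payload as a Key-Length-Value triplet."""
    assert len(key) == 16, "SMPTE KLV uses a 16-byte Universal Label key"
    return key + ber_length(len(value)) + value

# Hypothetical key for illustration; real keys are SMPTE-registered
# Universal Labels starting 06 0E 2B 34.
KEY = bytes.fromhex("060E2B34" + "00" * 12)
payload = b'{"event": "splice", "id": 42}'  # e.g. a small JSON metadata item
packet = klv_encode(KEY, payload)
```

Because the length is carried explicitly, a receiver that doesn't recognise a key can skip the item cleanly, which is part of what makes KLV scalable.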
Paul explains how this has been implemented as an RTP stream, hence using RTP timestamps. Within the stream there are blobs of data. Paul explains the structure of a blob and how payloads which are smaller than a blob, as well as those which are larger, can be dealt with. Buffering and SDPs need to be supported; after all, this is a 2110 stream.
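The exact blob layout is defined in ST 2110-41 and isn't reproduced here, but the smaller/larger-than-a-blob handling is essentially segmentation and reassembly. A rough sketch, with illustrative function names and sizes:

```python
def fragment(payload: bytes, max_size: int) -> list[bytes]:
    """Split a payload into segments no larger than max_size.
    A payload that already fits yields a single segment."""
    if not payload:
        return [b""]
    return [payload[i:i + max_size] for i in range(0, len(payload), max_size)]

def reassemble(segments: list[bytes]) -> bytes:
    """Concatenate received segments back into the original payload."""
    return b"".join(segments)

# A 3000-byte metadata item split to fit typical RTP payload sizes.
data = bytes(range(256)) * 12  # 3072 bytes of sample data
parts = fragment(data, 1400)
```

In the real stream each segment would carry framing so the receiver knows where an item starts and ends; the RTP timestamp then ties the reassembled item to the same instant as the associated video, audio or other essence.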
After going into the detail of 2110-41, Paul explains that there is a 2110-42 standard which can carry technical metadata about the stream. It provides in-band messaging of important stream details, but is not intended to replace IS-04 or SDPs.