Video: 5 PTP Implementation Challenges & Best Practices

PTP is the underlying technology that enables the whole SMPTE ST 2110 uncompressed ecosystem to work. Using PTP, the Precision Time Protocol, the time at which a frame of video, audio or other essence was captured is recorded, so that when decoded it can be synchronised with other media captured at the same time. Though parts of 2110 can function without it, when it comes to bringing together media which need synchronisation, vision mixing for instance, PTP is the way to go.

PTP is actually a standard for time distribution which, like its forerunner NTP, is a cross-industry standard; PTP itself was developed by the IEEE. Now on version IEEE 1588-2019, it defines not only how to send time onto a network, but also how a receiver can work out what the time actually is. After all, if you had a letter in the post telling you the time, you’d know that time – and date for that matter – was old. PTP defines a way of working out how long the letter took to arrive so that you can know the date and time based on the letter and your new-found knowledge of the delivery time.
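The ‘letter in the post’ analogy maps onto PTP’s delay-request/response exchange. A minimal sketch of the arithmetic, with illustrative timestamps (the variable names `t1`–`t4` follow the usual textbook convention, not any specific implementation):

```python
# Sketch of the offset/delay estimate PTP makes from a timestamp exchange.
# Assumes the network path is symmetric (forward delay == return delay).

def ptp_offset_and_delay(t1, t2, t3, t4):
    """t1: master sends Sync, t2: slave receives it,
    t3: slave sends Delay_Req, t4: master receives it."""
    offset = ((t2 - t1) - (t4 - t3)) / 2   # how far the slave clock is ahead
    delay = ((t2 - t1) + (t4 - t3)) / 2    # one-way path delay
    return offset, delay

# Example: slave clock runs 5 units fast, one-way delay is 3 units.
offset, delay = ptp_offset_and_delay(100, 108, 120, 118)
print(offset, delay)  # → 5.0 3.0
```

With the offset known, the receiver can correct its local clock; with the delay known, it can judge how stale any received timestamp is.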

Knowing the time of day is all very well, but to truly synchronise media, SMPTE ST 2059 is used to interpret PTP for professional media. Video and audio are made from repeating data structures. 2059 relates these repeating data structures back to a common time in the past so that at any time in the future, you can calculate the phase of the signal.
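The ‘common time in the past’ idea can be sketched in a few lines. This is an illustrative simplification, assuming the ST 2059-1 epoch and ignoring leap-second and alignment subtleties defined in the standard itself:

```python
from fractions import Fraction

# Sketch of the ST 2059 idea: count how many cycles of a periodic signal
# have elapsed since the epoch (1970-01-01 TAI); the fractional part of
# that count is the signal's phase at the given PTP time.

def signal_phase(ptp_seconds, rate):
    """Fraction of the current cycle elapsed at ptp_seconds."""
    cycles = Fraction(ptp_seconds) * rate   # cycles since the epoch
    return cycles - int(cycles)             # fractional part = phase

# 29.97 Hz video (30000/1001) ten seconds after the epoch:
phase = signal_phase(10, Fraction(30000, 1001))
print(float(phase))  # ~0.7003 of the way through a frame
```

Exact rational arithmetic matters here: non-integer frame rates like 30000/1001 never land cleanly on second boundaries, so floating-point accumulation would drift.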

Karl Kuhn from Tektronix starts by laying out the problems to be solved, such as managing jitter and the precision needed. This leads into a look at how timestamps are used to make a note of when, separately, video and audio were captured. He also covers the network needed to implement PTP, particularly for redundancy, and how GPS allows buildings to be co-timed without being connected.

Troubleshooting PTP will be tricky for many, and learning the IT side of this is only part of the solution. Karl looks at some best practices and tips on faultfinding PTP errors, which leads on to a discussion of PTP domains and profiles. An important aspect of PTP is that it is bi-directional; it is much more than the one-way distribution of a signal like the previous black and burst infrastructure. It is a system which needs to be managed and deserves to be monitored. Karl shows how graphs can help show the stability of the network and how RTP/CC errors can reveal network packet losses and corruptions.

Watch now!
Speakers

Karl J. Kuhn
Principal Solutions Architect
Telestream/Tektronix

Video: The Basics of SMPTE ST 2110 in 60 Minutes

SMPTE ST 2110 is a growing suite of standards detailing uncompressed media transport over networks. Now at 8 documents, it’s far more than just ‘video over IP’. This talk looks at the new ways that video can be transported, deals with PTP timing and the creation of ‘SDPs’, and is a thorough look at all the documents.

Building on this talk from Ed Calverley which explains how we can use networks to carry uncompressed video, Wes Simpson goes through all the parts of the ST 2110 suite explaining how they work and interoperate as part of the IP Showcase at NAB 2019.

Wes starts by highlighting the new parts of 2110, namely the overview document which gives a high-level overview of all the standards documents, the addition of constant bit-rate compressed video carriage, and the recommended practice document for splitting a single video and sending it over multiple links; the latter two are detailed later in the talk.

SMPTE ST 2110 is fundamentally different, as highlighted next, in that it splits up all the separate parts of the signal (i.e. video, audio and metadata) so they can be transferred and processed separately. This is a great advantage in terms of reading metadata without having to ingest large amounts of video meaning that the networking and processing requirements are much lighter than they would otherwise be. However, when essences are separated, putting them back together without any synchronisation issues is tricky.

ST 2110-10 deals with timing and knowing which packets of one essence are associated with packets of another essence at any particular point in time. It does this with PTP, which is detailed in IEEE 1588 and also in SMPTE ST 2059-2. Two standards are needed to make this work: the IEEE defined how to derive and carry timing over the network, and SMPTE then detailed how to match PTP times to the phases of media. Wes highlights that care is needed when using PTP with AES67, as the audio standard requires specific timing parameters.

The next section moves into the video portion of 2110, dealing with video encapsulation on the network, pixel grouping and the headers needed for the packets. Wes then spends some time walking us through calculating the bitrate of a stream. Whilst for most people a look-up table of standard formats would suffice, understanding how to calculate the throughput develops a very good understanding of the way 2110 is carried on the wire, as you have to take note not only of the video itself (4:2:2 10-bit, for instance) but also the pixel groupings and the UDP, RTP and IP headers.
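The calculation Wes walks through can be sketched roughly as follows. The payload size and per-packet header overhead used here are illustrative assumptions, not figures from the talk:

```python
# Rough bitrate estimate for an ST 2110-20 stream: pixels -> pgroups ->
# packets -> headers. In 4:2:2 10-bit, 2 pixels pack into a 5-byte pgroup.
# payload_bytes and overhead_bytes (IP + UDP + RTP + 2110 payload headers)
# are illustrative assumptions.

def st2110_20_bitrate(width, height, fps,
                      bytes_per_pgroup=5, pixels_per_pgroup=2,
                      payload_bytes=1200, overhead_bytes=54):
    frame_bytes = width * height // pixels_per_pgroup * bytes_per_pgroup
    packets = -(-frame_bytes // payload_bytes)        # ceiling division
    total_bytes = frame_bytes + packets * overhead_bytes
    return total_bytes * 8 * fps                      # bits per second

print(st2110_20_bitrate(1920, 1080, 50) / 1e9, "Gbit/s")  # ~2.17 Gbit/s
```

Note this counts only active pixels; real streams also depend on the exact packing mode and packet size chosen, which is why the per-format tables exist.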

Timing of packets on the wire isn’t anything new (it is also important for compressed applications), but it is just as important here to ensure that packets are properly paced on the wire. That is to say, if you need to send 10 packets, you send them one at a time with equal time between them, not all at once right next to each other. Such ‘micro-bursting’ can cause problems not only for the receiver, which then needs larger buffers, but also, when mixed with other streams on the network, it can affect the efficiency of routers and switches, leading to jitter and possibly dropped packets. 2110-21 sets standards governing the timing of network pacing for all of the 2110 suite.
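A toy illustration of the pacing idea (real 2110 senders do this in hardware or with precise kernel timers; the sleep-based loop below is only a sketch):

```python
import time

# Space packets evenly across the frame period instead of bursting them.
# Scheduling each packet against the start time avoids accumulating drift.

def paced_send(packets, frame_period_s, send=lambda p: None):
    interval = frame_period_s / len(packets)
    start = time.monotonic()
    for i, pkt in enumerate(packets):
        delay = (start + i * interval) - time.monotonic()
        if delay > 0:
            time.sleep(delay)   # wait for this packet's scheduled slot
        send(pkt)

sent = []
paced_send([b"pkt"] * 10, 0.02, send=sent.append)  # 10 packets over 20 ms
print(len(sent))  # → 10
```

The difference between this and sending all 10 packets back-to-back is exactly the ‘gapped’ versus ‘bursty’ sender behaviour that 2110-21 characterises.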

Referring back to his earlier warning regarding timing and AES67, Wes now goes into detail on the 2110-30 standard, which describes the use of audio in these uncompressed workflows. He explains how the sample rates and packet times relate to the ability to carry multiple audio channels, with some configurations allowing 64 channels in one stream rather than the typical 8.
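The channel-count trade-off comes down to fitting samples into one packet. The arithmetic below is illustrative: the 1440-byte payload budget is an assumption (roughly what fits a standard 1500-byte MTU), and ST 2110-30’s conformance levels then cap the usable counts at 8 and 64 channels respectively:

```python
# Why shorter packet times allow more channels: with 24-bit (L24) samples,
# channels * samples-per-packet * 3 bytes must fit the payload budget.
# payload_limit is an assumed figure; 2110-30 conformance levels apply
# stricter caps (8 channels at 1 ms, 64 at 125 us) on top of this.

def max_channels(sample_rate, packet_time_s,
                 bytes_per_sample=3, payload_limit=1440):
    samples_per_packet = round(sample_rate * packet_time_s)
    return payload_limit // (samples_per_packet * bytes_per_sample)

print(max_channels(48000, 0.001))      # 1 ms packets, 48 samples → 10
print(max_channels(48000, 0.000125))   # 125 us packets, 6 samples → 80
```

Fewer samples per packet means less payload per channel, so more channels fit; the cost is eight times as many packets per second.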

‘Essences’, rather than ‘media’, is a word often heard when talking about 2110. This is an acknowledgement that metadata is just as important as the audio and video it describes. It’s sent separately as described by 2110-40. Wes explains how captions/subtitles, ad triggers, timecode and more can be encapsulated in the stream as ancillary ‘ANC’ packets.

2110-22 is an exciting new addition as it enables the use of compressed video such as VC-2 and JPEG XS, ultra-low-latency codecs which allow the video stream’s bitrate to be reduced by half, a quarter or more. As described in this talk, the ability to create workflows on a single IP infrastructure, seamlessly moving into and out of compressed video, is enabling remote production across countries, allowing equipment to be centralised with people and control surfaces elsewhere.

Noted as ‘forthcoming’ by Wes, but having since been published, is RP 2110-23 which adds back in a feature that was lost when migrating from 2022-6 into 2110 – the ability to send a UHD feed as 4x HD feeds. This can be useful to allow for UHD to be used as a production format but for multiviewers to only need to work in HD mode for monitoring. Wes explains the different modes available. The talk finishes by looking at RTP timestamps and SDPs.
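The RTP timestamps Wes closes with tie back to PTP. A hedged sketch of the common derivation, assuming the usual 90 kHz video media clock counted from the ST 2059 epoch:

```python
# An ST 2110 video RTP timestamp is the 90 kHz media clock count since
# the epoch, carried modulo 2**32 in the 32-bit RTP header field.

def rtp_timestamp(ptp_seconds, media_clock_hz=90000):
    return int(ptp_seconds * media_clock_hz) % (2 ** 32)

# One hour after the epoch on the 90 kHz video clock:
print(rtp_timestamp(3600))  # → 324000000
```

Because every sender derives its timestamps from the same PTP-disciplined clock, a receiver can align packets from different essences simply by comparing these values, which is the whole point of the 2110 timing model.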

Watch now!
The slides for this talk are available here
Speakers

Wes Simpson
President,
Telecom Product Consulting

Video: BBC Cardiff Central Square – Update

It’s being closely watched throughout the industry: a long-in-the-making project to deploy SMPTE ST 2110 throughout a fully green-field development. Its failure would be a big setback for the push to a completely network-based broadcast workflow.

The BBC Cardiff Central Square project is nearing completion now and is a great example of the early-adopter approach to bringing cutting-edge, complex, large-scale projects to market. They chose a single principal vendor so that they could work closely in partnership at a time when the market for ST 2110 was very sparse. This gave them leverage over the product roadmap and allowed for the tight integration which would be required to bring this project to market.

Nowadays, the market for ST 2110 products continues to mature and, whilst it still has quite a way to go, it has also come a long way in the past four years. Companies embarking on similar projects now have a better choice of products, and some may now feel they can start to pick ‘best of breed’ rather than taking the BBC approach. Whichever approach is taken, there is still a lot to be gained by following and learning from the mistakes and successes of others. Fortunately, Mark Patrick, Lead Architect on the project, is here to provide an update on the project.

Mark starts by giving an overview of the project, its scale and its aims. He outlines the opportunities and challenges it presents and the key achievements and milestones passed to date.

Live IP has benefits and risks. Mark takes some time to explain the benefits of the flexibility and the increasingly lower cost of the infrastructure, and weighs them against the risks, which include the continually developing standards and skills challenges.

The progress overview names Grass Valley as the main vendor, with control via BNCS having been designed and virtualised, the ST 2110 network topology deployed, and the final commissioning and acceptance testing now in progress.

The media topology for the system uses the principle of an A and a B network plus a separate control network. It’s fundamentally a leaf-and-spine network, and Mark shows how this links in to both the Grass Valley equipment and the audio equipment via Dante and AES67. Mark takes some time to discuss the separate networks they’ve deployed for the audio part of the project, driven by compatibility issues; within the constraints of this project, it was better to separate the networks than to address the changes necessary to force them together.

PTP timing is discussed with a nod to the fact that PTP design can be difficult and expensive too. NMOS issues are also actively being worked on: getting enough vendors to support it remains an outstanding issue, as does having compatible systems once an implementation is deployed. This has driven the BBC to use NMOS in a more limited way than desired and to create fall-back systems.

From this we can deduce, if it wasn’t already understood, that interoperability testing is a vital aspect of the project. Mark explains that formalised testing (i.e. IT-style automated testing) is really important in creating a uniform way of ensuring problems have been fully addressed and there are no regressions. ST 2110 systems are complex, and fault finding can be similarly complex and time-consuming.

Mark leaves us by explaining what keeps him awake at night which includes items such as lack of available test equipment, lack of single-stream UHD support and NMOS which leads him to a few comments on ST 2110 readiness such as the need for vendors to put much more effort into configuration and management tools.

Anyone with an interest in IP in broadcast will be very grateful for Mark’s, and the BBC’s, willingness to share the project’s successes and challenges in such a constructive way.

Watch now!

Speaker

Mark Patrick Mark Patrick
Lead Architect,
BBC Major Projects Infrastructure

Video: Implementing AES67 and ST 2110-30 in Your Plant

AES67 is a flexible standard, but with this flexibility comes complexity and nuance. Implementing it within ST 2110-30 takes some care, and this talk covers lessons learnt in doing exactly that.

AES67 is a standard defined by the Audio Engineering Society to enable high-performance audio-over-IP streaming interoperability between various AoIP systems like Dante, WheatNet-IP and Livewire. It provides comprehensive interoperability recommendations in the areas of synchronization, media clock identification, network transport, encoding and streaming, session description, and connection management.

The SMPTE ST 2110 standards suite makes it possible to separately route and break away the essence streams – audio, video, and ancillary data. ST 2110-30 addresses system requirements and payload formats for uncompressed audio streams and refers to the subset of AES67 standard.

In this video Dominic Giambo from Wheatstone Corporation discusses tips for implementing the AES67 and ST 2110-30 standards in a lab environment consisting of over 160 devices (consoles, surfaces, hardware and software I/O blades) and 3 different automation systems. The aim of the test was to pass audio through every single device, creating a very long chain, to detect any defects.

The following topics are covered:

  • SMPTE ST 2110-30 as a subset of AES67 (support of the PTP profile defined in SMPTE ST 2059-2, an offset value of zero between the media clock and the RTP stream clock, option to force a device to operate in PTP slave-only mode)
  • The importance of using an IEEE 1588 PTP v2 master clock for accuracy
  • Packet structure (UDP and RTP header, payload type)
  • Network configuration considerations (mapping out IP and multicast addresses for different vendors, keeping all devices on the same subnet)
  • Discovery and control (SDP stream description files, configuration of signal flow from sources to destinations)
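To make the SDP point above concrete, here is an illustrative SDP description of the kind used to advertise an AES67 / ST 2110-30 stream. All addresses, ports and identifiers below are hypothetical examples, not values from the talk:

```
v=0
o=- 123456 123456 IN IP4 192.168.1.10
s=Example AES67 stream
c=IN IP4 239.69.1.1/32
t=0 0
m=audio 5004 RTP/AVP 97
a=rtpmap:97 L24/48000/8
a=ptime:1
a=ts-refclk:ptp=IEEE1588-2008:00-1D-C1-FF-FE-12-34-56:0
a=mediaclk:direct=0
a=recvonly
```

The `rtpmap` line carries the encoding (24-bit linear, 48 kHz, 8 channels), `ptime` the packet time in milliseconds, and the `ts-refclk`/`mediaclk` attributes tie the stream to a specific PTP grandmaster, which is what lets a receiver confirm it shares a clock with the sender.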

Watch now!

You can download the slides here.

Speaker

Dominic Giambo
Senior Embedded Engineer
Wheatstone Corporation