Video: JPEG XS Interoperability Activity Group Update


JPEG XS is a low-latency, light-compression codec often called a ‘mezzanine’ codec. Encoding within milliseconds, JPEG XS can compress full-bandwidth signals by 4x or more, allowing scope for several generations of compression without significant degradation. Its low latency and resilience to generational degradation make it ideal for enabling remote production.

John Dale from Media Links joins us to look at what’s being done within the Video Services Forum (VSF) to ensure interoperability. As a new standard, JPEG XS is still being implemented in many companies’ products, so this is the perfect time to be looking at how to standardise interconnects.

Running JPEG XS over MPEG TS is one approach, which is being written up in ‘VSF TR-07’ (Technical Recommendation 7), now nearing completion. It defines capabilities for 2K, 4K and 8K video, with and without HDR. The video formats are split into capability sets, meaning a vendor can comply with the specification by stating which subset(s) it supports. All formats up to 1080p60 fall under capability set ‘A’, with ‘B’ covering UHD resolutions. After this work, they will look at JPEG XS over ST 2110-22 instead of MPEG TS. This effort is yet to start but will build on much of the TR-07 work.

Watch now!
Speaker

John Dale
Company Director and CMO,
Media Links.

Video: IPMX – The Need for a New ProAV Standard

IPMX is an IP specification for interoperability between Pro AV equipment. As the broadcast industry moves towards increasing IP deployments based on SMPTE ST 2110 and AMWA’s NMOS protocols, there’s been a recognition that the Pro AV market needs to do many of the same things broadcast wants to do. Moreover, there is no open standard in Pro AV to achieve this transformation. Whilst there are a number of proprietary alliances which enable widespread use of a single chip or software core, this interoperability comes at a cost and is ultimately underpinned by one company, or a group of companies.

Dave Chiappini from Matrox discusses the work of the AIMS Pro AV working group with Wes Simpson from the VSF. Dave underlines the fact that this is a pull to unify the Pro AV industry to help people avoid investing over and over again in reinventing protocols or reworking their products to interoperate. He feels that ‘open standards help propel markets forward’ adding energy and avoiding vendor lock-in. This is one reason for the inclusion of NMOS, allowing any vendor to make a control system by working to the same open specification, opening up the market to both small and large companies.

Dave is the first to acknowledge that the Pro AV market’s needs are different to broadcast’s, and explains that they have calibrated settings, added some, and ‘carefully relaxed’ parts of the standards. The aim is a specification which allows a single piece of equipment, should the vendor wish to design it this way, to be used in either an IPMX or ST 2110 system. He explains that relaxing some aspects of the ST 2110 ecosystem helps simplify implementation, which in turn reduces cost.

One key relaxation has been in PTP. A lot of time and effort goes into making the PTP infrastructure work properly within SMPTE 2110 infrastructure. Having to do this at an event whilst setting up in a short timespan is not helpful to anyone and, elaborates Dave, a point-to-point video link simply doesn’t need high-precision timing. IPMX, therefore, is lenient in its requirement for PTP. It will use it when it can, but will gracefully reduce accuracy and, when there is no grandmaster, will still continue to function.

Another difference in the Pro AV market is the need for compression. Whilst there are times when zero compression is needed in both AV and broadcast, Pro AV needs the ability to throw some preview video out to an iPad or similar. This isn’t going to work with JPEG XS, the preferred ‘minimal compression’ codec for IPMX, so a system for including H.264 or H.265 is being investigated, which could have knock-on benefits for broadcast.

HDMI is essential for a Pro AV solution and needs its own treatment. Unlike SDI, it has many resolutions and frame rates. It also carries HDCP, so AIMS is now working with DCP on creating a method of carrying HDCP over 2110. It’s hoped that this work will help broadcast use cases too: TVs are already replacing SDI monitors, and such interoperability with HDMI should bring down the cost of monitoring in non-picture-critical environments.

Watch now!
Speakers

David Chiappini
Chair, Pro AV Working Group, AIMS
Executive Vice President, Research & Development,
Matrox Graphics Inc.
Wes Simpson
RIST AG Co-Chair, VSF
President & Founder, LearnIPvideo.com

Video: Line by Line Processing of Video on IT Hardware

If the tyranny of frame buffers is allowed to continue, line-latency I/O is rendered impossible without increasing the frame rate to 60fps or, preferably, beyond. With SDI, hardware was able to process video line by line. Now, with uncompressed video over IP, is the same possible with IT hardware?

Kieran Kunhya from Open Broadcast Systems explains how he has been able to develop line-latency video I/O with SMPTE 2110, how he’s coupled that with low-latency AVC and HEVC encoding and the challenges his company has had to overcome.

The commercial drivers for reducing latency are fairly well known. Firstly, for standard 1080i50, typically treated as 25fps, a single frame buffer adds a 40ms delay. If a workflow needs multiple buffers, this soon stacks up, so whatever the latency of your codec – uncompressed or JPEG XS, for example – the total latency will be far above it. In today’s Covid world, companies are looking to cut latency so people can work remotely. This has only intensified the interest that was already there, for the purposes of remote production (REMIs), in having low-latency feeds. Low latency allows full engagement in conversations, which is vital for news anchors to conduct interviews as well as they would in person.
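The frame-buffer arithmetic above is easy to check; this sketch (with illustrative buffer counts) shows how quickly whole-frame buffering stacks up:

```python
def frame_buffer_latency_ms(frame_rate_hz: float, num_buffers: int) -> float:
    """Each full-frame buffer adds one frame period of delay."""
    return num_buffers * 1000.0 / frame_rate_hz

# 1080i50 is typically processed as 25 full frames per second,
# so a single frame buffer adds 40 ms.
print(frame_buffer_latency_ms(25, 1))   # 40.0

# A workflow with four buffered stages already costs 160 ms,
# regardless of how fast the codec itself is.
print(frame_buffer_latency_ms(25, 4))   # 160.0
```

This is why even a zero-latency codec cannot rescue a workflow built around multiple frame buffers.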

IP itself has come into its own during recent times when there has been no one around to move an SDI cable; being able to log in and scale SMPTE ST 2110 infrastructure up or down remotely is a major benefit. IT equipment has also proven fairly resilient to supply-chain disruption during the pandemic, says Kieran, because the industry is larger and used to scaling up.

Kieran’s approach to receiving ST 2110 deals in chunks of 5 to 10 lines. This gives you time to process the last few lines whilst you are waiting for the next chunk to arrive. This processing can be de-encapsulation, converting pixel values to another format, or modifying the values to key on graphics.
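A hypothetical sketch of this chunked approach (the chunk size and callback names are illustrative, not taken from Kieran’s implementation):

```python
CHUNK_LINES = 8  # Kieran describes chunks of 5 to 10 lines

def process_frame_in_chunks(receive_lines, process_chunk, total_lines=1080):
    """Process a frame a few lines at a time instead of buffering it whole.

    receive_lines(start, count): blocks until those lines have arrived.
    process_chunk(lines): e.g. de-encapsulate, convert pixel format,
    or key graphics onto the pixel values.
    """
    for start in range(0, total_lines, CHUNK_LINES):
        count = min(CHUNK_LINES, total_lines - start)
        lines = receive_lines(start, count)   # the lines just received
        process_chunk(lines)                  # overlaps with the next arrival

# Dummy stand-ins to show the call pattern:
processed = []
process_frame_in_chunks(
    receive_lines=lambda start, count: list(range(start, start + count)),
    process_chunk=processed.extend,
)
print(len(processed))  # 1080
```

The point is structural: no stage ever holds a whole frame, so latency is bounded by a handful of line periods rather than a frame period.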

As the world is focussed on delivering in and out of unusual and residential places, low-bitrate is the name of the game. So Kieran looks at low-latency HEVC/AVC encoding as part of an example workflow which takes in ST 2110 video at the broadcaster and encodes to MPEG to deliver to the home. In the home, the video is likely to be decoded natively on a computer, but Kieran shows an SDI card which can be used to deliver in traditional baseband if necessary.

Kieran talks about the dos and don’ts of encoding and decoding AVC and HEVC with low latency, targeting an end-to-end budget of 100ms. The aim is to avoid waiting for whole frames, so refreshing the screen with I-frame information in small slices is one way of keeping the decoder supplied with fresh information without taking the full-frame hit of 40ms (for 1080i50). Audio is best sent uncompressed to ensure its latency stays lower than that of the video.
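The benefit of slicing can be put in numbers; a quick sketch (the slice count of 8 is an illustrative assumption, not from the talk):

```python
# One frame of 1080i50 takes 40 ms to arrive in full, so a decoder
# waiting for whole frames pays 40 ms before it sees anything new.
FRAME_MS = 40.0

def slice_latency_ms(slices_per_frame: int) -> float:
    """Latency to deliver one slice instead of a whole frame."""
    return FRAME_MS / slices_per_frame

print(slice_latency_ms(8))  # 5.0 ms per slice, versus 40 ms per frame
```

Spreading intra (I-frame) data across slices like this keeps each refresh small, which is how the end-to-end budget of 100ms stays reachable.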

Decoding requires careful handling of the slice boundaries, ensuring deblocking is used so no artefacts are seen. Compressed video is often not PTP-locked, which means delivery into most ST 2110 infrastructures requires frame synchronisation and audio resampling.

Kieran foresees increasing use of 2110 to MPEG Transport Stream back to 2110 workflows during the pandemic and finishes by discussing the tradeoffs in delivering during Covid.

Watch now!
Speaker

Kieran Kunhya
CEO & Founder, Open Broadcast Systems

Video: Keeping Time with PTP

The audio world has been using PTP for years, but now there is renewed interest thanks to its inclusion in SMPTE ST 2110. Replacing the black and burst timing signal (and, for those that used it, tri-level sync), PTP changes the way we distribute time. B&B was a one-way, waterfall distribution; PTP is a bi-directional conversation which, as a system, needs to be monitored and actively maintained.

Michael Waidson from Telestream (which now owns Tektronix’s video business) brings us the foundational basics of PTP as well as tips and tricks to troubleshoot your PTP system. He starts by explaining the types of messages exchanged between the clock and the device, and why all these different messages are necessary. We see that we can set the frequency at which the announce, sync and follow-up messages are sent. The sync and follow-up messages actually carry the time. When a device receives one of these, it responds with a ‘delay request’ in order to work out the delay between it and the grandmaster clock, and receives a ‘delay response’ in return. On top of these basic messages, there is a periodic management message which can carry further information such as daylight-saving time or drop-frame information.
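The four timestamps in that exchange are what let a device compute its offset from the grandmaster. A minimal sketch of the standard IEEE 1588 two-way calculation (the example values are illustrative):

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Standard IEEE 1588 delay request-response calculation.

    t1: grandmaster's timestamp carried in Sync/Follow-Up
    t2: device's receive time of Sync
    t3: device's send time of Delay Request
    t4: grandmaster's receive time, returned in Delay Response

    Assumes a symmetric network path; any asymmetry shows up
    directly as an error in the computed offset.
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2  # device clock minus grandmaster
    delay = ((t2 - t1) + (t4 - t3)) / 2   # one-way path delay
    return offset, delay

# Example (nanoseconds): device clock runs 100 ns ahead of the
# grandmaster over a 50 ns path, so t2 = 0 + 50 + 100 = 150 and
# t4 = 200 - 100 + 50 = 150.
print(ptp_offset_and_delay(0, 150, 200, 150))  # (100.0, 50.0)
```

Once the device knows its offset, it disciplines its local clock towards zero offset; this is the two-way conversation that black and burst never needed.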

Michael moves on to troubleshooting, highlighting the four main values to check: the domain value, grandmaster ID, message rates and the communication mode. PTP is a global standard used in many industries. To make PTP most useful to the broadcast industry, SMPTE ST 2059 defines values to use for message repetition (4 per second for announce messages; 8 per second for sync, delay request and delay response). ST 2059 also defines how devices can determine the phase of any broadcast signal for any given time, which is the fundamental link needed to ensure all devices keep synchronicity.

Another good tip from Michael: if you see the grandmaster MAC alternating between the grandmasters on the system, this indicates a device is not receiving announce messages, so it is re-running the Best Master Clock Algorithm (BMCA) and trying the next grandmaster. Some PTP monitoring equipment, including units from Meinberg and Telestream, can show the phase lag of the PTP timing as well as the offset between the primary and secondary grandmasters – the lower the better.

A talk on PTP can’t avoid mentioning boundary clocks and transparent switches. Boundary clocks take on much of the two-way traffic in PTP, protecting the grandmasters from having to speak directly to the potentially thousands of devices. Transparent switches simply update the timing messages with the delay incurred moving through the switch. Whilst this is useful in keeping the timing accurate, it provides no protection for the grandmasters. The video ends with a look at how to check PTP messages on the switch.
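The transparent switch behaviour can be shown with a simplified model (real switches do this in hardware on the wire; the field and message names here are illustrative):

```python
def forward_through_transparent_switch(sync_msg, ingress_ns, egress_ns):
    """A transparent switch adds its residence time to the message's
    correction field, so downstream devices can subtract the time the
    message spent queued inside the switch. Simplified model only."""
    msg = dict(sync_msg)  # forward a copy, leaving the original intact
    msg["correction_ns"] = msg.get("correction_ns", 0) + (egress_ns - ingress_ns)
    return msg

sync = {"origin_timestamp_ns": 1_000_000, "correction_ns": 0}
out = forward_through_transparent_switch(sync, ingress_ns=500, egress_ns=2_500)
print(out["correction_ns"])  # 2000 ns of residence time accounted for
```

Note the switch never answers delay requests itself; unlike a boundary clock, it leaves the device talking directly to the grandmaster, which is why it offers the grandmaster no protection.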

Watch now!
Speakers

Michael Waidson
Application Engineer
Telestream (formerly Tektronix)