Video: Native Processing of Transport Streams to/from Uncompressed IP

As much as the move to IP hasn’t been trivial for end-users, it’s been all the harder for vendors who have had to learn all the same lessons as end-users, but also press the technology into action. Whilst broadcast is building on the expertise, success and scale of the IT industry, we are also pushing said technology to its limits and, in some cases, in ways not yet seen by the IT industry at large.

Kieran Kunhya from encoder and decoder vendor Open Broadcast Systems explains to us the problems faced in making this work for software-based systems. As we heard earlier this week on The Broadcast Knowledge, the benefits of moving functions away from bespoke hardware include the ability to move your workflows more easily into data centres or even the cloud. Indeed, flexibility is one important factor for OBS, which is why they are a software-first company. Broadcast workflows have traditionally been static and, even today, tend to do only one thing, so a move to software removes the dependence on specific, custom chips.

The move to IP has many benefits, as Kieran outlines next. In today’s pandemic, a big benefit is simply not needing a person to go and move an SDI cable. But freeing ourselves from SDI, we hear, is more than just that. Kieran acknowledges that SDI achieves ultra-low delay in the realm of microseconds to move gigabits of video, but this comes at a high price. Each cable only carries one signal and only in one direction, but, more critically, routers top out at 1152×1152 in size. Whilst this does seem like a large number, larger operators are finding it is simply not enough as they continue both to expand their offerings and to merge (compare Comcast’s NBC and Sky businesses).

By looking towards higher-bandwidth, more scalable technologies for video, the industry has solved many of these problems. The bandwidth routing capability of IT switches can be in the terabits, with each port being 100 or 400Gbps. Each cable works bidirectionally and, typically, carries multiple signals. This not only leaves the infrastructure future-proofed for moves, say, to 8K video but also enables much denser routing of signals, well above 1152×1152. The result of Kieran’s work is 64-channel encoding/decoding in 2U, which can replace up to a full rack of traditional equipment.
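As a rough illustration of that density difference, here is a toy Python sketch; the per-flow bitrates are assumed round numbers for illustration, not figures from the talk:

```python
def flows_per_port(port_gbps: float, flow_gbps: float) -> int:
    """How many whole uncompressed flows fit on one switch port."""
    return int(port_gbps // flow_gbps)

# A 3G-SDI cable carries exactly one HD signal, one way.
# A single 100Gbps switch port, by contrast, can carry many flows:
print(flows_per_port(100, 3.0))   # ~3Gbps 1080p flows per port -> 33
print(flows_per_port(100, 12.0))  # ~12Gbps UHD flows per port -> 8
```

One bidirectional 100Gbps port thus replaces dozens of unidirectional SDI cables, which is where the routing density above 1152×1152 comes from.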

This success hasn’t come without a lot of work. The timings are very tight, and getting standard servers to deliver 100% of packets onto a network within 20 microseconds takes hard-won knowledge. Kieran explains that two of the keys to success are, firstly, kernel bypass techniques, where he’s able to write directly into the memory space the NIC uses rather than taking the data via the Linux kernel, and, secondly, using SIMD CPU instructions directly. This can speed up code by up to twenty times compared to plain C and only needs to be done once per CPU generation.
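Real kernel bypass relies on frameworks that map NIC ring buffers straight into the application, which can’t be shown here, but the underlying zero-copy idea can be hinted at with Python’s `memoryview` (a loose analogy, not OBS’s implementation):

```python
buf = bytearray(9000)    # pre-allocated packet buffer (think: a NIC ring slot)
view = memoryview(buf)

# Slicing the view yields windows onto the same memory, not copies:
rtp_header = view[:12]   # first 12 bytes: RTP header region
payload = view[12:]      # remainder: media payload

# Writing through a view modifies the original buffer in place,
# so header parsing and payload handling touch no new memory.
payload[0] = 0xFF
assert buf[12] == 0xFF
```

The point is the same in either world: once a packet lands in a buffer, everything downstream works on views of that buffer rather than copying it, which is what makes the 20-microsecond timing budget achievable.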

Once these techniques are harnessed, OBS still has to deal with a variety of unusual pixel formats, the difficulty of reference counting with many small buffers, and uncompressed audio with its low bitrate and short, 125-microsecond packets. Coupled with other equipment which doesn’t verify checksums, doesn’t use timestamps and doesn’t necessarily handle 16-channel flows, making this work is tough, but Kieran’s very clear that the benefits of uncompressed IP video are worth it.
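As one example of the pixel-format wrangling, uncompressed 10-bit video is commonly packed four samples to every five bytes; a toy Python sketch of unpacking such a group (layout simplified for illustration):

```python
def unpack_10bit(data: bytes) -> list:
    """Unpack big-endian 10-bit samples packed four-per-five-bytes,
    as commonly used for uncompressed video payloads (illustrative)."""
    samples = []
    for i in range(0, len(data) - len(data) % 5, 5):
        group = int.from_bytes(data[i:i + 5], "big")  # 40 bits
        for shift in (30, 20, 10, 0):
            samples.append((group >> shift) & 0x3FF)  # one 10-bit sample
    return samples

# Four samples 0..3 packed into 5 bytes round-trip cleanly:
packed = (0 << 30 | 1 << 20 | 2 << 10 | 3).to_bytes(5, "big")
print(unpack_10bit(packed))  # [0, 1, 2, 3]
```

Production code would do this with SIMD shuffles rather than a Python loop, but the byte-level awkwardness it has to handle is the same.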

Watch now!
Speakers

Kieran Kunhya
Founder & CEO
Open Broadcast Systems

Video: IP-based Networks for UHD and HDR Video

If you were given a video signal, would you know what type it was? Life used to be simple: an SD signal would decode in a waveform monitor and you’d see which type it was. Now, with UHD and HDR, that alone isn’t all the information you need. Arguably this gets easier with IP, and it’s possibly one of the few things that does. This video from AIMS helps to clear up why IP’s the best choice for UHD and HDR.

John Mailhot from Imagine Communications joins Wes Simpson from LearnIPVideo.com to introduce us to the difficulties wrangling with UHD and HDR video. Reflecting on the continued improvement of in-home displays’ ability to show brighter and better pictures as well as the broadcast cameras’ ability to capture much more dynamic range, John’s work at Imagine is focussed on helping broadcasters ensure their infrastructure can enable these high dynamic range experiences. Streaming services have a slightly easier time delivering HDR to end-users as they are in complete control of the distribution chain whereas often in broadcast, particularly with affiliates, there are many points in the chain which need to be HDR/UHD capable.

John starts by looking at how UHD was implemented in the early stages. UHD, being twice the horizontal and twice the vertical resolution of HD, is usually seen as 4×HD but, importantly, John points out that whilst this is true for resolution, as most HD is 1080i, UHD also represents a move to 1080p, 3Gbps signals. John’s point is that this was a strain on infrastructure which had not necessarily been tested for it initially. And given that the UHD signal was initially carried on four cables, there was also four times the chance of a signal impairment due to cabling.

Square Division Multiplexing (SQD) is the ‘most obvious’ way to carry UHD signals with existing HD infrastructure. The picture is simply cut into four quarters and each quarter is sent down one cable. The benefit here is that it’s easy to see which order the cables need to be connected to the equipment. The downsides included a frame-buffer delay (half a frame) each time the signal was received, and difficulties preventing drift between quadrants if they were treated differently by the infrastructure (i.e. there was a non-synced hand-off). One important problem is that there is no way to know whether an HD feed comes from a UHD set or is just a lone 3G signal.
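A toy Python sketch of the SQD idea, treating a frame as a list of rows (the quadrant-to-link mapping here is illustrative):

```python
def square_division(frame):
    """Split a frame (list of rows) into the four SQD quadrants."""
    h, w = len(frame), len(frame[0])
    top, bottom = frame[:h // 2], frame[h // 2:]
    return (
        [row[:w // 2] for row in top],     # link 1: top-left
        [row[w // 2:] for row in top],     # link 2: top-right
        [row[:w // 2] for row in bottom],  # link 3: bottom-left
        [row[w // 2:] for row in bottom],  # link 4: bottom-right
    )

frame = [[1, 2], [3, 4]]  # a tiny 2x2 stand-in for a UHD frame
print(square_division(frame))  # ([[1]], [[2]], [[3]], [[4]])
```

Note how each link carries only one corner of the picture: that is exactly why a lone quadrant on a monitor is indistinguishable from an ordinary 3G signal.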

2SI, two-sample interleave, was another method of splitting up the signal, standardised by SMPTE. This worked by taking a pair of samples and sending them down cable 1, then the next pair down cable 2; the pair of samples under the first pair went down cable 3, then the pair under 2 went down 4. This had the happy benefit that each cable held a complete picture, albeit very crudely downsampled, which is useful for monitoring applications: you can DA one feed and send it to a monitor. Well, that would have been possible except that each signal had to maintain 400ns timing with the others, which meant DAs often broke the timing budget if they reclocked. It did, however, remove the half-field latency burden which SQD carries. The main confounding factor in this mechanism is that looking at the video from any one cable on a monitor isn’t enough to understand which of the four feeds you are looking at. Mis-cabling equipment leads to subtle visual errors which are hard to spot and hard to correct.
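The interleave described above can be sketched as follows, with a tiny frame standing in for real video (the link numbering is illustrative):

```python
def two_sample_interleave(frame):
    """Distribute a frame across four links, two samples at a time:
    pairs on even lines alternate links 1 and 2, pairs on the lines
    beneath alternate links 3 and 4 (sketch of the 2SI idea)."""
    links = [[], [], [], []]
    for y, row in enumerate(frame):
        dest = [[], [], [], []]
        for x in range(0, len(row), 4):
            a, b = (0, 1) if y % 2 == 0 else (2, 3)
            dest[a].extend(row[x:x + 2])
            dest[b].extend(row[x + 2:x + 4])
        for i, d in enumerate(dest):
            if d:
                links[i].append(d)
    return links

frame = [[0, 1, 2, 3], [4, 5, 6, 7]]  # one tiny two-line "frame"
print(two_sample_interleave(frame))
# [[[0, 1]], [[2, 3]], [[4, 5]], [[6, 7]]]
```

Each link ends up with a quarter of the samples spread evenly across the whole raster, i.e. a crude half-resolution version of the complete picture, which is what makes single-feed monitoring possible in principle.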

Enter the VPID, the Video Payload ID. SD SDI didn’t require this, HD often had it, but for UHD it became essential. SMPTE ST 425-5:2019 is the latest document explaining payload ID for UHD. As it’s version five of the document, you should be aware that older equipment may not parse the information correctly, a) due to bugs and b) due to using an old version of the standard. The VPID carries information such as interlaced/progressive, aspect ratio, transfer characteristics (HLG, SDR etc.), frame rate etc. John talks through some of the common mismatches in interpretation and implementation of VPID.

12G is the obvious baseband solution to the four-wires problem of UHD. Nowadays the cost of a 12G transceiver is only slightly more than 3G ones, therefore 12G is a very reasonable solution for many. It does require careful cabling to ensure the cable is in good condition and not too long. For OB trucks and small projects, 12G can work well. For larger installations, optical connections are needed, one signal per fibre.

The move to IP initially went to SMPTE ST 2022-6, which is a mapping of SDI onto IP. This meant it was still quite restrictive, as we were still living within the SDI-described world: 12G was difficult to do, and getting four IP streams correctly aligned, and all switched on time, was also impractical. For UHD, therefore, SMPTE ST 2110 is the natural home. 2110 can support up to 32K, so UHD fits in well. ST 2110-22 allows use of JPEG XS, so if the 9-11Gbps bitrate of UHDp50/60 is too much, it can be squeezed down to 1.5Gbps with almost no latency. Being carried as a single video flow removes any switch-timing problems and, as 2110 doesn’t use VPID, there is much more flexibility to fully describe the signal, allowing future growth. We don’t know what’s to come, but whether it’s different shapes of video raster, new colour spaces or extensions needed for IPMX, these are all possible.
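A quick sanity check on the compression figures quoted above (the midpoint bitrate here is an assumption for illustration):

```python
def compression_ratio(uncompressed_gbps: float, compressed_gbps: float) -> float:
    """Ratio between the raw and the compressed bitrate."""
    return uncompressed_gbps / compressed_gbps

# Taking ~10.5Gbps as a midpoint of the 9-11Gbps UHDp50/60 range,
# squeezed to 1.5Gbps with JPEG XS:
print(compression_ratio(10.5, 1.5))  # 7.0, i.e. roughly 7:1
```

A roughly 7:1 ratio is modest by MPEG standards, which is precisely why JPEG XS can achieve it with almost no latency.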

John finishes his conversation with Wes mentioning two big benefits of moving to IT-based infrastructure. One is the ability to use the free Wireshark or EBU List tools to analyse video. Whilst there are still good reasons to buy test equipment, the fact that many checks can be done without expensive equipment like waveform monitors is good news. The second big benefit is that whilst these standards were being made, the available network infrastructure has moved from 25 to 100 to 400Gbps links with 800Gbps coming in the next year or two. None of these changes has required any change in the standards, unlike with SDI where improvements in signal required improvements in baseband. Rather, the industry is able to take advantage of this new infrastructure with no effort on our part to develop it or modify the standards.

Watch now!
Speakers

John Mailhot
Systems Architect, IP Convergence,
Imagine Communications
Wes Simpson
RIST AG Co-Chair, VSF
President & Founder, LearnIPvideo.com

Video: Line by Line Processing of Video on IT Hardware

If the tyranny of frame buffers is allowed to continue, line-latency I/O is rendered impossible without increasing frame rates to 60fps or, preferably, beyond. In SDI, hardware was able to process video line by line. Now, with uncompressed video over IP, is the same possible with IT hardware?

Kieran Kunhya from Open Broadcast Systems explains how he has been able to develop line-latency video I/O with SMPTE 2110, how he’s coupled that with low-latency AVC and HEVC encoding and the challenges his company has had to overcome.

The commercial drivers for reducing latency are fairly well known. Firstly, for standard 1080i50, typically treated as 25fps, a single frame buffer treats you to a 40ms delay. If a workflow needs multiple buffers, this soon stacks up, so whatever the latency of your codec – uncompressed or JPEG XS, for example – the overall latency will be far above it. In today’s Covid world, companies are looking to cut latency so people can work remotely. This has only intensified the interest that was already there, for the purposes of remote production (REMIs), in having low-latency feeds. Low latency allows full engagement in conversations, which is vital for news anchors to conduct interviews as well as they would in person.
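The frame-buffer arithmetic is simple but worth making concrete; a sketch assuming only whole-frame buffers contribute to the delay:

```python
def stacked_latency_ms(frame_rate_fps: float, frame_buffers: int) -> float:
    """Latency contributed by whole-frame buffers alone."""
    return frame_buffers * 1000.0 / frame_rate_fps

print(stacked_latency_ms(25, 1))  # 40.0 ms: one buffer at 25fps
print(stacked_latency_ms(25, 4))  # 160.0 ms: a modest four-stage workflow
```

Four buffering stages already cost 160ms before any codec delay is counted, which is why line-level rather than frame-level processing matters.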

IP, itself, has come into its own during recent times when there has been no-one around to move an SDI cable: being able to log in and scale SMPTE ST 2110 infrastructure up, or down, remotely is a major benefit. IT equipment has also been shown to be fairly resilient to supply chain disruption during the pandemic, says Kieran, due to the industry being larger and used to scaling up.

Kieran’s approach to receiving ST 2110 deals in chunks of 5 to 10 lines. This gives you time to process the last few lines whilst you are waiting for the next chunk to arrive. This processing can be de-encapsulation, translating pixel values to another format, or modifying the values to key on graphics.
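A toy Python sketch of that chunking idea, with an iterator standing in for packet arrival (real code would be driven by the network, not a loop):

```python
def line_chunks(lines, chunk_size=8):
    """Yield incoming scan lines in small chunks so each chunk can be
    processed while the next one is still arriving on the wire."""
    chunk = []
    for line in lines:
        chunk.append(line)
        if len(chunk) == chunk_size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk  # final partial chunk at the end of the frame

# 1080 lines in chunks of 8: processing starts after the first 8 lines
# rather than after the whole frame.
print(len(list(line_chunks(range(1080), 8))))  # 135 chunks
```

The chunk size trades latency against per-chunk overhead: smaller chunks start sooner but leave less slack for the processing of each one.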

As the world is focussed on delivering in and out of unusual and residential places, low-bitrate is the name of the game. So Kieran looks at low-latency HEVC/AVC encoding as part of an example workflow which takes in ST 2110 video at the broadcaster and encodes to MPEG to deliver to the home. In the home, the video is likely to be decoded natively on a computer, but Kieran shows an SDI card which can be used to deliver in traditional baseband if necessary.

Kieran talks about the dos and don’ts of encoding and decoding with AVC and HEVC with low latency, targeting an end-to-end budget of 100ms. The name of the game is to avoid waiting for whole frames, so refreshing the screen with I-frame information in small slices is one way of keeping the decoder supplied with fresh information without having to take the full-frame hit of 40ms (for 1080i50). Audio is best sent uncompressed to ensure its latency is lower than that of the video.
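The slice-based refresh can be sketched as a rolling band of intra-coded rows; the schedule below is a hypothetical illustration, not Kieran’s exact scheme:

```python
def intra_refresh_rows(frame_index, frame_height, refresh_period):
    """Which rows carry intra (I) data in this frame, for a rolling
    refresh spread over `refresh_period` frames (illustrative)."""
    band = frame_height // refresh_period
    start = (frame_index % refresh_period) * band
    return range(start, start + band)

# Spreading the refresh of 1080 rows over 10 frames costs roughly a
# tenth of an I-frame's bits per frame instead of one large hit:
print(list(intra_refresh_rows(0, 1080, 10))[:3])  # rows 0, 1, 2 first
print(list(intra_refresh_rows(9, 1080, 10))[-1])  # row 1079 by frame 9
```

Every row is intra-refreshed once per period, so a decoder joining mid-stream converges to a clean picture within `refresh_period` frames without ever waiting for a full I-frame.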

Decoding requires careful handling of slice boundaries, ensuring deblocking is used so that no artefacts are visible. Compressed video is often not PTP-locked, which means delivery into most ST 2110 infrastructures requires frame synchronisation and audio resampling.

Kieran foresees increasing use of 2110 to MPEG Transport Stream back to 2110 workflows during the pandemic and finishes by discussing the tradeoffs in delivering during Covid.

Watch now!
Speaker

Kieran Kunhya
CEO & Founder, Open Broadcast Systems

Video: Is IP Really Better than SDI?

Is SDI so bad? With the industry as a whole avidly discussing and developing IP technology, all the talk of the benefits of IP can seem like a dismissal of SDI. SDI served the broadcast industry very well for decades, so what’s suddenly so wrong with it? Of course, SDI still has a place and even some benefits over IP. Whilst IP is definitely a great technology to take the industry forward, there’s nothing wrong with using SDI in the right place.

Ed Calverley from Q3Media takes an honest look at the pros and cons of SDI. Not afraid to explain where SDI fits better than IP, this is a very valuable video for anyone who has to choose technology for a small or medium project. Whilst many large projects, nowadays, are best done in IP, Ed looks at why that is and, perhaps more importantly, what’s tricky about making it work, highlighting the differences doing the same project in SDI.

This video is the next in IET Media’s series of educational videos and follows on nicely from Gerard Phillips’ talk on Network Design for uncompressed media. Here, Ed recaps on the reasons SDI has been so successful and universally accepted in the broadcast industry as well as looking at SDI routing. This is essential to understand the differences when we move to IP in terms of benefits and compromises.

SDI is a unidirectional technology, something which makes it pleasantly simple but, at scale, makes life difficult in terms of cabling. Not only is it unidirectional, but it can only really carry one video at a time. Before IP, this didn’t seem to be much of a restriction, but as densities have increased, cabling has often been a limiting factor on the size of equipment – not unlike the reason 3.5mm audio jacks have started to disappear from some phones. Moreover, anyone who’s had to plan an expansion of an SDI router by adding a second one has come across the vast complexity of doing so. Physically it can be very challenging, and it will involve tie-lines, which come with a whole management overhead in and of themselves as well as taking up much valuable I/O which could have been used for new inputs and outputs but is required for tying the routers together. Ed uses a number of animations to show how IP significantly improves media routing.

In the second part of the video, we start to look at the pros and cons of key topics including latency, routing behaviour, virtual routing, bandwidth management, UHD and PTP. With all this said, Ed concludes that IP is definitely the future for the industry but, on a project-by-project basis, we shouldn’t dismiss the advantages SDI still has, as it could well be the right option.

Watch now!
Speakers

Ed Calverley
Trainer & Consultant
Q3Media.co.uk
Moderator: Russell Trafford-Jones
Exec Member, IET Media Technical Network
Editor, The Broadcast Knowledge
Manager, Services & Support, Techex