Video: The State of NDI in 2021 & NDI Version 5

Released in 2015, NDI is a royalty-free technology that implements low-latency video, audio, tally and control over gigabit networking. Since then, NDI has grown immensely in popularity and is now very widely used in AV, live streaming production and cloud workflows. Developed by NewTek, now part of VizRT, it allows computers to easily push video from programs onto the local network, whether from Teams or Skype, a video editor, OBS or anything else. Many vendors have taken the NDI binaries and integrated them into their products.
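
One practical reason NDI is so easy to adopt is that sources announce themselves on the local network (by default via mDNS), so receivers can find them without manual configuration. As a rough illustration only, here is a minimal sketch that watches for those announcements; it assumes the third-party python-zeroconf package and the commonly reported ‘_ndi._tcp.local.’ service type, rather than using the NDI SDK that products would normally integrate.

```python
# Minimal sketch: watch the local network for NDI source announcements.
# Assumes the third-party python-zeroconf package and the "_ndi._tcp.local."
# mDNS service type that NDI sources are commonly advertised under.
from zeroconf import Zeroconf, ServiceBrowser

NDI_SERVICE = "_ndi._tcp.local."

class NDISourceListener:
    """Print NDI sources as they appear on and disappear from the LAN."""

    def add_service(self, zc, type_, name):
        info = zc.get_service_info(type_, name)
        if info:
            print(f"Found NDI source: {name} at {info.parsed_addresses()}:{info.port}")

    def remove_service(self, zc, type_, name):
        print(f"NDI source went away: {name}")

    def update_service(self, zc, type_, name):
        pass  # required by recent python-zeroconf versions

if __name__ == "__main__":
    zc = Zeroconf()
    browser = ServiceBrowser(zc, NDI_SERVICE, NDISourceListener())
    try:
        input("Browsing for NDI sources; press Enter to stop...\n")
    finally:
        zc.close()
```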

On The Broadcast Knowledge we’ve explored how NDI complements SMPTE’s ST 2110 suite of standards, which primarily help move uncompressed payloads around a broadcast facility. In this panel put on by the IET at IBC we explored the benefits of ST 2110 and NDI to understand where the overlap lies. And for a deeper dive into NDI, including some demos, we also featured this talk by SMPTE and VizRT.

In today’s video from Key Code Media, we hear from NewTek’s Dr. Andrew Cross, creator of NDI, on what’s new in NDI’s latest release, 5.0. Jeff Sengpiehl from Key Code Media explains that NDI 5.0 brings improvements in multi-platform support, including native support for Apple Silicon. 5.0 also includes plugins that let any program share its audio over NDI, as well as a way of sending someone a link so they can share their video and audio by NDI. The biggest changes in this release, though, are ‘reliable UDP’ and ‘NDI Bridge’. Based on Google’s QUIC, reliable UDP provides a way of sending NDI over lossy networks, recovering lost data and dealing with network congestion. This ties in nicely with ‘NDI Bridge’, which allows two or three separate locations to share their NDI sources and destinations seamlessly. These additional features take NDI outside the LAN; with discovery traditionally confined to the local network, it has always been seen as a local protocol even when deployed in the cloud.
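
To make the ‘reliable UDP’ idea concrete, below is a toy sketch of the general principle: number the datagrams and retransmit anything that isn’t acknowledged. It is purely illustrative and is not NDI’s actual protocol; the framing, ports and timings are assumptions, and a matching receiver that echoes each sequence number back as an ACK is assumed but not shown.

```python
# Toy sketch only: sequence-numbered datagrams with ACK-driven retransmission,
# to illustrate the general idea behind "reliable UDP". This is NOT NDI's
# actual wire protocol; a receiver that echoes each 4-byte sequence number
# back as an ACK is assumed but not shown.
import socket
import time

MTU_PAYLOAD = 1200        # assumed payload bytes per datagram
RETRANSMIT_AFTER = 0.05   # resend anything unacknowledged after 50 ms

def send_reliable(data: bytes, dest=("127.0.0.1", 9000), timeout=2.0) -> bool:
    """Chunk `data` into numbered datagrams and resend until each is ACKed."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(RETRANSMIT_AFTER)

    chunks = {seq: data[i:i + MTU_PAYLOAD]
              for seq, i in enumerate(range(0, len(data), MTU_PAYLOAD))}
    unacked = dict(chunks)
    deadline = time.time() + timeout

    while unacked and time.time() < deadline:
        for seq, payload in list(unacked.items()):
            sock.sendto(seq.to_bytes(4, "big") + payload, dest)
        try:
            while True:                           # drain any ACKs that arrive
                ack, _ = sock.recvfrom(4)
                unacked.pop(int.from_bytes(ack, "big"), None)
        except socket.timeout:
            pass                                  # timed out: loop and resend

    sock.close()
    return not unacked                            # True if fully acknowledged
```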

The majority of the video features Dr. Cross answering questions from Jeff and viewers. These questions include:
What are the pain points with audio? Is NDI Audio Direct a replacement for Dante? How do you maintain synchronisation in multi-location systems? Also covered are the significance of support for ARM chips, bidirectional use of NDI, NDI Bridge security, 10-bit colour support, NDI’s place in the ProAV market and the possibility of NDI becoming open source or a standard.

Watch now!
Speakers

Andrew Cross
Creator of NDI & President of Global Research and Development
VizRT (NewTek/NDI)
Moderator: Jeff Sengpiehl
Chief Technologist
Key Code Media

Where can SMPTE 2110 and NDI co-exist?

Our final look back at the most viewed articles of 2020 covers a very prescient topic: live IP production. As we all know, this has very much come into focus during the pandemic. Those that already had an IP infrastructure found it easier to manage remotely than those who had to go in and move SDI cables whenever they needed to provision new video links. Moreover, putting together live remote workflows is far easier with IP video, and traditionally the decision on whether to use SMPTE 2110 has come down to whether or not you need to be in the cloud.

This article and video brought together Will Waters, an NDI expert from VizRT, Marc Risby from UK systems integrator Boxer, and Willem Vermost, who was then with the EBU. The conversation, recorded pre-pandemic, focused on how to choose the video-over-IP technology best suited to you and looked for ways in which broadcasters could benefit from both at the same time.

The Broadcast Knowledge Editor, Russell Trafford-Jones also moderated a SMPTE Webcast with VizRT going into the detail of NDI and how it can be deployed in the cloud.

Another important advance in 2020 was AWS’s release of CDI, an implementation of SMPTE 2110 with enough proprietary adaptations to make it work within AWS. You can hear more about it in this video with David Griggs.

Click here to watch ‘Where can SMPTE 2110 and NDI co-exist?’

Video: Remote Production in the Cloud for DR and the New Normal

How does NDI fit into the recent refocusing of interest in working remotely, operating broadcast workflows remotely and moving workflows into the cloud? Whilst SRT and RIST have ignited imaginations over how to reliably ingest content into the cloud, an MPEG AVC/HEVC workflow doesn’t make sense due to the latencies involved. NDI uses light compression, with latencies low enough to make cloud workflows feel almost immediate.
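
To see why the latency argument favours light compression, here is some back-of-the-envelope arithmetic. Every figure is an illustrative assumption rather than a measurement of any particular encoder or of NDI itself; the point is simply that buffering measured in tens of frames dwarfs buffering measured in a frame or two.

```python
# Back-of-the-envelope latency comparison. Every figure here is an
# illustrative assumption, not a measurement of any real encoder or of NDI.
FRAME_RATE = 50.0                      # assumed progressive frame rate
frame_ms = 1000.0 / FRAME_RATE         # 20 ms per frame at 50p

# Long-GOP AVC/HEVC contribution chain: GOP/look-ahead buffering at the
# encoder plus jitter/decode buffering at the receiver.
long_gop_ms = 25 * frame_ms + 10 * frame_ms    # ~35 frames end to end

# Light, intra-frame-style compression: roughly a frame in, a frame out.
light_ms = 2 * frame_ms

print(f"Long-GOP AVC/HEVC chain : ~{long_gop_ms:.0f} ms")   # ~700 ms
print(f"Light compression chain : ~{light_ms:.0f} ms")      # ~40 ms
```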

Vizrt’s Ted Spruill and Jorge Dighero join moderator Russell Trafford-Jones to explore the challenges the pandemic has thrown up and the practical ways in which NDI can meet many of the needs of cloud workflows. We saw in the talk Where can SMPTE ST 2110 and NDI co-exist? that NDI is a tool to get things done, just like ST 2110, and that both have their place in a broadcast facility. This video takes that as read and looks at the practical abilities of NDI both in and out of the cloud.

Taking the form of a demo followed by an extensive Q&A, this talk covers latency, running NDI in the cloud, networking considerations such as layer 2 and layer 3 networks, ease of discovery and routing, contribution into the cloud, use of SRT and RIST, comparison with JPEG XS, speed of deployment and much more!

Click to watch this no-registration, free webcast at SMPTE
Speakers

Jorge Dighero
Senior Solutions Architect,
Vizrt
Ted Spruill
Sales Manager-US Group Stations,
Vizrt
Moderator: Russell Trafford-Jones
Editor, TheBroadcastKnowledge.com
Director of Education, Emerging Technologies, SMPTE
Manager, Support & Services, Techex

Video: Live production: Delivering a richer viewing experience

How can large sports events keep an increasingly sophisticated audience entertained and fully engaged? The technology of sports coverage has pushed broadcasting forward for many years, and that hasn’t changed. More than ever, there is a convergence of technologies both at the event and in delivery to viewers, which is explored in this video.

First up is Michael Cole, a veteran of live sports coverage, now working for the PGA European Tour and Ryder Cup Europe. As the event organisers – hosting 42 golfing events throughout the year – they are responsible not just for the coverage of the golf, but also for a whole host of supporting services. Michael explains that they have to deliver live stats and scores to on-air, on-line and on-course screens, produce a whole TV service for event-goers, deliver an event app and, of course, run a TV compound.

One important aspect of golfing coverage is the sheer distances that video needs to cover. Formerly that was done primarily with microwave links and, whilst RF still plays an important part in coverage with wireless cameras, the long distances are now covered by fibre. However, as fibre takes time to deploy at each event and is hard to conceal on otherwise impeccably presented courses, there is a lot of interest in 5G and in validating its ability to cut rigging time and costs, as well as keeping the course tidier for spectators.

Michael also talks about the role of remote production. Many would see this as an obvious way to go, but remote production has taken many years to be adopted. Each broadcaster has different needs, so getting the right level of technology in place to meet everyone’s requirements is still a work in progress. For golfing events with tens of trucks and cameras, Michael confirms that remote production and the cloud are a clear way forward, at the right time.

Next to talk is Remo Ziegler from VizRT, who explains how VizRT serves the live sports community. Looking more at the delivery side, they allow branding to be delivered to multiple platforms with different aspect ratios whilst maintaining a consistent look. Whilst branding, when done well, isn’t noticed by viewers, a more obvious example is real-time, photo-realistic rendering of in-studio 3D graphics. Remo talks next about ‘Augmented Reality’, AR, which places 3D objects into the video so that they move and look like part of the picture, annotating the footage to help explain what’s happening and tell a story. This can be done in real time with camera tracking technology, which takes into account telemetry from the camera, such as angle of tilt and zoom level, to render the objects realistically.
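
For a feel of what camera tracking involves, the sketch below projects a 3D point in the venue into the image using the camera’s pan and tilt angles and a zoom-dependent focal length. It is a deliberately simplified pinhole model with made-up numbers; a real tracking system, such as those Remo describes, also handles lens distortion, camera roll, nodal offsets and frame-accurate timing.

```python
# Simplified pinhole sketch of camera-tracked AR: given pan/tilt angles and a
# zoom-dependent focal length (in pixels) from camera telemetry, project a 3D
# point in the venue onto the image so a virtual object can be drawn "in" the
# scene. Lens distortion, roll, nodal offsets and timing are ignored here.
import numpy as np

def rotation(pan_deg: float, tilt_deg: float) -> np.ndarray:
    """World-to-camera rotation: pan (around Y), then tilt (around X)."""
    p, t = np.radians(pan_deg), np.radians(tilt_deg)
    pan = np.array([[np.cos(p), 0, np.sin(p)],
                    [0,         1, 0        ],
                    [-np.sin(p), 0, np.cos(p)]])
    tilt = np.array([[1, 0,          0         ],
                     [0, np.cos(t), -np.sin(t)],
                     [0, np.sin(t),  np.cos(t)]])
    return tilt @ pan

def project(point_world, cam_pos, pan_deg, tilt_deg, focal_px, image_size):
    """Pixel position of a world-space point, or None if it is behind the camera."""
    R = rotation(pan_deg, tilt_deg)
    p_cam = R @ (np.asarray(point_world, float) - np.asarray(cam_pos, float))
    if p_cam[2] <= 0:                                  # behind the camera
        return None
    u = image_size[0] / 2 + focal_px * p_cam[0] / p_cam[2]
    v = image_size[1] / 2 - focal_px * p_cam[1] / p_cam[2]   # image y grows downwards
    return u, v

# Example: a marker 40 m in front of and 10 m to the right of the camera.
print(project(point_world=(10, 0, 40), cam_pos=(0, 2, 0),
              pan_deg=5, tilt_deg=-2, focal_px=2000, image_size=(1920, 1080)))
```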

The talk finishes with Chris explaining how viewing habits are changing. Whilst we all have a sense that the younger generation watches less live TV, Chris has the stats: for people aged 66+, ‘traditional content’ makes up 82% of viewing, falling to just 28% for 16-18 year olds, with the majority of the remainder made up of SVOD and ‘YouTube etc.’.

Chris talks about newer cameras which have improved coverage, both by raising the technical quality of ‘lower tier’ productions and, for top-tier content, by adding cameras in locations that would otherwise not have been possible. He then shows there is an increase in HDR-capable cameras being purchased which, even when not being used to broadcast HDR, are valued for their ability to capture the best image possible. Finally, Chris returns to remote production, explaining broadcasters’ motivations such as reduced cost, improved work-life balance and more environmentally friendly coverage.

The video finishes with questions from the webinar audience.

Watch now!
Speakers

Michael Cole
Chief Technology Officer,
PGA European Tour & Ryder Cup Europe
Remo Ziegler
Vice President, Product Management, Sports,
Vizrt
Chris Evans
Senior Market Analyst,
Futuresource Consulting