Video: How IP is Revolutionising Sports Video Production

IP production is increasingly important for sports streaming, including esports, where its flexibility is a big plus over SDI infrastructure. This panel discusses NDI and SMPTE ST 2110.

Esports, in particular, uses many cameras: point-of-view cameras, PC outputs and the normal camera positions needed to make a good show. A technology like NDI really helps keep costs down, since every SDI port is expensive and takes space, plus it allows computers to ‘natively’ send video without specific hardware.

NDI is an IP specification from NewTek (now owned by Vizrt) which can be licensed for free and is supported by Ross, Vizrt, Panasonic, OBS, Epiphan and hundreds more. It allows ultra-low-latency video at 100Mbps or low-latency video at 8Mbps.

The panel discusses the right place and use for NDI compared to SDI. In the right places, such as stadia, networking is more convenient; but if you only have a short distance to run, SDI can often be the best plan. Similarly, until NDI version 4, which adds timing synchronisation, ST 2110 has been a better bet for synchronised video such as ISO recordings.

For many events which combine many cameras with computer outputs, whether computers playing YouTube, Skype or something else, removing the need to convert to SDI makes the production much more flexible.

The panel finishes by discussing audio and taking questions from the floor, covering issues such as embedded alpha, further ST 2110 considerations and UHD workflows.

Watch now!
Speakers

Philip Nelson
President,
Nelco Media
Mark East
Chief Problem Solver,
090 Media
Victor Borachuk
Director/Executive Producer,
JupiterReturn
Jack Lavey
Operations Technician,
FloSports
Jon Raidel
Technical Operations Manager,
NFL Network

Video: Deploying CMAF In 2019

It’s all very well saying “let’s implement CMAF”, but what’s been implemented so far and what can you expect in the real world, away from hype and promises? RealEyes took the podium at the Video Engineering Summit to explain.

CMAF represents an evolution of the tried and tested technologies HLS and DASH. Built upon the well-worn tenets of HTTP and offering massive scalability, these still-evolving technologies gave rise to Netflix and a whole thriving industry. CMAF stands for the Common Media Application Format and was created to allow both HLS and DASH to be delivered from one common segment format. But the push to reduce latency further and further has resulted in CMAF being better known for its low-latency form, which can deliver streams with five to ten times lower latencies.
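The “one common format” idea can be made concrete: with CMAF, the origin stores a single set of fragmented-MP4 segments, and both an HLS playlist and a DASH MPD simply reference those same files. A hypothetical HLS media playlist over such segments (the filenames here are invented for illustration):

```
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:6
#EXT-X-MAP:URI="video_init.mp4"
#EXTINF:6.0,
video_1.m4s
#EXTINF:6.0,
video_2.m4s
```

A DASH MPD would point a `SegmentTemplate` at the very same `video_init.mp4` and `video_$Number$.m4s` files, so only the manifests differ per client while the segments are encoded, packaged and cached once.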

John Gainfort tackles explaining CMAF and highlights all the non-latency-related features before tackling its low-latency form. We look at what it is (a media format) and where it came from (ISO BMFF) before diving into the current possibilities and the ‘to do list’, notably DRM.

Before the Q&A, John moves on to how CMAF is implemented to deliver low-latency streams: what to expect in terms of latency, and the future items which, when achieved, will deliver the full low-latency experience.

Watch now!

Speaker

John Gainfort
Development Manager,
RealEyes

Video: Making Live Streaming More ‘Live’ with LL-CMAF

Squeezing streaming latency down to just a few seconds is possible with CMAF. Bitmovin guides us through what’s possible now and what’s yet to come.

CMAF represents an evolution of the tried and tested technologies HLS and DASH. Built upon the well-worn tenets of HTTP and offering massive scalability, these still-evolving technologies gave rise to Netflix and a whole thriving industry. But the push to reduce latency further and further has resulted in CMAF, which can be used to deliver streams with five to ten times lower latencies.

Paul MacDougall is a Solutions Architect with Bitmovin, so he is well placed to explain the application of CMAF. Starting with a look at what we mean by low latency, he shows that it’s still quite possible to find HLS latencies of up to a minute, though more common latencies now are closer to 30 seconds. Five seconds is the golden figure, matching many broadcast mechanisms including digital terrestrial, so it’s no surprise that this is where low-latency CMAF is aimed.
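The latencies Paul quotes follow from simple arithmetic: a traditional HLS player typically buffers several whole segments before starting playback, so glass-to-glass delay is dominated by the segment duration. A rough, illustrative calculation (the encode and CDN figures are assumptions for the example, not Bitmovin’s numbers):

```python
# Rough glass-to-glass latency estimate for traditional (non-chunked) HLS.
# All figures are illustrative assumptions.

def hls_latency(segment_duration_s, buffered_segments=3,
                encode_s=1.0, cdn_s=0.5):
    """Latency is dominated by the player buffering whole segments."""
    return encode_s + cdn_s + buffered_segments * segment_duration_s

# Classic 10-second segments: around 30 seconds
print(hls_latency(10))   # 31.5
# 6-second segments (a common modern default): around 20 seconds
print(hls_latency(6))    # 19.5
# Even 2-second segments can't reach ~5s; chunked CMAF is needed for that
print(hls_latency(2))    # 7.5
```

Shrinking segments alone hits a floor, which is why the low-latency work changes how segments are produced and delivered rather than just making them smaller.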

CMAF itself is simply a format which unites HLS and DASH under one standard. It doesn’t, in and of itself, mean your stream will be low latency. In fact, CMAF was born out of MPEG’s MP4 standard, officially called ISO BMFF. But you can use CMAF in a low-latency mode, which is what this talk focusses on.
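Because CMAF segments are ISO BMFF (fragmented MP4), each one is just a sequence of length-prefixed boxes, and a low-latency chunk is a small `moof`+`mdat` pair. A minimal sketch of walking the top-level boxes (illustrative only; real parsers must also handle 64-bit sizes and nested boxes):

```python
import struct

def iter_boxes(data):
    """Yield (type, payload) for top-level ISO BMFF boxes.

    Each box starts with a 32-bit big-endian size (which includes
    the 8-byte header) followed by a 4-character type code.
    """
    pos = 0
    while pos + 8 <= len(data):
        size, btype = struct.unpack_from(">I4s", data, pos)
        yield btype.decode("ascii"), data[pos + 8:pos + size]
        pos += size

def box(btype, payload=b""):
    """Build a toy box for demonstration purposes."""
    return struct.pack(">I4s", 8 + len(payload), btype) + payload

# A toy "segment": an styp box followed by one moof/mdat chunk pair
segment = (box(b"styp", b"cmfc")
           + box(b"moof", b"\x00" * 16)
           + box(b"mdat", b"frame-data"))
print([t for t, _ in iter_boxes(segment)])  # ['styp', 'moof', 'mdat']
```

In low-latency delivery, each `moof`/`mdat` pair can be pushed to the player as soon as it is encoded, rather than waiting for the whole segment to finish.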

Paul looks at what makes up the latency of a typical feed, discussing encoding times, playback buffering and the other key contributors. With this groundwork laid, it’s time to look at the way CMAF is chunked and formatted, showing that the smaller chunk sizes allow the encoder and player to be more flexible, reducing several types of latency to only a few seconds.
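The effect of chunking can be sketched numerically: with whole segments, the encoder, CDN and player each hold a full segment before passing it on; with CMAF chunks they only hold one chunk. An illustrative comparison (the stage model and figures are assumptions, not from the talk):

```python
def pipeline_latency(hold_per_stage_s, stages=("encoder", "cdn", "player")):
    """Each stage must fully receive its unit of media before forwarding
    it, so the minimum latency is roughly one unit duration per stage."""
    return len(stages) * hold_per_stage_s

segment_s = 6.0   # holding whole CMAF segments
chunk_s = 0.5     # holding one CMAF chunk (a few frames)

print(pipeline_latency(segment_s))  # 18.0 -> tens of seconds end to end
print(pipeline_latency(chunk_s))    # 1.5  -> low single digits
```

The content on the wire is the same; only the unit each stage waits for shrinks, which is why the same segments can serve both standard and low-latency players.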

In order to take full advantage of CMAF, the player needs to understand it, and Paul explains these adaptations before moving on to the limitations and challenges of using CMAF today. One important change, for instance, concerns adaptive bitrate (ABR) switching. Chunked-streaming players (i.e. HLS) have always timed the download of each chunk to get a feel for whether bandwidth was plentiful (the download was quicker than the time taken to play the chunk) or constrained (the chunk arrived slower than real time). Based on this, the player could choose to increase or decrease the bandwidth of the stream it was accessing which, in HLS, means requesting a chunk from a different playlist. But with small chunks delivered by real-time transfer techniques such as HTTP/1.1 chunked transfer, the data arrives at roughly the encoder’s real-time rate rather than the link speed. This makes it very hard to make ABR work for LL-CMAF, though there are approaches being tested and trialled which aren’t mentioned in the talk.
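The bandwidth-estimation problem can be seen with a toy calculation: when a whole segment is downloaded, the download time reflects the link speed; when a live segment trickles out chunk by chunk at the encoder’s pace, the naive estimate collapses toward the stream’s own bitrate. A hypothetical sketch (all figures invented for illustration):

```python
def naive_abr_estimate(segment_bits, download_time_s):
    """Classic HLS-style throughput estimate: bits over wall-clock time."""
    return segment_bits / download_time_s

segment_bits = 6 * 3_000_000   # a 6-second segment at 3 Mbps
link_bps = 20_000_000          # actual link capacity: 20 Mbps

# Whole-segment download: limited by the link, so the estimate is honest.
whole_segment_time = segment_bits / link_bps
print(naive_abr_estimate(segment_bits, whole_segment_time) / 1e6)  # 20.0

# Chunked transfer of a live stream: bytes arrive as they are encoded,
# so the "download" takes ~6s however fast the link is.
chunked_time = 6.0
print(naive_abr_estimate(segment_bits, chunked_time) / 1e6)  # 3.0
```

With the naive method, the player sees 3 Mbps either way and has no signal that it could safely switch up, which is why LL-CMAF players need different measurement strategies.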

Watch now!

Speakers

Paul MacDougall Paul MacDougall
Solutions Architect,
Bitmovin

Video: Remote Production In Pajamas

Remote production (AKA REMI, for Remote Integration Model) has been discussed for a long time, but what’s practical today? Teradek, Brandlive and Vimond share their experiences making it work.

The main benefit of remote production is reducing costs by keeping staff at base instead of sending them to the event. Switching video, adding graphics and publishing are all possible in the cloud, but how practical this all is, and which people stay behind, very much depends on the company: its quality standards, workflows, the complexity of the programme and so on.

This panel at Streaming Media East looks at when remote production is appropriate, how much a service provider needs to be present on site, redundancy and the role of standards, in a wide-ranging discussion of the topic.

Watch now!

Speakers

Jon Landman
VP of Sales,
Teradek
Megan Wagoner
VP of Sales,
Vimond
Mark Adams
SVP Sales & Marketing,
Brandlive
Kevin McCarthy
Moderator
Director of Production,
VideoLink LLC