Video: Multicast ABR

Multicast ABR is a mix of two very beneficial technologies which are seldom seen together. ABR – Adaptive Bitrate – allows a player to change the bitrate of the video and audio it's playing to adapt to changing network conditions. Multicast is a network technology which sends a single copy of a video stream across the network to many receivers, avoiding duplicated bandwidth.

ABR has traditionally been deployed for chunk-based video like HLS, where each client downloads its own copy of the video in blocks several seconds in length. This means the bandwidth you use to distribute your video increases a thousandfold if 1,000 people play it.
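
To put illustrative numbers on that: a hypothetical 5 Mbit/s stream delivered by unicast to 1,000 viewers needs roughly 5 Mbit/s × 1,000 = 5 Gbit/s of distribution capacity, while a single multicast copy of the same stream stays at about 5 Mbit/s however many receivers join.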

Multicast works with continuous live streams rather than chunks, but allows the bandwidth used for 1,000 players to increase – in the best case – by 0%, because the network delivers the same single copy to everyone.
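
For those curious what "joining" a multicast stream looks like in practice, here is a minimal Python sketch, assuming a hypothetical group address and port; a real multicast ABR client sits well above this level, adding error repair and the logic to switch between quality levels.

```python
import socket
import struct

# Hypothetical multicast group and port carrying one quality level of the stream
GROUP = "239.1.2.3"
PORT = 5000

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Join the group: the network replicates the stream towards us, while the
# source still transmits only one copy regardless of how many receivers join.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    packet, _addr = sock.recvfrom(2048)  # e.g. RTP packets carrying a transport stream
    # ...hand the payload to a demuxer/decoder here...
```

In some multicast ABR designs, changing bitrate then amounts to leaving one group and joining the one carrying a different quality level, rather than requesting different chunks over HTTP.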

Here, the panelists look at the benefits of combining multicast distribution of live video with techniques that allow the player to switch bitrate between different quality streams.

This type of live streaming is also backwards compatible with old-style STBs. Since the video sent is a live transport stream, it can be delivered to a legacy STB using a converter in the home, at the same time as a better, more modern stream is delivered to other TVs and devices.

It thus allows pure-streaming providers to compete with conventional broadcast cable providers, and can bring cost savings both in the equipment provided and in the bandwidth used.

There’s lots to unpack here, which is why the Streaming Video Alliance have put together this panel of experts.

Watch now and find out more!

Speakers

Phillipe Carol
Senior Product Manager,
Anevia
Neil Geary
Technical Strategy Consultant,
Liberty Global
Brian Stevenson
VP of Ecosystem Strategy & Partnerships,
Ericsson
Mark Fisher
VP of Marketing & Business Development,
Qwilt
Jason Thibeault
Executive Director,
Streaming Video Alliance

Video: Blockchain & the Hollywood Supply Chain

At The Broadcast Knowledge, we’re continuing to cut through the hype and get to the bottom of blockchain. Now part of the NAB drinking game along with words like AI and 5G, it’s similarly not going away. The principle of blockchain is useful – just not useful everywhere.

So what can broadcasters do with blockchain, and – given this is a SMPTE talk – what can film studios do with it? There's no doubt that blockchain makes secure, trusted systems possible, so the mind immediately jumps to using it to ensure all the files needed to create a film are distributed securely and with an audit trail.
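
As a toy illustration of that audit-trail idea (my own sketch, not any studio's actual system), the Python below chains delivery records together with hashes, so that altering any earlier record invalidates every record after it.

```python
import hashlib
import json
import time

def record_delivery(chain, filename, recipient):
    """Append a delivery record whose hash covers the previous record."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    entry = {
        "filename": filename,
        "recipient": recipient,
        "timestamp": time.time(),
        "prev_hash": prev_hash,
    }
    entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    chain.append(entry)
    return entry

def verify(chain):
    """Recompute every hash; any edited record breaks the chain."""
    for i, entry in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {k: v for k, v in entry.items() if k != "hash"}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev_hash"] != expected_prev or entry["hash"] != recomputed:
            return False
    return True

chain = []
record_delivery(chain, "reel_001.mxf", "vfx_vendor_a")    # hypothetical assets/recipients
record_delivery(chain, "reel_002.mxf", "dubbing_house_b")
print(verify(chain))  # True until any record is altered
```

A real deployment distributes copies of such a ledger across the parties involved, which is where the "trusted without a single gatekeeper" property comes from.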

Here, Steve Wong looks at this but also explores the new possibilities it creates. He starts with the basics of what blockchain is and how it works, but soon moves on to how this could work for Hollywood, explaining what could exist and what already does.

Speaker

Steve Wong
Cloud & Platform Services General Manager, Telecom, Media & Technology
DXC Technology

Video: Holographic update: Light Fields and the Future of Video

Recording light fields sounds like sci-fi: it allows you to record a video and then move around within that video as you please, changing your viewing angle and position. This is why it's also referred to as holography.

It works by recording the video from many different viewpoints rather than just one angle. Processing all of these different videos of the same scene allows a computer to build a 3D video model, which you can then watch using VR goggles or a holographic TV.
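
To make that concrete, here is a rough sketch of how such a capture might be represented, assuming a hypothetical 8×8 camera rig; real light field renderers interpolate between views and use depth information, but snapping to the nearest camera shows the basic idea of choosing your viewpoint.

```python
import numpy as np

# Hypothetical rig: an 8x8 grid of cameras, each capturing a small RGB frame.
# One moment of the light field is then a 5-D array:
# (camera_row, camera_col, image_height, image_width, colour_channel)
CAM_ROWS, CAM_COLS = 8, 8
light_field_frame = np.zeros((CAM_ROWS, CAM_COLS, 270, 480, 3), dtype=np.uint8)

def view_from(u: float, v: float) -> np.ndarray:
    """Return the camera image nearest to a normalised viewer position (u, v) in [0, 1]."""
    row = int(round(u * (CAM_ROWS - 1)))
    col = int(round(v * (CAM_COLS - 1)))
    return light_field_frame[row, col]

image = view_from(0.5, 0.25)  # roughly what you'd see standing at that position
print(image.shape)            # (270, 480, 3)
```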

In this talk from San Francisco Video Tech, Ryan Damm from Visby.io talks us through some of the basics of light fields and brings us up to date with the current status. Google, Microsoft and Intel are some of the big players investing in R&D, alongside many smaller startups.

Ryan talks about the need for standardisation for light fields. The things we take for granted in 2D video are compared with what's available for light field video, by way of explaining the challenges and approaches being seen today in this active field.

Watch now and learn!

Speaker

Ryan Damm
Co-founder,
Visby

Video: A Basic Guide For Real-Time IP Video

There are a lot of videos looking into the details of uncompressed video over IP, but not many for those still starting out – and let's face it, there are a lot of people who are only just embarking on this journey. Here, Andy Jones takes us through the real basics, which prove very useful as a building block for understanding today's IP technologies.

Andy Jones is well known to many broadcast engineers in the UK, having spent many, many years working in the BBC's Training and Development department and subsequently running training for the IABM. The news that he passed away on Saturday is very saddening, and I'm posting this video in recognition of the immense amount he contributed to the industry through his years of tireless work. You can see from this video from NAB 2018 his passion, energy and ability to make complicated things simple.

In this talk, Andy looks at the different layers that networks operate on, including the physical layer, i.e. the cables. This is because the different ways in which traffic gets from A to B in networking are interdependent and need to be considered as such. He looks at an example network which shows all the different standards in use in an IP network, and talks about their relevance.

Andy briefly looks at IP addresses and the protocol that makes them work, which underpins much of what happens on most networks, before looking at the Real-time Transport Protocol (RTP), which is heavily used for sending audio and video streams.
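
For a feel for what RTP adds on top of plain UDP/IP, here is a minimal Python sketch that unpacks the fixed 12-byte RTP header defined in RFC 3550; the sequence number and timestamp are what let a receiver detect loss and reconstruct timing for the audio or video payload (the example packet bytes below are made up).

```python
import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Unpack the fixed 12-byte RTP header (RFC 3550)."""
    if len(packet) < 12:
        raise ValueError("too short to be an RTP packet")
    b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,            # should be 2
        "padding": bool(b0 & 0x20),
        "extension": bool(b0 & 0x10),
        "csrc_count": b0 & 0x0F,
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,     # e.g. dynamic types 96+ for video/audio
        "sequence_number": seq,        # detects loss and reordering
        "timestamp": timestamp,        # media clock, e.g. 90 kHz for video
        "ssrc": ssrc,                  # identifies the stream source
    }

# Hypothetical header: version 2, payload type 96, sequence 1, timestamp 0, SSRC 0x11223344
example = bytes([0x80, 0x60, 0x00, 0x01, 0, 0, 0, 0, 0x11, 0x22, 0x33, 0x44])
print(parse_rtp_header(example))
```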

After looking at how timing is done in IP (as opposed to black and burst), he has laid enough foundations to look at SMPTE ST 2110 – the suite of standards that shows how different media (essences) are sent over networks delivering uncompressed streams. AES67 for the audio is also covered, before a look at how to control the whole kit and caboodle.

A great primer for those starting out, watch now!

Speaker

Andy Jones