Multicast ABR combines two very beneficial technologies which are seldom seen together. ABR – Adaptive Bitrate – allows a player to change the bitrate of the video and audio it's playing to adapt to changing network conditions. Multicast is a network technology which sends a single copy of a video stream across the network rather than duplicating it for every viewer.
ABR has traditionally been deployed for chunk-based video like HLS, where each client downloads its own copy of the video in blocks several seconds in length. This means the bandwidth you use to distribute your video increases a thousandfold if 1,000 people play it.
Multicast works with continuous live streams rather than chunks, and allows the bandwidth used for 1,000 players to increase – in the best case – by 0%.
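The scaling difference above can be sketched with some simple arithmetic. This is an illustrative comparison only – the 5 Mbps bitrate and viewer count are assumed figures, not from the panel:

```python
STREAM_MBPS = 5   # assumed bitrate of one quality level (hypothetical)
VIEWERS = 1000

def unicast_bandwidth(viewers, mbps):
    """Chunk-based unicast ABR: each client pulls its own copy,
    so distribution bandwidth scales linearly with the audience."""
    return viewers * mbps

def multicast_bandwidth(viewers, mbps):
    """Multicast: one copy traverses each network link regardless
    of how many receivers have joined the group (the best case)."""
    return mbps if viewers > 0 else 0

print(unicast_bandwidth(VIEWERS, STREAM_MBPS))    # 5000 Mbps
print(multicast_bandwidth(VIEWERS, STREAM_MBPS))  # 5 Mbps
```

The same audience costs a thousand times more bandwidth over unicast than over multicast in this idealised model; real multicast deployments sit somewhere between the two depending on how much of the path supports it.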
Here, the panelists look at the benefits of combining multicast distribution of live video with techniques to allow it to change bitrate between different quality streams.
This type of live streaming is actually backwards compatible with old-style STBs. Since the video sent is a live transport stream, it's possible to deliver it to a legacy STB using an in-home converter while simultaneously delivering a better, more modern stream to other TVs and devices.
It thus also allows pure-streaming providers to compete with conventional broadcast cable providers, and can cut costs both in the equipment provided and in the bandwidth used.
There’s lots to unpack here, which is why the Streaming Video Alliance have put together this panel of experts.
Ericsson’s Netherlands CTO, Jeroen Buijis, talks to us about the past, present and future of 5G. With an increasing number of operators wanting to deploy OTT/IPTV solutions to homes using wireless technologies, and with cities increasingly densely populated, 5G is seen as a key way to enhance services and internet access in both urban and rural areas.
Covering the pros and cons of different network delivery technologies, this talk goes on to cover:
How available bandwidth relates to video streaming use
Date: Thursday November 30, 2017 – Ample Refreshments from 18:15 GMT for 19:00 start. Location: Ericsson Television, Strategic Park, Comines Way, Hedge End, Southampton, SO30 4DA.
With higher resolution, wider colour gamut and extended dynamic range, the new Ultra High Definition TV (UHD) standards define a container which allows content creators to offer the consumer a much more immersive visual experience. However, some artefacts have been noted within the container, particularly around HDR material. Olie Bauman outlines why Y'CbCr is used, the human visual system's response to changes in chroma and luminance, and the correlation between R, G and B.
As HDR and WCG expand the colour volume, he shows why it has grown from SD (601) to HD (709) to UHD (2020), and explains the difference between PQ (display-referred) and HLG (scene-referred) workflows.
From this background he shows examples of artefacts due to chroma down-sampling, and their different characteristics depending on workflow.
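The mechanism behind these artefacts can be sketched numerically. The following is a minimal illustration of 4:2:0-style chroma down-sampling in pure Python – the 4×4 chroma plane, its sample values, and the nearest-neighbour reconstruction are all assumptions for demonstration, not taken from the talk:

```python
def subsample_420(chroma):
    """Average each 2x2 block of chroma samples (4:2:0-style
    horizontal + vertical subsampling)."""
    h, w = len(chroma), len(chroma[0])
    return [[(chroma[y][x] + chroma[y][x + 1] +
              chroma[y + 1][x] + chroma[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

def upsample(sub):
    """Nearest-neighbour reconstruction back to full resolution."""
    return [[sub[y // 2][x // 2] for x in range(len(sub[0]) * 2)]
            for y in range(len(sub) * 2)]

# A hard vertical chroma edge: one column at 16, the rest at 240.
plane = [[16, 240, 240, 240] for _ in range(4)]
restored = upsample(subsample_420(plane))
print(restored[0])  # [128.0, 128.0, 240.0, 240.0]
```

The reconstructed edge contains the value 128, which appeared nowhere in the original: averaging across a sharp chroma transition invents intermediate colours. With PQ's highly non-linear transfer function, such intermediate code values can map to visibly wrong colours, which is one reason these artefacts are more noticeable in HDR material.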
He highlights that the problems will become greater as more content exploiting the full UHD container becomes available, requiring additional care and processing in content production and delivery.
Chaired by John Ive of the IABM, this meeting looks at 5G, which officially launches in 2020. How will it change the media landscape?
More bandwidth and better network performance are welcome, and could once again change how we view and use mobile devices – perhaps even outperforming fixed installations. However, 5G requires new network infrastructure, which raises questions about how quickly it can be rolled out.
Discussing this are:
Steve Plunkett, Chief Technology Officer, Ericsson