Video quality is measured daily around the world by comparing two video assets. But what happens when you want to assess the aggregate quality of a whole manifest? With VMAF being a well-regarded metric, how can we use it in an automatic way to get the overview we need?
In this talk, Nick Chadwick from Mux shares the examples and scripts he’s been using to analyse videos. Starting with an example where everything is equal other than quality, he explains the difficulty of choosing the ‘better’ option when the variables are much less correlated. For instance, Nick examines situations where a video is clearly better, but where the minimal quality improvement is outweighed by the disproportionately high bitrate it requires.
So with all of this complexity, it feels like comparing manifests may be a complexity too far, particularly where one manifest has 5 renditions and the other only 4. The question is: how do you create an aggregate video quality metric and determine whether that missing rendition is a detriment or a benefit?
Before unveiling the final solution, Nick makes the point of looking at how people are going to be using the service. Depending on the demographic and the devices people tend to use for that service, you will find different consumption ratios for the various parts of the ABR ladder. For instance, some services may see very high usage on second screens which, in this case, may take low-resolution video and also a lot of ‘TV’-size renditions at 1080p50 or above, with little in between. Similarly, other services may seldom see the highest resolutions being used, percentage-wise. This shows us that it’s important to look not only at the quality of each rendition but at how likely it is to be seen.
To bring these thoughts together into a coherent conclusion, Nick unveils an open-source analyser which takes into account not only the VMAF score and the resolution but also the likely viewership such that we can now start to compare, for a given service, the relative merits of different ABR ladders.
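As a rough illustration of the idea (not Nick’s actual analyser), the core of such a metric can be sketched as a watch-time-weighted average of per-rendition VMAF scores. All the rendition scores and viewership shares below are made-up numbers for illustration:

```python
# Hypothetical sketch: aggregate quality as a viewership-weighted VMAF.
# Each ladder is a list of (vmaf_score, watch_time_share) per rendition;
# all figures below are invented for illustration.

def aggregate_vmaf(renditions):
    """Weight each rendition's VMAF by the share of watch time it receives."""
    total_share = sum(share for _, share in renditions)
    return sum(vmaf * share for vmaf, share in renditions) / total_share

# A 4-rendition ladder and a 5-rendition ladder for the same (imaginary) service.
ladder_a = [(96.0, 0.10), (93.0, 0.35), (88.0, 0.40), (75.0, 0.15)]
ladder_b = [(97.0, 0.10), (94.0, 0.30), (90.0, 0.35), (82.0, 0.20), (70.0, 0.05)]

print(round(aggregate_vmaf(ladder_a), 2))  # 88.6
print(round(aggregate_vmaf(ladder_b), 2))  # 89.3
```

Under this weighting, the extra rendition in the second ladder only helps if the quality it adds lands where viewers actually spend their time, which is exactly the comparison the talk is driving at.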
The talk ends with Nick answering questions on the tendency to see jumps between different resolutions – for instance, if we over-optimise and only have two renditions, the switch would be easy to see – on how to compare videos of different resolutions, and on his example user data.
How does the move to OTT delivery impact the traditional platforms? Are there too many streaming services? This session looks at the new platforms, the consumer experience, the role of aggregation and the way that operators have been involved in de-aggregation and then re-aggregation of channel packages both in competition and in cooperation.
How many subscription services are too many for a household? There’s some thinking that 3 may be the typical maximum, when people tend to switch to a ‘one in, one out’ policy on subscription packages. Colin Dixon says the average is currently 2 in the UK and Germany. The panel asks whether we should have as many and compares the situation with audio, where ‘super aggregation’ rules. Services like Apple Music and Spotify rely on aggregating ‘all’ music, and consumers don’t subscribe separately to listen to Sony artists on one service and EMI artists on another, so what is it that drives video to be different, and will it stay that way?
The topic then switches to smart TVs, discussing the feeling that five to eight years ago they had a go at app stores and ended up disappointing. Not only were they often clunky at the time, but support has, on the whole, now gone from the manufacturers. Is the current wave of smart TVs any different? From BT’s perspective, explains Colin Phillips, it’s very costly to keep many different versions of an app up to date and tested, so a uniform platform across multiple TVs would be a lot better.
The talk concludes by looking at the future for Disney+, Netflix and other providers before discussing predictions from industry analysts.
It’s very clear that internet streaming is growing, often resulting in a loss of viewership by traditional over-the-air broadcast. This panel explores the progress of IP-delivered TV, the changes in viewing habits this is already prompting and looks at the future impacts on broadcast television as a result.
Speaking at the IABM Theatre at IBC 2019, Ian Nock, chair of IET Media, sets the scene. He highlights stats such as 61% of Dutch viewing being non-linear and DirecTV publicly declaring they ‘have bought their last transponder’, and discusses the full-platform OTT services available in the marketplace now.
To add detail to this, Ian is joined by DVB, the UK’s DTG and Germany’s Deutsche TV-Plattform, which is dealing with the transformation to IP within Germany. Yvonne Thomas, from the Digital Television Group, takes to the podium first, starting with the youngest part of the population, who have a clear tendency to watch streamed services over broadcast compared to other generations. Yvonne talks about research showing UK consumers being willing to have 3 subscriptions to media services, which is not in line with the number and fragmented nature of the options. She then finishes with the DTG manifesto for a consolidated, and thus simplified, way of accessing multiple services.
Peter Siebert from DVB looks at viewing time averaged over Europe, which shows that the amount of time spent watching linear broadcast is actually staying stable – as is the amount of time spent watching DVDs. He also highlights the fact that the TV itself is still very much the most-used device for watching media, even if it’s not RF-delivered, and it still provides the best quality of video and a shared experience. Looking at history to understand the future, Peter shows a graph of cinema popularity before and after the introduction of television. Cinema was, indeed, impacted, but importantly it did not die. We are left to conclude that his point is that linear broadcast will similarly not disappear, but simply have a different place in the future.
Finally, Andre Prahl, head of the panel session, explains the role of the Deutsche TV-Plattform, which is focussing on ‘media over IP’ with respect to the delivery of video to end users, both in terms of internet bandwidth and Wi-Fi frequencies within the home.
HLS has taken the world by storm since its first release 10 years ago. Capitalising on the widely understood and deployed technologies already underpinning websites at the time, it brought with it great scalability and the ability to seamlessly move between different bitrate streams to help deal with varying network performance (and computer performance!). In the beginning, streaming latency wasn’t a big deal, but with multi-million-pound sports events now routinely streamed, this has changed and is one of the biggest challenges in streaming media today.
Low-Latency HLS (LL-HLS) is Apple’s way of bringing latency down to be comparable with broadcast television for those live broadcasts where immediacy really matters. The release of LL-HLS came as a blow to the community-driven moves to deliver lower latency and, indeed, to adoption of MPEG-DASH’s CMAF. But as more light was shone on the detail, more questions arose about how this was actually going to work in practice.
Marina Kalkanis from M2A Media explains how they have been working with DAZN and Akamai to get LL-HLS working and what they are learning in this pilot project. Choosing the new segment sizes and how they are delivered is a key first step in ensuring low latency. M2A are testing 320ms segment sizes, which mean very frequent requests for playlists and quickly growing playlist files; both are issues which need to be managed.
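A quick back-of-envelope calculation shows why 320ms segments put pressure on both the request rate and playlist length. The 320ms figure is from the talk; the playlist window below is an assumed value for illustration:

```python
# Back-of-envelope numbers for short LL-HLS segments. The 320 ms duration
# is from the talk; the 30 s live playlist window is an illustrative assumption.

SEGMENT_DURATION_S = 0.32   # segment size M2A are testing
WINDOW_S = 30.0             # assumed live playlist window

# A player polling for each new segment makes roughly one playlist
# request per segment interval.
requests_per_sec = 1 / SEGMENT_DURATION_S

# Every segment in the window needs an entry, so the playlist grows
# accordingly before shortening is applied.
parts_in_window = WINDOW_S / SEGMENT_DURATION_S

print(f"{requests_per_sec:.2f} playlist requests/s per player")
print(f"{parts_in_window:.0f} segment entries in a {WINDOW_S:.0f}s playlist")
```

Roughly three playlist requests per second per player, multiplied across a large audience, is exactly why the playlist shortening and CDN integration discussed next become necessary.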
Marina explains the use of playlist shortening, the use of HTTP/2 push to reduce latency, integration into the CDN and what the CDN is required to do. She finishes by explaining how they are conducting the testing and the status of the project.