Video: Subjective and Objective Quality Assessment

Video quality is a key part of the user experience, so understanding how the different parts of your distribution chain can affect your video in different ways is important both for maintaining the quality of your service and for finding faults quickly when problems are reported.

Abdul Rehman from SSIMWAVE speaks at the Kitchener-Waterloo Video Technology Meetup, explaining both subjective quality assessment, where humans judge the quality of the video, and objective quality assessment, where computers analyse what is often terabytes of video to assess its quality.

Starting with a video showing examples of the different problems that can occur in the chain, Abdul explains how many things can go wrong, including lost or delayed data, incorrect content and service configuration. Display devices now come in many shapes, sizes and resolutions, which can in turn introduce impairments on display, as can the player and the viewing conditions. These are only around half of the possibilities, which also include the type of viewer – a golden eye, or a pure consumer.

In order to test your system you may need test codecs, and you will certainly need test content. Abdul talks about subject-rated databases, which contain images with certain types of distortions and impairments. After showing many examples of problem images, Abdul asks how to deal with natural images that simply look similar to distorted ones, and with the deliberate use of distortion for creative purposes.

Subjective video quality assessment is one solution to this, since it uses people, who are much better than computers at recognising creative intent. This avoids many false positives where a video might be judged as bad even though the distortion is deliberate, and it also represents direct feedback from your target audience. Abdul talks through the different aspects you need to control for when running subjective video quality assessment in order to maximise its usefulness and allow results from different sessions and experiments to be directly compared.

This is contrasted with objective video quality assessment, where a computer is harnessed to plough through the videos. This can be very effective for many applications, shining in terms of throughput and the sheer number of measurements, and it makes regression testing very easy. The downsides can be cost, false positives and, depending on the application, speed. You can then take your pick of algorithms such as MS-SSIM, VMAF and others. Abdul finishes by explaining more about the benefits and what to look out for.
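As an illustration of the objective workflow, here is a minimal sketch in Python (not a method from the talk) that compares a distorted encode against its reference frame by frame using plain SSIM, assuming opencv-python and scikit-image are installed and the two files are frame-aligned. Production measurement would typically use MS-SSIM or VMAF via dedicated tooling such as FFmpeg's libvmaf filter, but the shape of the job is the same: decode both videos, compare each frame, aggregate the scores.

```python
# Minimal per-frame objective quality sketch; file names are hypothetical.
import cv2
from skimage.metrics import structural_similarity

def mean_ssim(reference_path: str, distorted_path: str) -> float:
    ref_cap = cv2.VideoCapture(reference_path)
    dis_cap = cv2.VideoCapture(distorted_path)
    scores = []
    while True:
        ok_ref, ref_frame = ref_cap.read()
        ok_dis, dis_frame = dis_cap.read()
        if not (ok_ref and ok_dis):
            break  # stop at the end of the shorter stream
        # Compare on luma only, as most quality metrics do
        ref_gray = cv2.cvtColor(ref_frame, cv2.COLOR_BGR2GRAY)
        dis_gray = cv2.cvtColor(dis_frame, cv2.COLOR_BGR2GRAY)
        scores.append(structural_similarity(ref_gray, dis_gray))
    ref_cap.release()
    dis_cap.release()
    return sum(scores) / len(scores) if scores else float("nan")

# Hypothetical file names, for illustration only
print(mean_ssim("reference.mp4", "distorted.mp4"))
```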

Watch now!
Speakers

Abdul Rehman
Cofounder, CEO and CTO,
SSIMWAVE

Webinar: Assessing Video Quality: Methods, Measurements, and Best Practices

Wednesday, November 13th, 8am PST / 16:00 GMT

Bitmovin have brought together Jan Ozer from the Streaming Learning Center, their very own Sean McCarthy, and Carlos Bacquet from SSIMWAVE to discuss how best to assess video quality.

Fundamental to assessing video quality, of course, is deciding what we mean by quality, which artefacts are most problematic and what drives the importance of video quality in the first place.

Quality of streaming, of course, is bound up with the quality of the experience in general. Thinking of an online streaming system as a whole, speed of playback start, smooth playback in the player itself and rebuffering are all factors of perceived quality just as much as the encoding quality itself, which is what is more traditionally measured.
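As a rough illustration of how those player-side factors become numbers, the sketch below derives startup time and a rebuffering ratio from a hypothetical list of player events; the event names and fields are invented for this example rather than taken from any particular player or analytics SDK.

```python
# Illustrative session-level QoE factors from hypothetical player events.
from dataclasses import dataclass

@dataclass
class PlayerEvent:
    name: str         # e.g. "play_requested", "first_frame", "stall_start", "stall_end"
    timestamp: float  # seconds since session start

def startup_time(events: list[PlayerEvent]) -> float:
    # Time from the user pressing play to the first frame being shown
    requested = next(e.timestamp for e in events if e.name == "play_requested")
    first_frame = next(e.timestamp for e in events if e.name == "first_frame")
    return first_frame - requested

def rebuffering_ratio(events: list[PlayerEvent], session_length: float) -> float:
    # Fraction of the session spent stalled waiting for the buffer to refill
    stalled = 0.0
    stall_start = None
    for e in events:
        if e.name == "stall_start":
            stall_start = e.timestamp
        elif e.name == "stall_end" and stall_start is not None:
            stalled += e.timestamp - stall_start
            stall_start = None
    return stalled / session_length if session_length else 0.0
```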

The webinar brings together experience in measuring quality and monitoring systems, along with ways in which you can devise your own testing to home in on the factors which matter to you and your business.

See the related posts below for more from Jan Ozer

Register now!
Speakers

Jan Ozer
Industry Analyst,
Streaming Learning Center
Sean McCarthy
Technical Product Marketing Manager,
Bitmovin
Carlos Bacquet
Solutions Architect,
SSIMWAVE

Video: A Standard for Video QoE Metrics

Steve Heffernan of Mux describes a standard in progress for quality of experience metrics, such as rebuffering time, being developed under the CTA standards body. The goal of the group is to come up with a standard set of player events, metrics and terminology around QoE for streaming. Even concurrent viewers isn't as easy to define as it sounds: if a user has paused, are they still concurrently viewing the video? Buffer underruns are variously called rebuffering, stalling or waiting. The group intentionally focuses on what viewers actually see and experience; QoS, by contrast, is a measurement of how well the platform is performing, which is not necessarily the same as what viewers are experiencing.

The standard works at different levels. There are player properties and events, which are standardised ways of signalling that certain things are happening, and Session Metrics, which can then feed into Aggregate Metrics. The first set of metrics includes playback failure percentage, average playback stalled rate, average startup time and playback rate, with the aim of establishing a baseline and starting to get feedback from companies as they implement these seemingly simple metrics.
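To illustrate the idea of session metrics feeding aggregate metrics, here is a small Python sketch; the field names and formulas are simplifications invented for this example, not the group's normative definitions, so consult the published specification for the real wording.

```python
# Illustrative aggregation of per-session metrics; not the CTA definitions.
from dataclasses import dataclass

@dataclass
class SessionMetrics:
    playback_failed: bool  # session never reached playback
    startup_time: float    # seconds from request to first frame
    stalled_time: float    # total seconds spent rebuffering
    watch_time: float      # total seconds of playback in the session

def aggregate(sessions: list[SessionMetrics]) -> dict:
    if not sessions:
        return {}
    started = [s for s in sessions if not s.playback_failed]
    total_watch = sum(s.watch_time for s in started)
    return {
        "playback_failure_percentage":
            100.0 * sum(s.playback_failed for s in sessions) / len(sessions),
        "average_startup_time":
            sum(s.startup_time for s in started) / len(started) if started else None,
        "average_playback_stalled_rate":
            sum(s.stalled_time for s in started) / total_watch if total_watch else None,
    }
```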

This first release can be found on GitHub.

Watch now!
Speaker

Steve Heffernan
Co-Founder, Head of Product,
Mux

Video: Broadcast and OTT monitoring: The challenge of multiple platforms


Is it possible to monitor OTT services to the same standard as traditional broadcast services? How can they be visualised, what are the challenges and what makes monitoring streaming services different?

As with traditional broadcast, some broadcasters outsource the distribution of their streaming services to third parties. Whilst this can work well in broadcast, in streaming a channel would be missing out on a huge opportunity if it didn't also gather analytics on how viewers use the service, so to some extent a broadcaster always wants to look at the whole chain. Even when distribution is not outsourced and the OTT system has been developed and is run by the broadcaster, at some point a third party will be involved, typically the CDN and/or the edge network. A broadcaster would do well to monitor the video at all points through the chain, right up to the edge.

The reason for monitoring is to keep viewers happy and, by doing so, reduce churn. When analytics from a player tell you something isn't right, it's only natural to want to find out what went wrong, and to know that you will need monitoring in your distribution chain. With that monitoring in place, you can be much more proactive in resolving issues and improving your service overall.

Jeff Herzog from Verizon Digital Media Services explains ways to achieve this and the benefits it can bring. After a primer on HLS streaming, he explains ways to monitor the video itself and also how to monitor everything but the video as a light-touch monitoring solution.

Jeff explains that, because HLS is based on playlists and on files being available, you can learn a lot about your service just by monitoring these small text files, parsing them and checking that all the files they reference are available with minimal wait times. By doing this and other tricks, you can successfully gauge how well your service is working without the difficulty of dealing with large volumes of video data. The talk finishes with some examples of what this monitoring can look like in action.
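For a flavour of what such light-touch playlist monitoring might look like, here is a small Python sketch, assuming the requests library is available; the playlist URL is hypothetical and the parsing is deliberately naive (it ignores master playlists, byte ranges and encryption keys).

```python
# Poll an HLS media playlist and check that every segment it lists is reachable.
import time
from urllib.parse import urljoin
import requests

def check_media_playlist(playlist_url: str, timeout: float = 5.0) -> list[dict]:
    text = requests.get(playlist_url, timeout=timeout).text
    # Non-comment, non-empty lines in a media playlist are segment URIs
    segment_uris = [line.strip() for line in text.splitlines()
                    if line.strip() and not line.startswith("#")]
    results = []
    for uri in segment_uris:
        url = urljoin(playlist_url, uri)
        start = time.monotonic()
        resp = requests.head(url, timeout=timeout)
        results.append({
            "segment": url,
            "status": resp.status_code,
            "latency_ms": 1000 * (time.monotonic() - start),
        })
    return results

# Hypothetical URL, for illustration only:
# for result in check_media_playlist("https://example.com/live/stream.m3u8"):
#     print(result)
```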

This talk was given at the SMPTE Annual Technical Conference 2018.
For more OTT videos, check out The Broadcast Knowledge's YouTube OTT playlist.
Speakers

Jeff Herzog
Senior Product Manager, Video Monitoring & Compliance,
Verizon Digital Media Services