Video: Subjective and Objective Quality Assessment

Video quality is a key part of user experience, so understanding how the different parts of your distribution chain can affect your video in different ways is important both for maintaining the quality of the service and for finding faults quickly when problems are reported.

Abdul Rehman from SSIMWAVE speaks at the Kitchener-Waterloo Video Technology Meetup, explaining both subjective quality assessment, where humans judge the quality of the video, and objective quality assessment, where computers analyse video, often terabytes of it, to assess its quality.

Starting with a video showing examples of different problems that can occur in the chain, Abdul explains how many things can go wrong, including lost or delayed data, incorrect content and service misconfiguration. Display devices nowadays come in many shapes, sizes and resolutions, which can in turn cause impairments on display, as can the player and the viewing conditions. And these are only around half of the possibilities, which also include the type of viewer: a 'golden eye' or an ordinary consumer.

In order to test your system, you may need test codecs and you will certainly need test content. Abdul talks about subject-rated databases, which contain images with known types of distortions and impairments. After showing many examples of problem images, Abdul asks how to deal with natural images that happen to look distorted, or with the deliberate use of distortion for creative purposes.

Subjective video quality assessment is one solution to this, since people are much better than computers at recognising creative intent. As such, it avoids many false positives where a video might be judged as bad even though the distortion is intentional. Moreover, it also represents direct feedback from your target audience. Abdul talks through the different aspects you need to control for when running subjective video quality assessment, in order to maximise its usefulness and allow results from different sessions and experiments to be compared directly.

Compare this with objective video quality assessment, where a computer is harnessed to plough through the videos. This can be very effective for many applications, shining in terms of throughput and the sheer number of measurements, and it makes regression testing very easy. The downsides can be cost, false positives and, depending on the application, speed. You can then take your pick of algorithms such as MS-SSIM, VMAF and others. Abdul finishes by explaining more about the benefits and what to look out for.
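Of the algorithms mentioned, PSNR is the simplest to sketch: it is just a log-scaled mean-squared error between a reference frame and a distorted one. A minimal illustration in Python (NumPy only; the "frames" here are synthetic stand-ins, not real video):

```python
import numpy as np

def psnr(reference: np.ndarray, distorted: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two same-shaped frames."""
    mse = np.mean((reference.astype(np.float64) - distorted.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * np.log10(max_val ** 2 / mse)

# Synthetic stand-in "frames": a flat grey frame and a uniformly offset copy.
ref = np.full((64, 64), 128.0)
dist = ref + 10.0  # constant error of 10 levels -> MSE = 100
print(f"{psnr(ref, dist):.2f} dB")  # 10*log10(255^2 / 100) ≈ 28.13 dB
```

Production metrics such as MS-SSIM or VMAF model human perception far better than raw PSNR; the point here is only the general shape of a full-reference metric: compare a reference against a distorted version and reduce the difference to a single score.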

Watch now!
Speakers

Abdul Rehman
Cofounder, CEO and CTO,
SSIMWAVE

Webinar: Assessing Video Quality: Methods, Measurements, and Best Practices

Wednesday, November 13th, 8am PST / 16:00 GMT

Bitmovin have brought together Jan Ozer from the Streaming Learning Center, their very own Sean McCarthy, and Carlos Bacquet from SSIMWAVE to discuss how best to assess video quality.

Fundamental to assessing video quality, of course, is defining what we mean by quality, which artefacts are most problematic and what makes video quality important in the first place.

Quality of streaming, of course, is bound up with the quality of the experience in general. Thinking of an online streaming system as a whole, speed of playback, smooth playback in the player and rebuffering are all as much factors in perceived quality as the codec's encoding quality itself, which is what is more traditionally measured.

The webinar brings together experience in measuring quality, monitoring systems and ways in which you can derive your own testing to lock on to the factors which matter to you and your business.

See the related posts below for more from Jan Ozer

Register now!
Speakers

Jan Ozer
Industry Analyst
Jan Ozer
Sean McCarthy
Technical Product Marketing Manager,
Bitmovin
Carlos Bacquet
Solutions Architect
SSIMWAVE

Video: AV1 vs. HEVC: Perceptual Evaluation of Video Encoders

Zhou Wang explains how to compare AV1 against HEVC and AVC, and shares his findings. Using metrics such as VMAF, PSNR and SSIMPlus, he explores the effects of resolution on bitrate savings and then turns his gaze to computational complexity.
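Encoder comparisons like this are usually reported as an average bitrate saving at equal quality. A minimal sketch of that calculation, using hypothetical rate/quality points and simple log-rate interpolation rather than the full Bjøntegaard (BD-rate) polynomial fit:

```python
import numpy as np

def bitrate_saving(rates_a, scores_a, rates_b, scores_b):
    """Average % bitrate change of codec B vs. codec A at equal quality,
    via piecewise-linear interpolation of log(rate) against quality score.
    Scores must be in ascending order; negative result = B saves bitrate."""
    lo = max(min(scores_a), min(scores_b))      # overlapping quality range
    hi = min(max(scores_a), max(scores_b))
    qs = np.linspace(lo, hi, 100)
    log_ra = np.interp(qs, scores_a, np.log(rates_a))
    log_rb = np.interp(qs, scores_b, np.log(rates_b))
    return (np.exp(np.mean(log_rb - log_ra)) - 1) * 100

# Hypothetical VMAF scores for two encoders at four ladder bitrates (kbps).
rates_hevc = [1000, 2000, 4000, 8000]; vmaf_hevc = [70, 80, 88, 94]
rates_av1  = [1000, 2000, 4000, 8000]; vmaf_av1  = [76, 85, 91, 96]
print(f"{bitrate_saving(rates_hevc, vmaf_hevc, rates_av1, vmaf_av1):+.1f}% bitrate vs. HEVC")
```

The rate/quality numbers above are made up for illustration; real comparisons, including the ones in this talk, use measured rate-distortion curves per sequence and per resolution.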

This talk was given at the Mile High Video conference in Denver CO, 2018.

Speakers

Zhou Wang
Chief Science Officer,
SSIMWAVE Inc.

Video: Best Practices for Advanced Software Encoder Evaluations

Streaming Media East brings together Beamr, Netflix, BAMTECH Media and SSIMWAVE to discuss the best ways to evaluate software encoders, and we see there is much overlap with hardware encoder evaluation, too.

The panel gets into detail covering:

  • Test Design
  • Choosing source sequences
  • Rate Control Modes
  • Bit Rate or Quality Target Levels
  • Offline (VOD) vs Live (Linear)
  • Discrete vs. Multi-resolution/Bitrate
  • Subjective vs. objective measurements
  • Encoding Efficiency vs Performance
  • Video vs Still frames
  • PSNR Tuning
  • Evaluation at Encode Resolution Vs Display Resolution

Watch now for this comprehensive 'How To'.

Speakers

Dr. Anne Aaron
Director of Video Algorithms,
Netflix
Scott Labrozzi
VP Video Processing, Core Media Video Processing,
BAMTECH Media
Dr. Zhou Wang
Chief Science Officer,
SSIMWAVE
Moderator: Tom Vaughan
VP Strategy,
Beamr