How can we make video more appealing to humans? We’ve evolved to live a certain way and this has defined – and will continue to define – our video technologies. MUX founder Jon Dahl talks to us here about the ways in which human physiology drives viewing habits.
Vertical vs. horizontal video, angular resolution, and how the typical viewing distances of computers, TVs and other devices affect the resolution we can actually perceive are all discussed. Jon then moves on to audio and video frequencies, where frame rates and flicker are important and physics comes into play alongside biology.
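The angular-resolution point lends itself to a quick back-of-the-envelope check. This is a rough sketch (the function name, figures and the commonly cited one-arcminute acuity limit are assumptions for illustration, not from the talk) showing how to work out whether a display's pixel density exceeds what the eye can resolve at a given viewing distance:

```python
import math

def pixels_per_degree(screen_width_m, horizontal_pixels, viewing_distance_m):
    """Angular pixel density: how many pixels fall within one degree of visual field."""
    # Angle subtended by the full screen width, in degrees.
    screen_angle = 2 * math.degrees(math.atan(screen_width_m / (2 * viewing_distance_m)))
    return horizontal_pixels / screen_angle

# A 20/20 eye resolves roughly one arcminute of detail, i.e. about
# 60 pixels per degree; beyond that, extra pixels go unseen.
ACUITY_PPD = 60

# Hypothetical example: a 1.2 m-wide 4K panel (3840 px) viewed from 3 m.
ppd = pixels_per_degree(1.2, 3840, 3.0)
print(f"{ppd:.0f} pixels per degree")
print("Exceeds visual acuity:", ppd > ACUITY_PPD)
```

At that distance the panel delivers far more than 60 pixels per degree, which is exactly the kind of "when does resolution stop mattering" question the talk explores.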
Even for the experienced, this talk is bound to bring something new: a great tour of the fundamentals of visual perception that our industry relies on and strives to please day in, day out.
This talk was given at Streaming Tech Sweden, an annual conference from Eyevinn Technology. Streamed on their own video platform, talks are initially available exclusively to conference attendees but are released free-to-view during the subsequent year. Free registration is required to watch the videos.
Thursday February 7th, 10am PST / 1pm EST / 18:00 GMT. Now available on-demand!
There is so much talk about HDR, wide colour gamut (WCG) and ‘Better Pixels’, and so many TVs now interpolate motion up to 100Hz or above, that it’s good to stop and check we know why all of this matters – and, crucially, when it doesn’t.
SMPTE’s new ‘Essential Technology Concepts Webcasts’ are here to help, and in the first webcast David Long will look at the fundamentals of colour, contrast and motion in terms of what we actually see.
This promises to be a great talk and, the chances are, even people who ‘know it already’ will be reminded of a thing or two!
Date: 29th January, 18:30 GMT
Location: University of York, Department of Theatre, Film and Television
The AES North of England invite Cleopatra Pike and Amy V. Beeston to talk about how human psychology and neuroscience are involved in the design of many audio products. Firstly, they can be used to determine whether the products suit the needs of the people they aim to serve. ‘Human-technology interaction’ research is conducted to ascertain how humans respond to audio products – where they help and where they hinder. However, issues remain with this research, such as getting reliable reports from people about their experience.
Secondly, psychology and neuroscience can be used to solve engineering problems via ‘human-inspired approaches’ (e.g. they can be used to produce robots that listen like humans in noisy environments). To fulfil this aim, audio engineers and psychologists must determine the biological and behavioural principles behind how humans listen. However, the human hearing system is a black box that has developed over years of evolution, which makes understanding and applying human principles to technology challenging.
This evening hosts a discussion on some of the benefits and issues involved in an interdisciplinary approach to developing audio products. We include examples from our research investigating how machine listeners might simulate human hearing in compensating for reverberation and spectral distortion, how machine listeners might achieve the perceptual efficiency of humans by optimally combining multiple senses, and how the input from tests on humans can be used to optimise the function of hearing aids.