Google has launched a new initiative allowing publishers to highlight key moments in a video so that search results can jump straight to those moments. Whether you have a video that covers three topics, one which poses questions and provides answers, or one which has a big reveal and reaction shots, this could help increase engagement.
The plan is that content creators tell Google about these moments, so Paul Smith from theMoment.tv takes to the stage at San Francisco Video Tech to explain how. After looking at a live demo, Paul dives into the webpage code that makes it happen. Hidden in the page markup, he shows a script tag with its type set to application/ld+json. This holds the metadata for the video as a whole, such as the thumbnail URL and the content URL, but it also defines the highlighted ‘parts’ of the video, each with its own URL.
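To make this concrete, here is a minimal sketch of what such a script block might contain, built and serialised in Python. It assumes the schema.org VideoObject/Clip vocabulary that Google's key-moments markup is based on; all names, times and URLs are illustrative, not taken from the talk.

```python
import json

# Hypothetical VideoObject metadata: top-level fields describe the video as a
# whole, while "hasPart" lists Clip entries, each deep-linking to a moment.
video_metadata = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Example talk recording",
    "thumbnailUrl": "https://example.com/thumb.jpg",
    "contentUrl": "https://example.com/video.mp4",
    "hasPart": [
        {
            "@type": "Clip",
            "name": "First topic",
            "startOffset": 30,   # seconds from the start of the video
            "endOffset": 95,
            "url": "https://example.com/watch?t=30",
        },
        {
            "@type": "Clip",
            "name": "The big reveal",
            "startOffset": 200,
            "endOffset": 260,
            "url": "https://example.com/watch?t=200",
        },
    ],
}

# Serialise ready to embed inside a <script type="application/ld+json"> tag.
script_body = json.dumps(video_metadata, indent=2)
print(script_body)
```

The player's page would carry this JSON verbatim in the script tag, and each clip's `url` needs to seek the player to that timestamp when followed.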
While the programme is currently limited to a small set of content publishers, everyone can benefit from these insights into Google video search. Google will also look at YouTube descriptions in which some people give links to specific times, such as different tracks in a music mix, and bring those into the search results.
Paul looks at what this means for website and player writers. One suggestion is the need to scroll the page to the correct video and to clearly signpost the different videos on a page. Paul also looks towards the future at what could be done to integrate better with this feature, for example updating the player UI to see and create moments, or improving the ability to seek with sub-second accuracy. Intriguingly, he suggests that for popular videos it may be advantageous to synchronise segment timings with the beginning of moments. Certainly food for thought.
With Amazon, Netflix and so many other VOD services available, broadcasters like the BBC and Discovery are investing a lot in their own VOD services, known as Broadcaster VOD (BVOD) in order to maintain relevance, audiences and revenue.
With this in mind, IBC365 will discuss the business models, platforms and strategies being used by BVOD platforms. They will look at the BBC’s move to build a deep content library of free-to-view box sets, and at the importance of data, personalisation and addressable advertising models.
Furthermore, this webinar will talk about the commercial and technical requirements to build a BVOD service to a standard that will stand on its own in this increasingly crowded, but well-funded, marketplace.
How does the move to OTT delivery impact the traditional platforms? Are there too many streaming services? This session looks at the new platforms, the consumer experience, the role of aggregation and the way that operators have been involved in de-aggregation and then re-aggregation of channel packages both in competition and in cooperation.
How many subscription services are too many for a household? There’s some thinking that 3 may be the typical maximum, when people tend to switch to a ‘one in, one out’ policy on subscription packages. Colin Dixon says the average is currently 2 in the UK and Germany. The panel asks whether we should have as many and compares the situation with audio, where ‘super aggregation’ rules. Services like Apple Music and Spotify rely on aggregating ‘all’ music, and consumers don’t subscribe separately to listen to Sony artists on one service and EMI artists on another, so what is it that drives video to be different, and will it stay that way?
The topic then switches to smart TVs, discussing the feeling that five to eight years ago they had a go at app stores and ended up disappointing. Not only were the apps often clunky at the time, but manufacturer support has now, on the whole, gone. Is the current wave of smart TVs any different? From BT’s perspective, explains Colin Phillips, it’s very costly to keep many different versions of an app up to date and tested, so a uniform platform across multiple TVs would be much better.
The talk concludes looking at the future for Disney+, Netflix and other providers ahead of discussing predictions from industry analysts.
Honing the use of AI and Machine Learning continues apace. Streaming services are a particularly ripe area for AI, but the winners will be those that manage to differentiate themselves and innovate in their use of it.
Artificial Intelligence (AI) and Machine Learning (ML) are related technologies which deal with replicating ‘human’ ways of recognising patterns, seeking patterns in large data sets to help deal with similar data in the future, and doing so without traditional methods such as a ‘database’ lookup. For the consumer, it doesn’t actually matter whether they’re benefitting from AI or ML; they’re simply looking for better recommendations, better search and accurate subtitles (captions) on all their videos. If these happened because of humans behind the scenes, it would all be the same. But for the streaming provider everything has a cost, and there simply isn’t the budget to pay people to do these tasks; in some cases, humans couldn’t do the job at all. This is why AI is here to stay.
Date: Thursday 8th August, 16:00 BST / 11am EDT
In this webinar from IBC365, Media Distillery, Liberty Global and Grey Media come together to discuss the benefits of extracting images, metadata and other context from video, analysis of videos for contextual advertising, content-based search & recommendations and ways to maintain younger viewers.
AI is here to stay, touching the whole breadth of our lives, not just broadcast. So it’s worth learning how it can best be used to produce television, for streaming and in your business.