Video: Digital Media Trends of 2020

Research from before and during the pandemic paints a clear picture of how streaming has changed. This Deloitte research looked at ad-supported and subscription VOD across demographics, as well as at how the film industry has fared with cinemas remaining closed in most places.

Jeff Loucks presents the results of surveys taken in the United States before the lockdown and then again in May and October 2020. The youngest demographic tracked is Gen Z, born between 1997 and 2006; the oldest is the ‘matures’, who are older than 73. The most critical measurement is how much money people have in their pockets. Around half said their finances were unchanged, while as many as 39% said their pay had been reduced either somewhat or significantly, though this fell to 29% in October.

When including streaming music, video games and audiobooks, US consumers had an average of 12 entertainment subscriptions, which reduced to 11 by October. Concentrating on paid video subscriptions only, the average grew from 3 to 5 over the period of the research, with millennials leading the charge at up to 7 services. However, churn also increased. Jeff explains that this is partly because free trials come to an end but also because people judge services to be too expensive. It seems there is a certain amount of experimentation going on, with people testing new combinations of services to find the mix that suits them.

Jeff makes the point that there are around 300 paid streaming services in the US market which is ‘too many to stick around’. Whilst it’s clear that streaming providers are giving consumers the types of services they’ve been wanting from cable providers for years, they are bringing a burden of complexity with them, too.

Hulu and YouTube are two services that give the flexibility of watching either an ad-supported or an ad-free version of the service. Across the market, 60% of people use at least one free ad-supported service. Whilst Hulu’s ad-supported tier isn’t free, giving these options is a great way to cater to different tastes. The Deloitte research showed that whilst Gen Z and Millennials would prefer to pay for an ad-free service, older ‘boomers’ and ‘matures’ would rather use an ad-supported one. Furthermore, when given the option to pay a little for half the ads, customers prefer the extremes rather than the halfway house. Overall, 7 minutes of ads an hour is the number which people say is the right balance, with 14 being too many.

Films have been hit hard by the pandemic. By the October survey, 35% of people said they had paid to watch a new release on a streaming platform, up 13 points from May, and 90% said they would likely do it again. Theatrical release windows have been under examination for many years now, but the pandemic really forced the subject. The percentage of revenue made during the ‘DVD release’ period has gone down over the decades. Nowadays, a film makes most of its money, 45%, during its theatrical release window, with the ‘TV’ revenue being squeezed down 10% to 18% of the overall revenue. It’s clear, then, that studios will be careful with that 45% share to ensure it’s suitably replaced as they move ahead with their 2022 plans.

Each genre has its own fingerprint, with comedies and dramas making proportionally less money at the box office than animations and action movies, for instance. So whilst we may see notable changes in distribution windows, they may be more aggressive for some releases than others once the pandemic has less of a say in studios’ plans.

This video is based on research that can be read in much more detail here:

Digital Media Trends Consumption Habits Survey

Future of the Movie Industry

Watch now!
Speakers

Dr. Jeff Loucks
Executive Director,
Deloitte Center for Technology, Media & Telecommunications

Video: ATSC 3.0 in 2021

ATSC 3.0 is an innovative set of standards that gets closer to the maximum possible throughput, AKA the Shannon limit, than 5G and previous technologies. The technology is so flexible that it allows convergence with 5G networks, operation as an SFN with inter-transmitter links, and seamless handoff for receivers between the internet and the broadcast transmission. ATSC 3.0 is an IP-based technology that is ready to keep up to date with changing practices and standards, yet lets viewers experience the best of broadcast RF transmission, wireless internet and broadband without having to change what they’re doing or even know which one(s) they’re watching.

This SMPTE event, moderated by SMPTE Toronto Section chair Tony Meerakker, looks at a number of ATSC’s innovations and kicks off with Orest Sushko from the Humber Broadcast-Broadband Convergence Lab in Toronto. This is a Canadian initiative to create an environment where real-world ATSC 3.0 testing can be carried out. It’s this type of lab that can help analyse the applications discussed in this video, where different applications are brought into a broadcast RF environment, including integration with 5G networks. It will also drive research into ATSC 3.0 adoption in Canada.

Next is the ATSC president, Madeleine Noland, who introduces what ATSC 3.0 is and why she feels it’s such an innovative standards suite. Created by over 400 engineers throughout the world, Madeleine says that ATSC 3.0 is a state-of-the-art standard that aims to add value to the broadcast service with the idea that broadcast towers are ‘not just TV anymore’. This idea of blurring the lines between traditional RF transmission and other services continues throughout this series of talks.

The aim of ATSC 3.0 is to deliver all television over IP, albeit unidirectional IP. It also uses a whole stack of existing technologies at the application layer, such as HTML5, CSS and JavaScript; these are just three examples of the standards on which ATSC 3.0 is built, and building on existing standards makes quick deployment easier. ATSC 3.0 is a suite of tens of standards that describe the physical layer, transport, video & audio use, apps and more. Having many standards within is another way ATSC 3.0 can keep up with changes: by modifying and updating the relevant standards, but also by not being afraid to add more.

Madeleine says that 62 market areas will be launching, bringing the reach of ATSC 3.0 up to 75% of households in the US under the banner ‘NextGen TV’, which will act as a logo signpost for customers on TVs and associated devices. ATSC 3.0 exists outside the US in Korea, where 75% of the population can receive ATSC 3.0. Canada is exploring, Brazil is planning, India’s TSDSI is researching and many other countries like Australia are also engaging with the ATSC to consider their options for national deployment against, presumably, DVB-I.

The last point in this section is that when you convert all your transmitters to IP, it seems strange to have just a load of disconnected ‘nodes’. Madeleine’s point is that a very effective mesh network could be created if only we could connect all these transmitters together. These could then provide some significant national services, which will be discussed later in this video.

Interactive TV

Mark Corl is next, talking about his extensive work creating an interactive environment within ATSC 3.0. The aim here was to enhance the viewer/user experience, build better relationships with them and provide an individualised offering including personalised ads and content.

Mark gives an overview of A/344, ATSC 3.0 Interactive Content, and ATSC 3.0 standard A/338, talking about signalling, delivery, synchronisation and error protection, service announcement such as the EPG, content recovery in redistribution scenarios, watermarking, application event delivery, security and more.

Key features of interactivity are the aforementioned use of HTML5, CSS and JavaScript to create seamless and secure delivery of interactive content from broadcast and broadband. Each application lives in its own separate context, and features are managed via an API.

Mark finishes by outlining the use of the Advanced Emergency information Table (AEAT), which signals everything the receiver needs to know about the AEA message and associated rich media, and then looks at how, at the client, content/ads can be replaced by manipulating the .mpd manifest file with locally-downloaded content using XLink references.
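The manifest-manipulation idea can be sketched in a few lines. This is a hypothetical illustration, not code from the talk or from the ATSC standards: the function name, the URLs and the approach of re-pointing a Period’s `xlink:href` at a locally cached replacement ad are all assumptions for demonstration.

```python
# Hypothetical sketch: re-point XLink-resolvable Periods in a DASH .mpd
# manifest at locally downloaded content, in the spirit of the client-side
# ad replacement described above. Names and URLs are illustrative only.
import xml.etree.ElementTree as ET

MPD_NS = "urn:mpeg:dash:schema:mpd:2011"
XLINK_NS = "http://www.w3.org/1999/xlink"
ET.register_namespace("", MPD_NS)
ET.register_namespace("xlink", XLINK_NS)

def retarget_ad_periods(mpd_xml: str, local_url: str) -> str:
    """Point every xlink-resolvable Period at locally cached content."""
    root = ET.fromstring(mpd_xml)
    for period in root.findall(f"{{{MPD_NS}}}Period"):
        if period.get(f"{{{XLINK_NS}}}href") is not None:
            # Swap the remote ad reference for the local replacement.
            period.set(f"{{{XLINK_NS}}}href", local_url)
    return ET.tostring(root, encoding="unicode")
```

In practice the player’s XLink resolver then dereferences the rewritten `href` and splices the local Period into the presentation, leaving the main content Periods untouched.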

Innovative technologies implemented in ATSC 3.0

Dr. Yiyan Wu takes the podium next, explaining the newest RF techniques used in ATSC 3.0 which are getting it closer to the Shannon limit than similar contemporary technologies such as 4G and 5G New Radio (NR). These include LDPC – Low-Density Parity-Check codes – which have been in DVB-S2 and DVB-T2 for a long time, Non-Uniform Constellations such as 4096-QAM NUC, as well as the Canadian-invented Layered Division Multiplexing (LDM), which can efficiently combine a robust mobile service and a high-datarate service on top of each other in a single TV channel. This works by having a high-power, robustly-coded signal with a quieter signal underneath which, in good conditions, can still be decoded. The idea is that the robust signal is the HD service transmitted with HEVC SVC (Scalable Video Coding), meaning that the UHD layer can be an enhancement layer on top of the HD; there is no need to send the whole UHD signal. Dr. Wu finishes this section by explaining that LDM offers reduced Tx and Rx power.
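The Shannon limit these techniques approach is the theoretical ceiling on error-free throughput for a channel of a given bandwidth and signal-to-noise ratio. A quick sketch of the calculation; the 6 MHz channel width matches a US TV channel, but the SNR figure is illustrative rather than from the talk.

```python
# Shannon-Hartley capacity: C = B * log2(1 + SNR).
# The SNR below is an illustrative value, not a figure from the talk.
import math

def shannon_capacity_mbps(bandwidth_hz: float, snr_db: float) -> float:
    """Maximum error-free data rate in Mbit/s for the given channel."""
    snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear ratio
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

# A 6 MHz broadcast channel at 20 dB SNR:
print(round(shannon_capacity_mbps(6e6, 20.0), 1))  # ≈ 40 Mbit/s
```

Techniques like LDPC and non-uniform constellations don’t change this ceiling; they let the real-world modulation and coding get closer to it.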

Using LDM, then, we’re actually creating more usable capacity than we had before. Dr. Wu points out that this can be used for improved services or for an in-band distribution link, i.e. to move live video down through a network of transmitters. While not essential, the fact that an ATSC 3.0 transmitter can operate as part of a single frequency network is very useful, as a weak signal from one transmitter can be boosted by the signal from another.

Dr. Wu spends time next talking about 5G use cases, detailing the history of failed attempts at ‘broadcast’ versions of 3G, 4G and LTE. With 5G USPs such as network slicing, the current version of the broadcast mode of 5G is more likely than ever to be commercially implemented. Called 5G feMBMS, it’s actually a 4G/LTE-based technology for delivery over a 5G network.

One plan for 5G integration, which is possible as ATSC 3.0 has the same timing reference as 5G networks, is for 5G networks to spot when thousands of people are watching the same thing and move that traffic over to the ATSC 3.0 towers, which can multicast it without issue.

Next Gen commercialisation update

Last in the video we have Anne Schelle, who works with the ATSC as board chair of the ATSC 3.0 Security Alliance. She explains that the number of markets announced for deployment in 2021 is twice that of 2020. Deployment of ATSC 3.0 is going well, and the most common initial use has been to test interactive services. The projected numbers for TVs shipping with ATSC 3.0 built in are positive and, Anne says, the economics for NextGen receiver inclusion are better than they have been previously. Speaking from her security perspective, having built-in content security protection is new for broadcasters, who welcome it as it helps reduce piracy.

Watch now!
Speakers

Madeleine Noland
President,
ATSC
Mark Corl
Chair ATSC S38 Specialist Group on Interactive Environment
SVP Emergent Technology Development, Triveni Digital
Dr. Yiyan Wu
Principal Research Scientist,
Communications Research Centre Canada (CRC)
Anne Schelle
Board chair of ATSC 3.0 Security Alliance
Board Member, ATSC
Managing Director, Pearl TV
Orest Sushko
Project Lead, Humber Broadcast-Broadband Convergence Lab
Program Coordinator, Film & Multiplatform Storytelling Program, Humber College
Moderator: Tony Meerakker
Chair, SMPTE Toronto Section
Consultant, Meer Tech Systems

Video: Esports Production During COVID

Esports continues to push to harness the best of the IT and broadcast industries to bring large-scale events to half a billion people annually. Naturally, the way this is done has changed with the pandemic, but the 10% annual growth remains on track. The esports market is still maturing and, while it does, the industry is working hard on innovating with the best technology to bring the best quality video to viewers and to drive engagement. Within the broadcast industry, vendors are working hard to understand how best to serve this market segment, which is very happy to adopt high-quality, low-latency solutions, and broadcasters are asking whether the content is right for them.

Tackling all of these questions is a panel of experts brought together by SMPTE’s Washington DC section, including Christopher Keath from Blizzard Entertainment, Mark Alston from EA, Scott Adametz from Riot Games, Richard Goldsmith from Deloitte and, speaking in January 2021 while he worked for Twitch, Jonas Bengtson.

First off the bat, Michael introduces the esports market. With 2.9 billion people playing games globally and 10% growth year-on-year, he says it’s still a relatively immature market, and then outlines some notable trends. Firstly, there is a push to grow a mainstream audience. To its benefit, esports has a large and highly loyal fanbase, but growth outside of this demographic is still difficult. In this talk and others, we’ve heard of the different types of accompanying, secondary programmes aimed more at those who are interested enough to want a summary and watch a story being told, but not interested in watching a blow-by-blow 8-hour tournament.

Another trend outlined by Michael is data sharing. There are so many stats available, both in terms of the play itself, similar to traditional sports’ ‘percentage possession’ stats, and factual data which can trigger graphics such as names, affiliations, locations etc. Secondary data processing, just like in traditional sports, is also a big revenue opportunity, so the market, explains Michael, is still working on bigger and better ways to share data for mutual benefit. More information on Deloitte’s view of the market is in this article, with a different perspective in this global esports market report.

You can watch either with this Speaker view or Gallery view

The panel discusses the different angle that esports has taken on production, with many young producers only knowing the free software ‘OBS’, underlined by Scott, who says esports can still be scrappy in some places, bringing together unsynchronised video sources in a ‘democratised’ production which has both benefits and downsides. Another difference within esports is that many viewers have played the games, often extensively. They therefore know exactly what they look like, so watching the game streamed can feel a very different experience after going through potentially multiple stages of encoding. The panel all spend a lot of time tuning encoders for different games to maintain the look as well as possible.

Christopher Keath explains what observers are. Effectively, these are the in-game camera operators, who talk to the head observer who co-ordinates them and has a simple switcher to make some available to the production. This leads to a discussion on how best to bring the observers’ video into the programmes during the pandemic. Riot has kitted out the PCs in observers’ homes to bring them up to spec and allow them to stream out, whereas EA has moved the observer PCs into their studio, backed by hefty internet links.

Jonas points out that Twitch brings tens of thousands of streams to the internet constantly and explains that the Twitch angle on streaming is often different to the ‘esports’ angle of big events; rather, streams are personality-driven. The proliferation of streaming onto Twitch, other similar services and within esports itself has driven GPU manufacturers, Jonas continues, to include dedicated streaming functionality on their GPUs to stop encoding detracting from in-game performance. During the pandemic, Twitch has seen a big increase in social games, where interaction is key, rather than team-based competition games.

You can watch either with the Speaker view or this gallery view

Scott talks about Riot’s global network backbone, which saw 3.2 petabytes of data flow – just for production traffic – during the League of Legends Worlds event, which they produced in 19 different languages working between Berlin, LA and Shanghai. For him, the pandemic brought a change in the studio, where everything was rendered in real time in the Unreal game engine. This allowed them to use augmented reality and have a much more flexible studio which looked better than the standard ‘VR studios’. He suggests they are likely to keep using this technology.

Agreeing with this by advocating a hybrid approach, Christopher says that the reflexes of the gamers are amazing, so there really isn’t a replacement for having them playing side-by-side on a stage. On top of that, you can then unite the excitement of the crowd with lights, smoke and pyrotechnics, so staged events will still have their place for some programmes, but cloud production is still a powerful tool. Mark agrees and also says that EA are exploring the ways in which this remote working can improve work-life balance.

The panel concludes by answering questions touching on the relative lack of esports on US linear TV compared to Asia and elsewhere, explaining the franchise/league structures, discussing the vast range of technology-focused jobs in the sector, the unique opportunities for fan engagement, co-streaming and the impact of 5G.

Watch now!
Speakers

Mark Alston
Technical production manager
Electronic Arts (EA)
Christopher Keath
Broadcast Systems Architect
Blizzard Entertainment
Jonas Bengtson
Senior Engineering Manager, Discord
Formerly, Director at Twitch
Scott Adametz
Senior Manager, Esports Engineering,
Riot Games
Richard Goldsmith
Manager,
Deloitte Consulting

Video: As Time Goes by…Precision Time Protocol in the Emerging Broadcast Networks

How much timing accuracy do you need? PTP can get you timing to within nanoseconds, but is that needed, how can you transport it and how does it work? These questions and more are under the microscope in this video from RTS Thames Valley.

SMPTE Standards Vice President Bruce Devlin introduces the two main speakers by reminding us why we need timing and how we dealt with it in the past. Looking back to the genesis of television, Bruce points out, everything was analogue and it was almost impossible to delay a signal at all. An 8cm, tightly wound coil of copper would give you only 450 nanoseconds of delay; alternatively, quartz crystals could be used to create delays. In the analogue world, these delays were used to time signals and, since little could be delayed, only small adjustments were necessary. Bruce’s point is that we’ve swapped around now. Delays are everywhere because IP signals need to be buffered at every interface. It’s easy to find buffers that you didn’t know about, and even small ones really add up. Whereas analogue TV got us from camera to TV within microseconds, it’s now a struggle to get below two seconds.

Hand in hand with this change is the move from metadata and control data being embedded in the video signal – and hence synchronised with it – to all data being sent separately. This is where PTP, Precision Time Protocol, comes in: an IP-based timing mechanism that can keep time despite the buffers and allow signals to be synchronised.

Next to speak is Richard Hoptroff, whose company works with broadcasters and financial services to provide accurate time derived from 4 satellite services (GPS, GLONASS, etc.) and the Swedish time authority, RISE. They have been working on the problem of delivering time to people who can’t put up antennas, either because they are operating in an AWS datacentre or broadcasting from an underground car park. Delivering time over a wired network, Richard points out, is much more practical as it’s not susceptible to jamming and spoofing, unlike GPS.

Richard outlines SMPTE’s ST 2059-2 standard, which says that a local system should maintain accuracy to within 1 microsecond. The JT-NM TR-1001-1 specification calls for a maximum of 100ms between facilities; however, Richard points out that, in practice, 1ms or even 10 microseconds is highly desirable. In tests, he shows that at layer 2, PTP unicast looping around western Europe was able to stay within 1 microsecond, and at layer 3 within 10 microseconds. Over the internet with a VPN, Richard says he’s seen around 40 microseconds, which would then feed into a boundary clock at the receiving site.

Summing up, Richard points out that delivering PTP over a wired network can provide great timing on an OPEX budget without needing on-site timing hardware. On top of that, you can use it to add resilience to any existing GPS timing.

Gerard Phillips from Arista speaks next to explain some of the basics of how PTP works. If you are interested in digging deeper, please check out this talk on PTP from Arista’s Robert Welch.

Already in use by many industries including finance, power and telecoms, PTP is based on IEEE 1588, allowing synchronisation down to tens of nanoseconds. Just sending out a timestamp to the network would be a problem because jitter is inherent in networks; it’s part and parcel of how switches work. Dealing with the timing variations as smaller packets wait for larger packets to get out of the way is part of the job of PTP.

To do this, the main clock – called the grandmaster – sends out the time to everyone 8 times a second. This means that all the devices on the network, known as endpoints, will know what time it was when the message was sent. They still won’t know the actual time because they don’t know how long the message took to get to them. To determine this, each endpoint has to send a message back to the grandmaster. This is called a delay request. All that happens here is that the grandmaster replies with the time it received the message.

PTP Primary-Secondary Message Exchange.
Source: Meinberg [link]

This gives us 4 points in time. The first (t1) is when the grandmaster sent out the first message. The second (t2) is when the device received it. t3 is when the endpoint sent out its delay request and t4 is the time when the master clock received that request. The difference between t2 and t1 indicates how long the original message took to get there. Similarly, t4-t3 gives that information in the other direction. These can be combined to derive the time. For more info either check out Arista’s talk on the topic or this talk from RAVENNA and Meinberg from which the figure above comes.
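The combination step can be written out directly. This minimal sketch assumes a symmetric path, as the PTP delay mechanism itself does; the timestamp values in the usage example are invented for illustration.

```python
# Combining the four PTP timestamps described above. Assumes the path is
# symmetric (same delay each way), as the PTP delay mechanism itself does.
def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """t1: Sync sent (grandmaster clock), t2: Sync received (endpoint clock),
    t3: Delay_Req sent (endpoint clock), t4: Delay_Req received (grandmaster).
    Returns (offset of endpoint clock from master, mean one-way path delay)."""
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2
    offset_from_master = ((t2 - t1) - (t4 - t3)) / 2
    return offset_from_master, mean_path_delay

# Illustrative figures: endpoint clock runs 500 ns fast and the one-way
# path delay is 2000 ns (all times in nanoseconds).
offset, delay = ptp_offset_and_delay(0, 2500, 10000, 11500)
print(offset, delay)  # 500.0 2000.0
```

The endpoint then slews its clock by the computed offset; any asymmetry between the two directions shows up directly as an error in the offset, which is why network design for PTP tries hard to keep paths symmetric.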

Gerard briefly gives an overview of Boundary Clocks, which act as secondary time sources, taking pressure off the main grandmaster(s) so they don’t have to deal with thousands of delay requests; they also solve a problem with the jitter of signals passing through switches, as it’s usually the switch itself which is the boundary clock. Alternatively, Transparent Clock switches simply pass on the PTP messages but update the timestamps to account for how long the message took to travel through the switch. Gerard recommends only using one type in a single system.

Referring back to Bruce’s opening, Gerard highlights the need to monitor the PTP system. Black and burst timing didn’t need monitoring: as long as the main clock was happy, the DAs downstream just did their job and on occasion needed replacing. PTP is a system with bidirectional communication, and it changes depending on network conditions. Gerard makes a plea to build a monitoring system as part of your solution to provide visibility into how it’s working, because as soon as there’s a problem with PTP, there could quickly be major problems. Network switches themselves can provide a lot of telemetry on this, showing you delay values and allowing you to see grandmaster changes.

Gerard’s ‘Lessons Learnt’ list features locking down PTP so only a few ports are actually allowed to provide time information to the network, dealing carefully with audio protocols like Dante which need PTP version 1 domains, and making sure all switches are PTP-aware.

The video finishes with Q&A after a quick summary of SMPTE RP 2059-15 which is aiming to standardise telemetry reporting on PTP and associated information. Questions from the audience include asking how easy it is to do inter-continental PTP, whether the internet is prone to asymmetrical paths and how to deal with PTP in the cloud.

Watch now!
Speakers

Bruce Devlin
Standards Vice President,
SMPTE
Gerard Phillips
Systems Engineer,
Arista
Richard Hoptroff
Founder and CTO
Hoptroff London Ltd