Video: State of the Streaming Market 2021

Streaming Media is back to take the pulse of the streaming market, following on from their recent mid-year survey measuring the impact of the pandemic. This is the third annual snapshot of the state of the streaming market, which will be published by Streaming Media in March. To give us this sneak peek, Eric Schumacher-Rasmussen is joined by colleague Tim Siglin and Harmonic Inc.’s Robert Gambino.

They start off with a look at the demographics of the respondents. It’s no surprise that North America is well represented, as Streaming Media is US-based and both the USA and Canada have very strong broadcast markets in terms of publishers and vendors. Europe is represented to the tune of 14%, and South America’s representation has doubled, in line with other trends showing notable growth in the South American market. In terms of individuals, exec-level and ‘engineering’ respondents were evenly balanced, with a few changes in the types of institutions represented. Education and houses of worship have both grown in representation since the last survey.

Of responding companies, 66% said that they both create and distribute content, a percentage that continues to grow. This is indicative, the panel says, of the barrier to entry for distribution continuing to fall. CDNs are relatively low cost and the time to market can be measured in weeks. Answering which type of streaming they are involved in, live and on-demand were almost equal for the first time in this survey’s history. Robert says that he’s seen a lot of companies taking to the cloud to deliver pop-up channels, but also that streaming ecosystems are better attuned to live video than they used to be.

Reading the news, it seems that there’s a large migration into the cloud, but is that shown in the data? When asked about their plans to move to the cloud, around a third had already moved and only a quarter said they had no plans. This means there is plenty of room for growth for both cloud platforms and vendors. In terms of the service itself, video quality was the top ‘challenge’ identified, followed by latency, scalability and buffering respectively. Robert points out that better codecs delivering lower bitrates help alleviate all of these problems, as well as reducing time to play, bandwidth and storage costs.

There has been a lot of talk about dynamic server-side ad insertion in 2020, including for use with targeted advertising, but who’s actually adopting it? Over half of respondents indicated they weren’t going to move into that sphere, likely because many governmental and educational services don’t need advertising to start with. But 10% are planning to implement it within the next 12 months, which represents a doubling of adoption, so growth is not slow. Robert’s experience is that many people in ad sales are still used to selling on aggregate and don’t understand the power of targeted advertising and, indeed, how it works. Education, he feels, is key to continuing growth.

The panel finishes by discussing what companies hope to get out of the move to virtualised or cloud infrastructure. Flexibility comes in just above reliability, with cost savings only third. Robert comes back to pop-up channels which, based on the release of a new film or a sports event, have proved popular and are a good example of the flexibility that companies can easily access and monetise. There are a number of companies heavily investing in private cloud, as well as those migrating to public cloud. Either way, these benefits are available to companies who invest and, as we’re seeing in South America, cloud can offer an easy on-ramp to expanding both the scale and the feature set of your infrastructure without large capex projects. Thus it’s the flexibility of the solution which is driving expansion and improvements in quality and production values.

Watch now!
Speakers

Tim Siglin
Contributing Editor, Streaming Media Magazine
Founding Executive Director, HelpMeStream
Robert Gambino
Director of Solutions,
Harmonic Inc.
Moderator: Eric Schumacher-Rasmussen
Editor, Streaming Media

Video: Esports Production During COVID

Esports continues to push itself to harness the best of the IT and broadcast industries to bring large-scale events to half a billion people annually. Naturally, the way this is done has changed with the pandemic, but the 10% annual growth remains on track. The esports market is still maturing and, while it does, the industry is working hard on innovating with the best technology to bring the best quality video to viewers and to drive engagement. Within the broadcast industry, vendors are working hard to understand how best to serve this market segment, which is very happy to adopt high-quality, low-latency solutions, and broadcasters are asking whether the content is right for them.

Tackling all of these questions is a panel of experts brought together by SMPTE’s Washington DC section, including Christopher Keath from Blizzard Entertainment, Mark Alston from EA, Scott Adametz from Riot Games, Richard Goldsmith from Deloitte and, speaking in January 2021 while he worked for Twitch, Jonas Bengtson.

First off the bat, Michael introduces the esports market. With 2.9 billion people playing games globally and 10% growth year-on-year, he says that it’s still a relatively immature market, and he outlines some notable trends. Firstly, there is a push to grow into a mainstream audience. To its benefit, esports has a large and highly loyal fanbase, but growth outside of this demographic is still difficult. In this talk and others, we’ve heard of the different types of accompanying, secondary programmes aimed more at those who are interested enough to want a summary and watch a story being told, but not interested in watching a blow-by-blow eight-hour tournament.

Another trend outlined by Michael is data sharing. There are many stats available, both in terms of the play itself, similar to traditional sports’ ‘percentage possession’ stats, and factual data which can trigger graphics, such as names, affiliations and locations. Secondary data processing, just as in traditional sports, is also a big revenue opportunity, so the market, explains Michael, is still working on bigger and better ways to share data for mutual benefit. More information on Deloitte’s view of the market is in this article, with a different perspective in this global esports market report.

You can watch either with this Speaker view or Gallery view

The panel discusses the different angle esports has taken on production, with many young producers only knowing the free software ‘OBS’, underlined by Scott who says esports can still be scrappy in some places, bringing together unsynchronised video sources in a ‘democratised’ production which has both benefits and downsides. Another difference within esports is that many viewers have played the games, often extensively. They therefore know exactly what the games look like, so watching a stream can feel like a very different experience after the video has been through, potentially, multiple stages of encoding. The panellists all spend a lot of time tuning encoders for different games to preserve the look as best as possible.

Christopher Keath explains what observers are. Effectively, these are the in-game camera operators, who talk to the head observer; the head observer co-ordinates them and has a simple switcher to make some of their views available to the production. This leads to a discussion of how best to bring the observers’ video into the programmes during the pandemic. Riot has kitted out the PCs in observers’ homes to bring them up to spec and allow them to stream out, whereas EA has moved the observer PCs into their studio, backed by hefty internet links.

Jonas points out that Twitch brings tens of thousands of streams to the internet constantly, and that the Twitch angle on streaming is often different from the ‘esports’ angle of big events: Twitch streams tend to be personality driven. The proliferation of streaming onto Twitch, other similar services and esports itself has driven GPU manufacturers, Jonas continues, to include dedicated streaming functionality on their GPUs to stop encoding detracting from in-game performance. During the pandemic, Twitch has seen a big increase in social games, where interaction is key, rather than team-based competitive games.

You can watch either with the Speaker view or this gallery view

Scott talks about Riot’s global network backbone, which carried 3.2 petabytes of data – just for production traffic – during the League of Legends Worlds event, where they produced the event in 19 different languages working between Berlin, LA and Shanghai. For him, the pandemic brought a change in the studio, where everything was rendered in real time in the Unreal game engine. This allowed them to use augmented reality and have a much more flexible studio which looked better than the standard ‘VR studios’. He suggests they are likely to keep using this technology.

Agreeing, and advocating a hybrid approach, Christopher says that the reflexes of the gamers are amazing, so there really isn’t a replacement for having them playing side-by-side on a stage. On top of that, you can then unite the excitement of the crowd with lights, smoke and pyrotechnics, so on-stage events will still have their place for some programmes, but cloud production remains a powerful tool. Mark agrees and adds that EA are exploring the ways in which remote working can improve work-life balance.

The panel concludes by answering questions touching on the relative lack of esports on US linear TV compared to Asia and elsewhere, explaining the franchise/league structures, discussing the vast range of technology-focused jobs in the sector, the unique opportunities for fan engagement, co-streaming and the impact of 5G.

Watch now!
Speakers

Mark Alston
Technical production manager
Electronic Arts (EA)
Christopher Keath
Broadcast Systems Architect
Blizzard Entertainment
Jonas Bengtson
Senior Engineering Manager, Discord
Formerly, Director at Twitch
Scott Adametz
Senior Manager, Esports Engineering,
Riot Games
Richard Goldsmith
Manager,
Deloitte Consulting

Video: SMPTE Presents – MovieLabs 2030

Workflows changed when we moved from tapes to files, so it makes sense that they will change again as they increasingly move to the cloud. What if they changed to work in the cloud, but also benefited on-prem workflows at the same time? The work MovieLabs is doing aims at just that. With the cloud, infrastructure can be dynamic and needs to work between companies, so data must be able to flow between them easily and in an automated way. If this can be achieved, the amount of human interaction can be reduced and focused on the creative parts of content making.

This panel from the SMPTE Hollywood section meeting discusses the work underway, moderated by Greg Ciaccio of ETC. First off, Jim Helman from MovieLabs gives an overview of the 2030 vision. Kicked off at IBC 2019 with the publication of ‘Evolution of Media Creation’, 10 principles were laid out for building the future of production, covering topics like realtime engines, remote working, cloud deployments, security and access, and software-defined workflows (SDWs). This was followed by a white paper, ‘The Evolution of Production Security’, which laid out the need for a new, zero-trust approach to security. Then, in 2020, the jointly run industry lab released ‘The Evolution of Production Workflows’, which talked about SDWs.

“A software-defined workflow (SDW) uses a highly configurable set of tools and processes to support creative tasks by connecting them through software-mediated collaboration and automation.”

This SDW thread of the MovieLabs 2030 vision aims to standardise workflows at the data-model level and, in the future, at the API level, allowing data to be easily exchanged and understood. Annie Chang from Universal Pictures explains that the ‘Evolution of Production Workflows’ publication deals with Tasks, Assets, Relationships and Participants. If you can define your work in these four areas, you have enough information to get computers to understand the workflows and external data.

This leads to the idea of building an object model of each scene, showing the relationships between its elements. The data would describe key props and stage elements (for instance, if they were important for VFX), actors and their metadata, and technical information such as colour correction, camera files and production staff.
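As a rough illustration only – this is not MovieLabs’ actual data model, and all class and field names here are hypothetical – a scene built around the four areas of Tasks, Assets, Relationships and Participants might be sketched like this:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a scene object model in the spirit of the four
# areas described above. All names and fields are illustrative, not taken
# from the MovieLabs specifications.

@dataclass
class Participant:
    name: str
    role: str  # e.g. "actor", "colourist", "VFX producer"

@dataclass
class Asset:
    name: str
    kind: str                 # e.g. "prop", "camera file"
    metadata: dict = field(default_factory=dict)

@dataclass
class Task:
    name: str                 # e.g. "colour correction"
    assigned_to: list[Participant] = field(default_factory=list)

@dataclass
class Scene:
    number: int
    assets: list[Asset] = field(default_factory=list)
    tasks: list[Task] = field(default_factory=list)
    participants: list[Participant] = field(default_factory=list)

    def assets_of_kind(self, kind: str) -> list[Asset]:
        """One possible 'view': filter to the assets a given role cares about."""
        return [a for a in self.assets if a.kind == kind]

# Build a tiny example scene.
grader = Participant("A. Grader", "colourist")
scene = Scene(
    number=42,
    assets=[Asset("hero_sword", "prop", {"vfx_critical": True}),
            Asset("A042C003", "camera file", {"codec": "ARRIRAW"})],
    tasks=[Task("colour correction", assigned_to=[grader])],
    participants=[grader],
)

print([a.name for a in scene.assets_of_kind("camera file")])  # ['A042C003']
```

The point of the structure is that different roles can query the same data for the slice they need, which is what makes the role-specific ‘views’ described below possible without duplicating the data.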

Once modelled, a production can be viewed in many ways. For instance, the head of a studio may just be interested in high-level stories, people involved and distribution routes, whereas a VFX producer would have a different perspective, needing more detail about the scene. A VFX asset curator, on the other hand, would need to know about the shots down to the filenames and storage locations. This work promises to allow all of these views of the same, dynamic data. Not only does this improve workflows’ portability between vendor systems, it is also a way of better organising any workflow, irrespective of automation. DreamWorks is currently using this method of working, with the aim of trying it out on live-action projects soon.

Annie finishes by explaining that there are efficiencies to be had in better organising assets. It will help reduce duplication, both by uncovering duplicate files and by stopping duplicate assets from being produced in the first place. AI and similar technology will be able to sift through the information to create clips, uncover trivia and, with other types of data mining, create better outputs for inclusion in viewer content.

Sony Pictures’ Daniel De La Rosa then talks about the Avid platform in the cloud that they built in response to the COVID crisis, and how cloud infrastructure was built in order of need and, often, based on existing solutions which were scaled up. Daniel makes the point that working in the cloud is different because it’s “bringing the workflows to the data”, as opposed to the ‘old way’ where the data was brought to the machine. In fact, cloud or otherwise, with the globalisation of production there isn’t any way of doing things the ‘old way’ any more.

This reliance on the cloud – and to be clear, Daniel talks of multi-cloud working within the same production – does prompt a change in the security model employed. Previously, a security perimeter would be set up around a location, say a building or department, to keep the assets held within safe. They could then be securely transferred to another party who had their own perimeter. Now that assets are in the cloud, they may be accessed by multiple parties. Although this may not always happen simultaneously, it will be true through the life of the asset. Security perimeters can be made to work in the cloud, but they don’t offer the fine-grained control that’s really needed: the ability to restrict the type of access, as well as who has access, on a file-by-file basis. Moreover, as workflows are flexible, these security controls need to be modified throughout the project and, often, by the software-defined workflows themselves without human intervention. There is plenty of work to do to make this vision a reality.
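To make the idea of per-asset, per-operation access concrete, here is a minimal sketch, assuming a simple grant table; a real zero-trust system would use signed policies and identity providers, and all names here are hypothetical:

```python
from collections import defaultdict

# Hypothetical sketch of fine-grained access control: permissions are held
# per (asset, party) pair and per operation, rather than by perimeter.
class AssetACL:
    def __init__(self):
        # (asset_id, party) -> set of allowed operations, e.g. {"read"}
        self._grants = defaultdict(set)

    def grant(self, asset_id: str, party: str, *ops: str) -> None:
        self._grants[(asset_id, party)].update(ops)

    def revoke(self, asset_id: str, party: str) -> None:
        self._grants.pop((asset_id, party), None)

    def allowed(self, asset_id: str, party: str, op: str) -> bool:
        return op in self._grants[(asset_id, party)]

acl = AssetACL()
# A software-defined workflow could call grant()/revoke() itself as the
# project moves between stages, with no human intervention.
acl.grant("shot_001.exr", "vfx_vendor_a", "read", "write")
acl.grant("shot_001.exr", "studio_exec", "read")

print(acl.allowed("shot_001.exr", "vfx_vendor_a", "write"))  # True
print(acl.allowed("shot_001.exr", "studio_exec", "write"))   # False
```

The key contrast with a perimeter model is that revoking one party’s access to one file needs no change to anything else, which is what lets the workflow itself adjust permissions as the project moves between stages.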

The Q&A section covers feedback from the industry into these proposals and specifications, the fact that they will be openly accessible, and a question on the costs of moving into the cloud. On the latter topic, Daniel said that although costs do increase, they are offset when you drop on-premise costs such as rent and utilities. Tiered storage costs in the cloud will be managed by the workflows themselves, just as MAMs currently manage asset distribution between online, near-line and LTO storage.

The session finishes by discussing how SDWs will help with automation and spotting problems, current gaps in cloud workflow tech (12-bit colour grading and review, to name but one) and VFX workflows.

Watch now!
Speakers

Jim Helman
MovieLabs CTO
Daniel De La Rosa
Sony Pictures’ Vice President of Post Production, Technology Development
Annie Chang
Universal Pictures Vice President of Creative Technologies
Moderator: Greg Ciaccio
ASC Motion Imaging Technology Council Workflow Chair
EP and Head of Production Technology & Post, ETC at the University of Southern California

Video: In Stadium Production Workflow and COVID 19

As COVID-19 has wrought many changes on society, this webinar looks at how it’s changed the way live sports is produced by the in-stadium crews making the TV shows that cover the ups and downs of our beloved teams. We know that crews have had to be creative, but what has that actually looked like, and has it all been negative?

The SMPTE Philadelphia section first invites Martin Otremsky to talk about how his work within MLB has changed. Before the beginning of the season, Martin and his team weren’t allowed in the stadium and had very little notice of the first game they were allowed to prepare for.

Crowd noise was a big issue. Not only was there concern that the players would find a silent stadium off-putting, but it was soon realised that swearing and tactics chat could easily be heard by other players and picked up by the mics. Bringing back crowd noise helped mask that and allowed the teams to talk normally.

The crowd noise was run off three iPads. One which ran a 3-hour loop of general crowd noise and then two which had touch screen buttons to trigger different types/moods of effects dealing with the noises of anticipation and reactions, when the crowd would be at ease and when they would be ‘pumped’. This was run on the fly by two operators who would keep this going throughout the game. The crowd noise required a fair bit of fine-tuning including getting the in-stadium acoustics right as the speaker system is set up to assume the stands are lined with absorbent people; without a crowd, the acoustics are much more echoey.

Link to video

The COVID protections dealt with people in three tiers. Tier 1 was for the players and coaches, all of whom were tested every 48 hours, and covered the areas where they needed to be. Tier 3 was for people who could go anywhere not reserved for Tier 1. Tier 2 comprised one or two people who were allowed in both areas, but under extreme restrictions. As such, the crew in Tier 3 found it hard to do a lot of maintenance and configuration in certain places, as they had to leave every time Tier 1 staff needed to be in the area.

The operation itself had been pared down from 28 to 9 people, which was partly achieved by simplifying the programme itself. The ballpark often had to be flipped to accommodate another team using it as their ‘home’ stadium, which caused a lot of work: graphics would have to be reworked to fit the in-stadium graphics boards, crowd noise would have to cheer for a different team, and the video and production graphics had to be swapped out. Martin ends his presentation with a Q&A session.

Next up is Carl Mandell for the Philadelphia Union football/soccer team. They had been lucky enough to be at the end of a stadium technical refresh of their LED displays and their control room, moving up to 1080p60 HDR. Alas, the COVID restrictions hit just before the home opener, robbing them of the chance to use all the new equipment. Their new cameras, for instance, remained unused until right near the end of the season.

Link to video

Unlike MLB, Carl explains, they chose not to play crowd audio in the stadium itself. They ran virtual graphics to fulfil contractual obligations and to brighten up the relatively empty stadium. Unlike MLB, though, they were able to have a limited crowd in the stadium during most matches.

For the crowd noise used in the broadcast programme, they used audio taken from their ten-year archive of previous games and allowed the chants and reactions to be under the control of fans.

One COVID success was moving the press conferences onto Zoom. Whilst the spokespeople were actually in the stadium, with a PTZ camera in front of them and an 85″ TV to the side, all the media were brought in over Zoom. Carl says they found that this format produced much more engaging conferences, which increased their international appeal and raised the viewership of the press conferences.

Watch now!
Speakers

Carl Mandell
Director, Broadcasting & Video Production
Philadelphia Union
Martin Otremsky
Director of Video Engineering,
Philadelphia Phillies