Video: SMPTE Presents – MovieLabs 2030

Workflows changed when we moved from tapes to files, so it makes sense that they will change again as they increasingly move to the cloud. What if they changed to work in the cloud but also benefited on-prem workflows at the same time? The work at MovieLabs aims to do just that. Cloud infrastructure can be dynamic and spans companies, so data needs to be able to flow between companies easily and in an automated way. If that can be achieved, the amount of human interaction can be reduced and focused on the creative parts of content making.

This panel from the SMPTE Hollywood section meeting discusses the work underway, moderated by Greg Ciaccio of ETC. First off, Jim Helman from MovieLabs gives an overview of the 2030 vision. Kicked off at IBC 2019 with the publication of ‘Evolution of Media Creation’, 10 principles were laid out for building the future of production, covering topics like realtime engines, remote working, cloud deployments, security and access, and software-defined workflows (SDWs). This was followed by a white paper, ‘The Evolution of Production Security’, which laid out the need for a new, zero-trust approach to security. Then, in 2020, the jointly run industry lab released ‘The Evolution of Production Workflows’, which covered SDWs.

“A software-defined workflow (SDW) uses a highly configurable set of tools and processes to support creative tasks by connecting them through software-mediated collaboration and automation.”

This SDW thread of the MovieLabs 2030 vision aims to standardise workflows at the data-model level and, in the future, at the API level, so that data can be easily exchanged and understood. Annie Chang from Universal Pictures explains that the ‘Evolution of Production Workflows’ publication deals with Tasks, Assets, Relationships and Participants. If you can define your work in these four areas, you have enough information for computers to understand the workflows and external data.

This leads to the idea of building each scene in an object model showing relationships between them. The data would describe key props and stage elements (for instance, if they were important for VFX), actors and their metadata and technical information such as colour correction, camera files and production staff.
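The Task/Asset/Participant model above can be sketched in code. This is purely illustrative: the real MovieLabs ontology defines these entities formally, and all class names, fields and the role-based “view” query here are assumptions, not anything from the specification.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: the MovieLabs ontology defines these entities
# formally; every name and field below is an assumption for illustration.

@dataclass(frozen=True)
class Participant:
    name: str
    role: str            # e.g. "editor", "VFX producer"

@dataclass(frozen=True)
class Asset:
    asset_id: str
    kind: str            # e.g. "prop", "camera file", "CDL"
    location: str = ""   # storage path or URI

@dataclass
class Task:
    name: str
    participants: list = field(default_factory=list)
    assets: list = field(default_factory=list)

@dataclass
class Scene:
    scene_id: str
    tasks: list = field(default_factory=list)

    def assets_for(self, role: str) -> list:
        """All assets touched by tasks involving a given role -- one way
        to derive role-specific 'views' of the same underlying data."""
        return [a for t in self.tasks
                if any(p.role == role for p in t.participants)
                for a in t.assets]
```

Once scenes, tasks and assets are related like this, the different studio, VFX-producer and asset-curator views described below become queries over one shared model rather than separate documents.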

Once modelled, a production can be viewed in many ways. For instance, the head of a studio may just be interested in high-level stories, the people involved and distribution routes, whereas a VFX producer would have a different perspective, needing more detail about the scene. A VFX asset curator, on the other hand, would need to know about the shots right down to filenames and storage locations. This work promises to allow all of these views of the same, dynamic data. So this not only improves workflows’ portability between vendor systems but is also a way of better organising any workflow irrespective of automation. DreamWorks is currently using this method of working, with the aim of trying it out on live-action projects soon.

Annie finishes by explaining that there are efficiencies to be had in better organising assets. It will help reduce duplication, both by uncovering duplicate files and by stopping duplicate assets from being produced. AI and similar technologies will be able to sift through the information to create clips, uncover trivia and, with other types of data mining, create better outputs for inclusion in viewer content.

Sony Pictures’ Daniel De La Rosa then talks about the Avid platform in the cloud that they built in response to the Covid crisis, and how cloud infrastructure was built in order of need and, often, based on existing solutions which were scaled up. Daniel makes the point that working in the cloud is different because it’s “bringing the workflows to the data”, as opposed to the ‘old way’ where the data was brought to the machine. In fact, cloud or otherwise, with the globalisation of production there isn’t any way of doing things the ‘old way’ any more.

This reliance on the cloud – and to be clear, Daniel talks of multi-cloud working within the same production – does prompt a change in the security model employed. Previously a security perimeter would be set up around a location, say a building or department, to keep the assets held within safe. They could then be securely transferred to another party who had their own perimeter. Now, when assets are in the cloud, they may be accessed by multiple parties. Although this may not always happen simultaneously, it will be true through the life of the asset. Security perimeters can be made to work in the cloud, but they don’t offer the fine-grained control that’s really needed: the ability to restrict the type of access, as well as who has access, on a file-by-file basis. Moreover, as workflows are flexible, these security controls need to be modified throughout the project and, often, by the software-defined workflows themselves without human intervention. There is plenty of work to do to make this vision a reality.
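As a rough sketch of what per-file, per-operation control adjusted by a workflow might look like, here is a toy access-control list. It is hypothetical: the participant names, asset identifiers and permission strings are assumptions for illustration, not part of any MovieLabs specification.

```python
# Hypothetical sketch of fine-grained, per-file access control that a
# software-defined workflow could update without human intervention.
# All identifiers and permission names are assumptions, not a spec.

class AssetACL:
    def __init__(self):
        # (participant, asset_id) -> set of allowed operations
        self._grants = {}

    def grant(self, who: str, asset_id: str, *ops: str) -> None:
        self._grants.setdefault((who, asset_id), set()).update(ops)

    def revoke(self, who: str, asset_id: str) -> None:
        self._grants.pop((who, asset_id), None)

    def allowed(self, who: str, asset_id: str, op: str) -> bool:
        return op in self._grants.get((who, asset_id), set())

# A workflow step could grant read access when a task starts and
# revoke it automatically when the task completes:
acl = AssetACL()
acl.grant("vfx_vendor", "shot_042.exr", "read")
assert acl.allowed("vfx_vendor", "shot_042.exr", "read")
assert not acl.allowed("vfx_vendor", "shot_042.exr", "write")
acl.revoke("vfx_vendor", "shot_042.exr")
assert not acl.allowed("vfx_vendor", "shot_042.exr", "read")
```

The key contrast with a perimeter model is that access here is scoped to one file and one operation at a time, so the workflow itself can tighten or loosen it as tasks start and finish.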

The Q&A section covers feedback from the industry into these proposals and specifications, the fact that they will be openly accessible, and a question on the costs of moving into the cloud. On the latter topic, Daniel said that although costs do increase, they are offset when you drop on-premises costs such as rent and utilities. Tiered storage costs in the cloud will be managed by the workflows themselves, just as MAMs currently manage asset distribution between online, near-line and LTO storage.
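A workflow managing tiered storage could be as simple as an access-recency rule. This is a hedged sketch: the tier names and thresholds below are assumptions for illustration, not figures from the panel.

```python
# Trivial age-based tiering rule of the kind a workflow or MAM might
# apply. Tier names and day thresholds are illustrative assumptions.

def choose_tier(days_since_access: int) -> str:
    if days_since_access <= 7:
        return "online"     # hot object storage, instant access
    if days_since_access <= 90:
        return "near-line"  # infrequent-access tier, cheaper per GB
    return "archive"        # deep archive, the cloud LTO-equivalent

# Example: a rushes file untouched for a year drops to the archive tier.
assert choose_tier(365) == "archive"
```

In practice the rule would also weigh retrieval cost and restore latency, but the point stands: the policy lives in software, so the workflow can apply it continuously rather than waiting for a librarian.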

The session finishes by talking about how SDWs will help with automation and spotting problems, current gaps in cloud workflow tech (12-bit colour grading and review, to name but one) and VFX workflows.

Watch now!
Speakers

Jim Helman
MovieLabs CTO
Daniel De La Rosa
Sony Pictures’ Vice President of Post Production, Technology Development
Annie Chang
Universal Pictures Vice President of Creative Technologies
Moderator: Greg Ciaccio
ASC Motion Imaging Technology Council Workflow Chair
EP and Head of Production Technology & Post, ETC at University of Southern California

Video: Cloud Services for Media and Entertainment: Production and Post-Production

Many content producers and broadcasters have been forced into the cloud. Some have chosen to remotely control their on-prem kit, but many have found that the cloud has brought them benefits beyond simply keeping their existing workflows working during the pandemic.

This video from SMPTE’s New York section looks at how people moved production to the cloud and how they intend to keep it there. The first talk, from WarnerMedia’s Greg Anderson, discusses the engineering skills needed to be up to the task, concluding that there are more areas of knowledge in play than one engineer can bring to the table: from foundational elements such as security, virtualisation and networking, to DevOps skills like continuous integration and deployment (CI/CD), Active Directory and databases.

The good news is that whichever of the three levels of engineer Greg introduces, from beginner to expert, the entry points are pretty easy to access to start your journey and upskilling. Within the company, Greg says that leaders can help accelerate the transition to the cloud by allowing teams a development/PoC account which provides a ‘modest’ allowance each month for experimentation, learning and proving ideas. Not only does that give engineers good exposure to cloud skills, but it gives managers experience in modelling, monitoring and analysing costs.

Greg finishes by talking through their work with implementing a cloud workflow for HBO MAX which is currently on a private cloud and on the way to being in the public cloud. The current system provides for 300 concurrent users doing Edit, Design, Engineering and QC workflows with asset management and ingest. They are looking to the public cloud to consolidate real estate and standardise the tech stack amongst many other drivers outlined by Greg.

Scott Bounds, Architect at Microsoft Azure, talks about content creation in the cloud. The objectives for Azure are to allow worldwide collaboration, speed up time to market, allow the scaling of content creation and bring improvements in the security, reliability and accessibility of data.

For many, this starts with hybrid workflows rather than a full switch to the cloud. After all, Scott says that rough-cut editing, motion graphics and VFX are all fairly easy to implement in the cloud, whereas colour grading, online and finishing are still, for most companies, best kept on-prem. Scott talks about implementing workstations in the cloud, allowing GPU-powered workstations to be accessed using the remote desktop protocol PCoIP. This type of workflow can be automated using Azure scripting and Terraform.

John Whitehead is part of the New York Times’ Multimedia Infrastructure Engineering team, which has recently moved its live production to the cloud. Much of the NYT’s output is live events programming, such as covering press conferences. John introduces their internet-centric microservices architecture, which was already being worked on before the pandemic started.

The standard workflow was to have a stream coming into the MCR which would then get routed to an Elemental encoder for sending into the cloud, with distribution via Fastly. To be production-friendly, they had created some simple-to-use web frontends for routing. For full-time remote production, John explains, they wanted to improve their production quality by adding a vision mixer, graphics and closed captions. John details the solution they chose, which comprised cloud-first services rather than running Windows in the cloud.

The NYT was pushed into the cloud by Covid, but the move was felt to be low risk and something they were considering anyway. The pandemic forced them to recognise that the technologies they had been waiting for had already arrived; the move ended up saving on capex and delivered immediate returns on their investment.

Finishing up the presentations, Anshul Kapoor from Google Cloud presents market analysis on the current state of cloud adoption and market conditions. He says that one manifestation of the current crisis is that new live-events content is reduced, if not postponed, which is making people look to their archives. Some have not yet begun their archiving process, whilst others already have a digital archive. Google and other cloud providers can offer vast scale to process and manage archives, but also machine learning to process, make sense of and make searchable all the content.

The video ends with an extensive Q&A with the presenters.

Watch now!
Speakers

Greg Anderson
Senior Systems Engineer,
WarnerMedia
Scott Bounds
Media Cloud Architect,
Microsoft
John Whitehead
Senior Engineer, Multimedia Infrastructure Engineering,
New York Times
Anshul Kapoor
Business Development,
Google Cloud

Video: Remote editing, storage, cloud dynamics & reopening production

The rug was pulled from under the feet of the production industry due to the pandemic, both in film and television. The scramble to finish projects and to fill TV schedules has resulted in a lot of creative ideas and a surge in remote editing. This panel looks at the benefits of this work and considers whether this will continue to be done in the future when the restrictions are lifted.

In this video, we hear from Sony, Teradici, Lou Wirth Productions, EditShare and PADEM Group on the gaping hole in workflows left by the pandemic and how the industry has bridged the gap with remote editing.

Moderated by IET Media Exec director Allan McLennan from PADEM group, we hear answers to questions like “What are the challenges moving to remote editing?”, “Can Remote Editing open up diversity in this part of the industry?” and features a look to the future in terms of new technologies for meeting the streaming demand.

“One of the challenges with a technology transition is people often need a motivation”

Stephen Tallamy, EditShare

“It’s easy to keep doing the thing you used to do until you’re forced to do it,” explains EditShare’s Stephen Tallamy. But the panel doesn’t see the pandemic as just something that forced a change; rather, they see the benefits in the move towards remote editing and remote collaboration. David Rosen from Sony was positive, saying that “Creative resources can be anywhere and the elimination of having to move those people to where the content is…is a significant advantage.” From his perspective, increasing numbers of customers have cloud as part of their workflow.

“Never again.” My customers are saying, “Never again will I be in a situation where I cannot get access to my content.”

David Rosen, Sony

The panel’s discussion moves to remote editing, the practice of giving editors access to remote computers which run the editing software and have access to the relevant media. The editor’s local computer then becomes a window on to an edit suite in a different building, or in the cloud. Ian Main from Teradici explains that a company can open an edit station up to an editor who could be anywhere in the world, which is why this is such an important part of enabling work to continue in an emergency. Teradici specialises in developing and deploying high-performance remote control of PCs, and Stephen Tallamy speaks from the experience of enabling remote editing workflows with Teradici on AWS, other cloud providers and data centres.

“The production side shut down, but the post-production side accelerated.”

Ian Main, Teradici
Lou Wirth, award-winning editor and producer, joins the panel as someone who has continued to edit locally. “For producers who were forced to go into a remote editing situation, they may have always been on the fence about it”, Lou says, “…If it was a good experience, they would see the advantages of it and continue.” Indeed the consensus does seem to be that much of what’s happening now will be fed back into workflows of the future even when restrictions are lifted.

Listen to the whole discussion which includes a look ahead to IBC.

Watch now!
Speakers

Ian Main
Technical Marketing Principal,
Teradici
David Rosen
VP, Cloud Applications & Solutions,
Sony
Stephen Tallamy
Chief Technology Officer,
EditShare
Lou Wirth
Head Story Editor,
Lou Wirth Productions
Moderator: Allan McLennan
Chief Executive, Global Market Technologist, PADEM Media Group,
Executive Board Director, IET Media technology network

Video: Real-Time Remote Production For The FIFA Women’s World Cup

We hear about so many new and improved cloud products and solutions to improve production that, once in a while, you really just need to step back and hear how people have put them together. This session is just that: a look at the whole post-production workflow for FOX Sports’ production of the Women’s World Cup.

This panel from the Live Streaming Summit at Streaming Media West is led by FOX Sports’ Director of Post Production, Brandon Potter as he talks through the event with three of his key vendors, IBM Aspera, Telestream and Levels Beyond.

Brandon starts by explaining that this production stood on the back of the work they did on the Men’s World Cup in Russia, both having SDI delivery of media in PAL at the IBC (International Broadcast Centre). For this event, all the edit crew was in LA, which created problems with some fixed frame-rate products still in use in the US facility.

Data transfer, naturally, is the underpinning of any event like this, with a total of a petabyte of data being created. Network connectivity for international events is always tricky: with so many miles of cable, whether on land or under the sea, there is a very high chance of a fibre being cut. At the very least, the data can be switched to take a different path and, in that moment, there will be data loss. All of this means that you can’t assume the scale of data loss; it could be seconds, minutes or hours. On top of creating, and affording, redundant data circuits, the time needed to transfer all the data has to be considered and managed.

Ensuring complete transfer of files in a timely fashion drove the production to auto-archive all content in real time into Amazon S3, avoiding post-match ingest times of multiple hours. “Every bit of high-res content was uploaded,” stated Michael Flathers, CTO of IBM Aspera.
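The incremental-archive idea can be sketched as a small loop that uploads each completed segment exactly once as the match runs. This is an illustrative sketch, not FOX’s actual implementation: the upload function is injected to keep it cloud-agnostic, and in production it might wrap an S3 client call or an Aspera transfer.

```python
# Illustrative sketch: archive completed high-res segments continuously
# during the event, so there is no multi-hour post-match ingest. The
# 'upload' callable is an assumption standing in for a real transfer
# (e.g. an S3 PUT or an Aspera session); segment names are hypothetical.

def archive_new_segments(completed, already_uploaded, upload):
    """Upload every completed segment exactly once; return what was sent."""
    sent = []
    for seg in completed:
        if seg not in already_uploaded:
            upload(seg)                 # push to the archive bucket
            already_uploaded.add(seg)   # remember so we never re-send
            sent.append(seg)
    return sent

# Polling this as segments close means the archive is already complete
# moments after the final whistle, instead of hours later.
uploaded: set = set()
assert archive_new_segments(["m1_p1.mxf"], uploaded, lambda s: None) == ["m1_p1.mxf"]
```

The design choice worth noting is the idempotency: because the loop tracks what has already been sent, it can be re-run safely after a dropped connection without duplicating uploads.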

Dave Norman from Telestream explains how the live workflows stayed on-prem with the high-performance media and encoders and then, “as the match ended, we would then transition…into AWS”. In the cloud, the HLS proxies would then be rendered into single MP4 proxy editing files.

Daniel Gonzales explains the benefits of the full API integrations they chose to build their multi-vendor solution around, rather than simple watch-folders. Having every platform know where the errors were was very valuable, and it was particularly useful for remote users to know in detail where their files were. This reduced the number of times they needed to ask someone for help and meant that, when they did, they had a good amount of detail with which to specify the problem.

The talk comes to a close with a broad analysis of the different ways that files were moved and cached to optimise the workflow. There was a mix of TCP-style transfers and Aspera’s UDP-based transfer technology. It’s worth noting, also, that HLS manifests needed to be carefully created to reference only chunks that had been transferred, rather than simply any that had been created. Live creation of clips from growing files was also an important tool: the in- and out-points were chosen by viewing a low-latency proxy stream, then the final file was clipped from the growing file in France and delivered within minutes to LA.
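The manifest discipline described above can be sketched as a filter that stops advertising segments at the first gap, so a player never requests a chunk that hasn’t arrived. This is a simplified sketch, not the production tooling: real HLS manifests carry more tags than the two handled here, and the segment names are hypothetical.

```python
# Sketch: emit an HLS media playlist that references only chunks which
# have actually been transferred, stopping at the first gap. Simplified
# tag handling; segment filenames are illustrative assumptions.

def trim_manifest(lines, transferred):
    out, i = [], 0
    while i < len(lines):
        line = lines[i]
        if line.startswith("#EXTINF"):
            uri = lines[i + 1]          # segment URI follows its EXTINF
            if uri not in transferred:
                break                   # first missing chunk: stop here
            out += [line, uri]
            i += 2
        else:
            out.append(line)            # header/metadata lines pass through
            i += 1
    return out
```

Because the playlist is rebuilt from the set of confirmed transfers rather than from what the encoder has produced, a lagging international link shortens the advertised window instead of breaking playback.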

Overall, this case study gives a good feel for the problems and good practices which go hand in hand with multi-day events with international connectivity and shows that large-scale productions can successfully, and quickly, provide full access to all media to their production teams to maximise the material available for creative uses.

Watch now!
Speakers

Mike Flathers
CTO,
IBM Aspera
Brandon Potter
Director of Post Production,
FOX Sports
Dave Norman
Principal Sales Engineer,
Telestream
Daniel Gonzales
Senior Solutions Architect,
Levels Beyond