The Broadcast Knowledge
Video: Production Metadata Workflows & Standards for Media & Entertainment

Posted on 25th June 2021 by Russell Trafford-Jones

Metadata has long been understood as important, particularly by archivists, but only in recent years has it received renewed focus. As the large broadcasters and media houses have expanded, and as streaming has become a growing proportion of output, it's become increasingly clear that a coherent, already-executed metadata strategy is essential. Recently, A+E had five weeks to transfer much of its inventory to Discovery+, which was achieved largely thanks to a cloud-to-cloud transfer and ingest workflow that not only checked the integrity of the media but also ensured the metadata was correctly transferred.
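The details of A+E's workflow aren't public, but the integrity check described is typically done by comparing checksums computed on each side of the transfer. A minimal sketch in Python (function names are illustrative, not from any real product):

```python
import hashlib

def file_checksum(path, algo="sha256", chunk_size=1 << 20):
    """Hash a media file in chunks so large files never load fully into memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def verify_transfer(source_digest, dest_path):
    """Compare a source-side digest against a freshly computed destination digest."""
    return file_checksum(dest_path) == source_digest
```

In a real pipeline the source digest would travel alongside the media as part of its preservation metadata, so the destination can verify the file without a second read at the source.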

In this video from SMPTE New York, we hear from WarnerMedia's Yonah Levenson and Dave Satin about metadata and workflows. Yonah starts by defining the different types of metadata: Descriptive (e.g. time of day), Technical (e.g. file size, compression), Administrative (e.g. who created it, last modified, location), Usage (e.g. when last used, how many times), Structural (e.g. page numbers, scene numbers), Preservation (e.g. location of file, checksum, retention policy) and Rights (e.g. licensing, permissions, contracts, who signed off on rights).
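Those categories lend themselves to a simple record structure. The sketch below is purely illustrative — the field names and examples are assumptions, not drawn from any WarnerMedia schema:

```python
from dataclasses import dataclass, field

# Hypothetical record grouping an asset's metadata by the categories Yonah
# describes. Field names and example values are illustrative only.
@dataclass
class AssetMetadata:
    descriptive: dict = field(default_factory=dict)    # e.g. {"time_of_day": "dusk"}
    technical: dict = field(default_factory=dict)      # e.g. {"file_size": 1024, "codec": "ProRes"}
    administrative: dict = field(default_factory=dict) # e.g. {"created_by": "editor1"}
    usage: dict = field(default_factory=dict)          # e.g. {"last_used": "2021-06-01", "times_used": 3}
    structural: dict = field(default_factory=dict)     # e.g. {"scene_number": 12}
    preservation: dict = field(default_factory=dict)   # e.g. {"checksum": "…", "retention_years": 7}
    rights: dict = field(default_factory=dict)         # e.g. {"license": "worldwide", "signed_off_by": "legal"}
```

Even a flat structure like this makes the governance point below concrete: every department fills in a different slice of the record, which is why the whole company needs a say in the schema.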

Yonah gives examples of all of the above and then discusses standards and best practices for metadata, such as starting with a spreadsheet, ensuring you get the right governance from the start, writing every decision down and getting a whole range of people involved to ensure your metadata plan is going to suit the company as a whole and not just your part of it.

Dave Satin takes the floor next and shows how the logical technical blocks of the video workflow interact with external software and services. On top of this, he overlays the people needed to run these functions as well as the problems that generally come up. His over-arching point is that complex supply chains cause complex problems. He goes on to highlight metadata pain points and the workflow problems they give rise to.

Dave finishes with a very compelling iceberg graphic showing that many of the required metadata functions lie below the surface of the obvious requirements. The trick is to identify them and adopt them into your workflows before crashing.

Yonah returns to the stage to introduce the Language Metadata Table, also known as LMT. First started in 2017, it has grown to describe 239 languages with a further 150 on the way. Speaking earlier this year, Yonah explained: “People speak different languages…you don’t know what they’re saying and we have this problem in the industry where there really hasn’t been a definitive, unified standard of language codes.” LMT addresses this problem and covers audio, visual (subtitles, narratives, sign language), content licensing rights, acquisition, distribution content and localisation preferences.
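To make the idea concrete, a language metadata lookup can be as simple as a table keyed by standard codes. The entries below are illustrative BCP 47-style examples, not taken from the actual LMT, which is maintained by MESA and SMPTE:

```python
# Illustrative language lookup in the spirit of LMT: map a code to a display
# name and the audio/visual contexts it applies to. Entries are examples only.
LANGUAGE_TABLE = {
    "en-US": {"name": "English (United States)", "contexts": ["audio", "visual"]},
    "es-419": {"name": "Spanish (Latin America)", "contexts": ["audio", "visual"]},
    "ase": {"name": "American Sign Language", "contexts": ["visual"]},
}

def lookup(code):
    """Resolve a language code, failing loudly on anything unregistered."""
    entry = LANGUAGE_TABLE.get(code)
    if entry is None:
        raise KeyError(f"Unknown language code: {code}")
    return entry
```

The value of a single unified table is exactly this: every system in the chain resolves “es-419” to the same meaning, rather than each vendor inventing its own codes.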

Standardising LMT is ongoing work between MESA and SMPTE, and the working group will meet again soon.

The session closes with a long Q&A which includes discussion of camera-originated metadata, maximising metadata, when to start collecting metadata (when money starts being spent), using metadata to tag ‘sprites’ used in films so a sprite can be localised to national sensitivities and cultural norms, finding ‘homes’ for metadata in standards like AXF, logging regional restrictions on archive news footage, differences in camera-collected metadata, the importance of having the right metadata to hand for obituaries, and metadata preservation for archives intended to last 150 years into the future.

Watch now!
Speakers

Will Kreth
Meta Data Systems & Simply.TV
DDEX WG co-chair
Yonah Levenson
Manager, Metadata Strategy & Taxonomy Governance,
WarnerMedia / HBOMax
James Snyder
Library of Congress and SMPTE board member
Dave Satin
Digital Imaging Technician, HDR & 4K expert

Video: Video Fundamentals in Depth

Posted on 23rd April 2021 by Russell Trafford-Jones

Whether your productivity has gone up or down during the Covid lockdowns, most people seem busier than ever, making it all the harder to gain the depth of knowledge needed to go from knowing how to get things done to knowing why they work, and being able to adapt to new demands. However you work with video, whether in the edit suite, in a transmission MCR or otherwise, knowing how video is processed, represented and stored is really valuable and can speed up your work and fault-finding.

Noah Chamow from the Assistant Editors’ Bootcamp is here with a true video fundamentals course. In four sections, this three-hour course takes you through how images are structured, compressed and stored, and looks at colour. The first portion covers resolution, aspect ratio (including camera, lens and delivery aspect ratios), interlaced images, frame rates and timecode.
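Timecode, the last of those topics, is ultimately just a frame count rendered as HH:MM:SS:FF. A minimal non-drop-frame conversion, assuming an integer frame rate such as 25fps, might look like this:

```python
def frames_to_timecode(frame_count, fps=25):
    """Convert a frame count to HH:MM:SS:FF non-drop-frame timecode.
    Only valid for integer frame rates; 29.97fps drop-frame needs extra logic."""
    frames = frame_count % fps
    total_seconds = frame_count // fps
    seconds = total_seconds % 60
    minutes = (total_seconds // 60) % 60
    hours = total_seconds // 3600
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"
```

For example, frame 90000 at 25fps is exactly one hour: `01:00:00:00`.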

The codecs section explains the basics of compression and then moves on to post-production codecs such as ProRes and DNxHD before looking at the bitrates involved. Colour is a big topic which Noah introduces with standard 2D and 3D colour charts, building up to the idea of colour depth such as 8-bit, 10-bit etc. If you’re interested in more detail on colour spaces and fundamentals, this SMPTE deep dive should deliver the goods. Noah moves on to HDR next, talking about both the extended luminance range and the Wide Colour Gamut aspects such as BT.2020, which all links back to his earlier section on image bit depth.
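The arithmetic behind bit depth is simple enough to sketch: each extra bit doubles the number of code values per channel, which is why 10-bit grades show far less banding than 8-bit ones.

```python
def levels(bit_depth):
    """Number of code values per colour channel at a given bit depth."""
    return 2 ** bit_depth

# 8-bit video has 256 levels per channel; 10-bit has 1024, i.e. four times
# finer gradation across the same luminance range.
for depth in (8, 10, 12):
    print(f"{depth}-bit: {levels(depth)} levels per channel")
```

Those per-channel counts compound across three channels, so 8-bit yields roughly 16.7 million colours while 10-bit yields over a billion.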

For a deeper dive into codecs, check out this SMPTE video and for a closer look at HDR, Tektronix’s Steve Holmes does so in this video.

Important on the post-production side of the industry is an understanding of formats like Blackmagic RAW and ARRIRAW, which are discussed next. These are large files that can push computers to their limits and need processing such as debayering. Chroma subsampling is another important concept, as all of the video reaching viewers has had colour information removed. If you’re editing, you need to understand this so you know which files will give you the best output. If you’re in contribution or transmission, it’s important to know the difference between video coming to you as 4:2:2 or 4:2:0. Noah explains what all this means and why we do it in the first place.
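The savings from chroma subsampling can be worked out over a 2×2 pixel block: every scheme keeps all four luma samples, and subsampling only discards chroma. A quick illustration:

```python
def relative_size(scheme):
    """Raw sample count relative to 4:4:4, computed over a 2x2 pixel block.
    All schemes keep the 4 luma samples; only chroma (Cb + Cr) is reduced."""
    chroma_samples = {"4:4:4": 8, "4:2:2": 4, "4:2:0": 2}[scheme]
    return (4 + chroma_samples) / 12

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    print(scheme, f"{relative_size(scheme):.0%}")
```

So before any codec is even applied, 4:2:2 carries two thirds of the raw data of 4:4:4 and 4:2:0 carries exactly half, with little visible loss because our eyes resolve brightness more finely than colour.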

The video concludes by looking at video containers. Noah starts by talking about the MOV, also known as ‘QuickTime’, container. If you have an ‘off-beat’ sense of humour, you may enjoy this video on containers and codecs from Adult Swim. With a lot more decorum, Noah covers MXF and moving between container formats in your editing software.

Whether you’ve been asking “Which codec should I use?”, struggling to understand steps in your workflow or just trying to find ways to get your work done better or faster, this training session should help deepen your understanding of video, giving you the context that you need.

Watch now!
Speaker

Noah Chamow
Co-Founder, Assistant Editors’ Bootcamp
Assistant Online Editor, Level 3 Post

Video: Audio Metadata over IP

Posted on 6th August 2020 by Russell Trafford-Jones

Next-Generation Audio is gradually becoming this generation’s audio as new technologies seep into the mainstream. Dolby Atmos is one example of a technology being added to more and more services, going way beyond stereo and even 5.1 surround sound. But these technologies don’t just rely on audio; they need data, too, to allow the decoders to understand the sound and apply the needed processing. It’s essential that this data, called metadata, keeps in step with the audio and, indeed, that it gets there in the first place.

Dolby have long used metadata along with surround sound to maintain the context in which the recording was mastered. There’s no way for the receiver to know what maximum audio level the recording was mixed to without being told, for instance. With NGA, the metadata needed can be much more complex. With Dolby Atmos, for example, the audio objects need position information along with the mastering information needed for surround sound.
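As a purely hypothetical illustration of why object audio needs richer metadata than channel-based surround, consider the fields a renderer might want per object. None of these names come from Dolby’s actual Atmos definitions; they simply show the two kinds of information the text describes, positional and mastering:

```python
from dataclasses import dataclass

# Hypothetical object-audio metadata: field names are illustrative only.
@dataclass
class AudioObjectMetadata:
    object_id: int
    x: float        # left/right position, -1.0 .. 1.0
    y: float        # front/back position, -1.0 .. 1.0
    z: float        # height, 0.0 (ear level) .. 1.0 (overhead)
    gain_db: float  # per-object gain applied at render time

@dataclass
class MasteringMetadata:
    dialnorm_db: float     # average dialogue loudness the mix was mastered to
    max_level_dbfs: float  # maximum level; a receiver can't know this unless told
```

The positional fields must update dynamically as objects move, which is exactly why keeping metadata in step with the audio matters so much more for NGA than it did for static surround mixes.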

Kent Terry from Dolby Laboratories joins us to discuss the methods, both current and future, that we can use to convey metadata from point to point in the broadcast chain. He starts by looking at the tried and trusted methods of carrying data within the audio of SDI. This is the way that Dolby E and Dolby D are carried: as data within what appears to be an AES3 stream. There are two SMPTE standards for doing this in a sample-accurate fashion, ST 2109 and ST 2116.

SMPTE ST 2109 allows metadata to be carried over an AES3 channel using SMPTE ST 337, the standard which defines how to put compressed audio over AES3, which would normally be expected to carry PCM audio data. This allows any metadata at all to be carried. SMPTE ST 2116, similarly, defines metadata transport over AES3, but specifically for ITU-R BS.2125 and BS.2076, which define how to carry the Audio Definition Model.
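A receiver tells such a data burst apart from PCM by its preamble: in 16-bit mode, ST 337 bursts begin with the sync words Pa = 0xF872 and Pb = 0x4E1F, followed by a data-type word (Pc) and a length word (Pd). A minimal detection sketch over a list of 16-bit sample words:

```python
# Sync words that open a SMPTE ST 337 data burst in 16-bit mode.
PA, PB = 0xF872, 0x4E1F

def find_data_burst(samples):
    """Return the index of the first ST 337 preamble in a list of 16-bit
    words, or None if the channel appears to carry ordinary PCM audio."""
    for i in range(len(samples) - 3):
        if samples[i] == PA and samples[i + 1] == PB:
            # samples[i + 2] (Pc) identifies the payload -- compressed audio,
            # or arbitrary metadata in the case of ST 2109 -- and
            # samples[i + 3] (Pd) gives the burst length in bits.
            return i
    return None
```

This is why equipment that blindly treats the channel as PCM produces the familiar harsh noise: the burst is valid AES3 data, just not audio.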

The motivation for these standards is to enable live workflows, which currently don’t have a good way of delivering live metadata. There are a few types of metadata worth distinguishing: static metadata, which doesn’t change during the programme, such as the number of channels or the sample rate; dynamic metadata, such as spatial location and dialogue levels; and, importantly, optional metadata versus required metadata, the latter being essential for the technology to function.

Kent says that live productions are held back in their choice of NGA technologies by the limitations of metadata carriage and this is one reason that work is being done in the IP space to create similar standards for all-IP programme production.

For IP there are two approaches. The first is to define a way to send metadata separately from the AES67 audio found within SMPTE ST 2110-30, which is done with the new AES standard AES-X242. The other way being developed uses SMPTE ST 2110-41, which allows any metadata (not solely ST 291 ancillary data) to be carried in a time-synchronised way with the other 2110 essences. Both of these methods, Kent explains, are actively being developed and are open to input from users.

Watch now!
Speaker

Kent Terry
Snr. Manager, Sound Technology, Office of the CTO,
Dolby Laboratories

Video: ST 2110-41 Fast Metadata – Under the Hood and Applications

Posted on 1st July 2020 by Russell Trafford-Jones

Why would you decode 12Gbps of data just to monitor for an occasional SCTE switching signal or closed captions? SMPTE ST 2110 separates video, audio and metadata, freeing up workflows to be more flexible. But so far, SMPTE 2110 only defines how to carry the metadata used in SDI under SMPTE ST 291. The point of freeing up workflows is to allow innovation, so this talk looks at SMPTE ST 2110-41, which allows more types of data to be carried, time-synchronised with the other essences.

Paul Briscoe joins us at the podium to explain why we need to extend SMPTE ST 2110-40 beyond SDI-compatible metadata. He starts by explaining how ST 2110 was intended to work well with SDI: since SDI already has a robust system for ancillary data (ANC data), it made sense for 2110 to support that first in a 100% compatible way. One of the key characteristics of metadata is that it’s synchronised to the other media, and we expect decoders to be selective about what they act on, explains Paul. The problem that he sees, however, is that if you wanted to send some XML, JSON or other data which has never been included in the SDI signal, there is no way to send it within 2110 and keep the synchronisation. To prove his point, Paul puts up the structure of ST 291M in 2110-40, which still has references to lines and horizontal offsets. Furthermore, he points out that future 2110 workflows aren’t going to need to be tied to SDI-compatible metadata.

https://www.youtube.com/watch?v=F0B2yJFXdrE

The Fast Metadata (FMX) proposal is to allow arbitrary metadata streams which can be time-aligned with a media stream. It would provide a standardised encoding method, be extensible, minimise IP address use and retain the option of interoperability with ST 291 if needed.

Having chosen the KLV data structure, which is well known and scalable, the proposal provides for this to be delivered with a timestamp, and even delivered early so that processing can be done ahead of time. This opens the door to carrying XML, JSON and other data structures.
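KLV itself is straightforward: a 16-byte SMPTE Universal Label as the key, a BER-encoded length, then the value. A minimal encoder/decoder sketch (the key used in testing is a placeholder, not a registered UL):

```python
def ber_length(n):
    """BER length encoding: short form for values < 128, long form otherwise."""
    if n < 128:
        return bytes([n])
    body = n.to_bytes((n.bit_length() + 7) // 8, "big")
    return bytes([0x80 | len(body)]) + body

def klv_encode(key, value):
    """Wrap a value in a KLV packet under a 16-byte Universal Label key."""
    assert len(key) == 16, "SMPTE KLV keys are 16-byte Universal Labels"
    return key + ber_length(len(value)) + value

def klv_decode(packet):
    """Return (key, value) from the front of a KLV packet."""
    key, rest = packet[:16], packet[16:]
    first = rest[0]
    if first < 128:
        length, offset = first, 1
    else:
        n = first & 0x7F
        length, offset = int.from_bytes(rest[1:1 + n], "big"), 1 + n
    return key, rest[offset:offset + length]
```

Because the value is an opaque byte string, a payload of XML or JSON rides through unchanged, which is exactly the extensibility the proposal is after.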

Paul explains how this has been implemented as an RTP stream, hence using RTP timestamps. Within the stream there are ‘blobs’ of data. Paul explains the structure of a blob and how payloads which are smaller than a blob, as well as those which are larger, are dealt with. Buffering and SDPs need to be supported; after all, this is a 2110 stream.
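The alignment described here hangs off the standard RTP header (RFC 3550), whose 32-bit timestamp is what a receiver uses to line the metadata up with the other essences. Parsing the fixed 12-byte header is a one-liner with struct:

```python
import struct

def parse_rtp_header(packet):
    """Parse the fixed 12-byte RTP header (RFC 3550)."""
    b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,         # always 2 for RTP
        "marker": (b1 >> 7) & 1,
        "payload_type": b1 & 0x7F,  # dynamic payload types are 96-127
        "sequence": seq,
        "timestamp": timestamp,     # media-clock units used for alignment
        "ssrc": ssrc,
    }
```

A monitoring tool only needs this much parsing to follow a metadata stream, which is the whole point of not burying the data inside 12Gbps of video.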

After going into the detail of 2110-41, Paul explains there is also a 2110-42 standard which can carry technical metadata about the stream. It provides in-band messaging of important stream details but is not intended to replace IS-04 or SDPs.

Find out more and Watch now!
Download the presentation
Speakers

Paul Briscoe
Televisionary Consulting
