Webinar: Networking Fundamentals


Date: Thursday 12th December, 1pm EST / 18:00 GMT

Networking is increasingly important throughout the broadcast chain. This webcast picks out the fundamentals that underpin SMPTE ST 2110 and that help deliver video streaming services. We’ll piece them together and explain how they work, leaving you with more confidence in talking about and working with technologies such as multicast video and HTTP Live Streaming (HLS).
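As a taste of the HLS side of those fundamentals, here is a minimal sketch (in Python, with an invented playlist) of how a client reads an HLS media playlist to find the segments it should fetch:

```python
# A minimal sketch of how an HLS client reads a media playlist (.m3u8).
# The playlist text below is invented for illustration.
def parse_media_playlist(text):
    """Return (target_duration, [(duration_seconds, uri), ...])."""
    target = None
    segments = []
    pending = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("#EXT-X-TARGETDURATION:"):
            target = int(line.split(":", 1)[1])
        elif line.startswith("#EXTINF:"):
            # Duration of the next segment, e.g. "#EXTINF:6.0,"
            pending = float(line.split(":", 1)[1].split(",")[0])
        elif line and not line.startswith("#"):
            segments.append((pending, line))
            pending = None
    return target, segments

playlist = """#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXTINF:6.0,
seg0.ts
#EXTINF:6.0,
seg1.ts
#EXT-X-ENDLIST"""

target, segs = parse_media_playlist(playlist)
# → target 6, two 6-second segments: seg0.ts and seg1.ts
```

A real player does much more (variant playlists, live reloads, encryption), but this is the core loop: read the playlist, fetch segments in order.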

Register now!
Speaker

Russell Trafford-Jones
Editor, https://TheBroadcastKnowledge.com
Manager, Support & Services, Techex

Video: SMPTE Technical Primers

The Broadcast Knowledge exists to help individuals up-skill, whatever their starting point. Videos like this, which give an introduction to a large number of topics, are far too rare. For those starting out, or those who need to revise a topic, this really hits the mark, particularly as it covers many new topics.

John Mailhot takes the lead on SMPTE 2110 explaining that it’s built on separate media (essence) flows. He covers how synchronisation is maintained and also gives an overview of the many parts of the SMPTE ST 2110 suite. He talks in more detail about the audio and metadata parts of the standard suite.

Eric Gsell discusses digital archiving and the considerations that come with deciding which formats to use. He explains colour, the CIE model and the colour spaces we use, such as 709, 2100 and P3, before turning to file formats. With the advent of HDR video and displays which can show very bright video, Eric takes some time to explain why this could represent a problem for visual health, as we don't fully understand how displays and the eye interact with this type of material. He finishes off by explaining the different ways of measuring the light output of displays and their standardisation.
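For the curious, the colour spaces Eric mentions differ in where their red, green and blue primaries sit on the CIE diagram. A small sketch (chromaticity values from the published standards; the P3 figures here are the common D65 "Display P3" variant) shows how much wider the newer gamuts are:

```python
# CIE xy chromaticities of the primaries for the colour spaces mentioned.
PRIMARIES = {
    "Rec.709":       {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060)},
    "Rec.2020/2100": {"R": (0.708, 0.292), "G": (0.170, 0.797), "B": (0.131, 0.046)},
    "P3":            {"R": (0.680, 0.320), "G": (0.265, 0.690), "B": (0.150, 0.060)},
}

def gamut_area(p):
    """Area of the triangle the primaries span on the CIE xy diagram,
    a rough proxy for how wide the gamut is."""
    (x1, y1), (x2, y2), (x3, y3) = p["R"], p["G"], p["B"]
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2

# Rec.2020/2100 spans a noticeably larger triangle than P3, which in
# turn is wider than Rec.709:
for name, prims in PRIMARIES.items():
    print(name, round(gamut_area(prims), 4))
```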

Yvonne Thomas talks about the cloud, starting by explaining the difference between platform as a service (PaaS), infrastructure as a service (IaaS) and similar cloud terms. As cloud migrations are forecast to grow significantly, Yvonne looks at the drivers behind this and the benefits the cloud can bring when used in the right way. Using the cloud, Yvonne shows, can be an opportunity for improving workflows and adding more feedback and iterative refinement into your products and infrastructure.

Looking at video deployments in the cloud, Yvonne introduces the video codecs AV1 and VVC, both, in their own way, successors to HEVC/H.265, as well as the two transport protocols SRT and RIST, which exist to reliably send video with low latency over lossy networks such as the internet. To learn more about these protocols, check out this popular talk on RIST by Merrick Ackermans and this SRT Overview.
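This is neither SRT's nor RIST's actual wire protocol, but the core idea both share, a receiver spotting gaps in sequence numbers and requesting retransmission of only the missing packets, can be sketched in a few lines:

```python
# Toy sketch of NACK-based retransmission (the idea behind SRT and RIST,
# not their real protocols): the receiver inspects the RTP-style sequence
# numbers it has seen and works out which packets to ask the sender for.
def find_missing(received_seqs):
    """Given the sequence numbers received so far, return those to NACK."""
    if not received_seqs:
        return []
    lo, hi = min(received_seqs), max(received_seqs)
    seen = set(received_seqs)
    return [s for s in range(lo, hi + 1) if s not in seen]

# Packets 3 and 5 were lost in transit:
print(find_missing([1, 2, 4, 6, 7]))  # → [3, 5]
```

The sender keeps a short buffer of recent packets so it can resend on request; the size of that buffer is what bounds the added latency.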

Rounding off the primer is Linda Gedemer from Source Sound VR, who introduces immersive audio, measuring sound output (SPL) from speakers and the interesting problem of front speakers in cinemas. These have long been behind the screen, which has meant the screens have to be perforated to let the sound through, which in turn interferes with the sound itself. Now that cinema screens are changing to solid screens, not completely dissimilar to large outdoor video displays, the speakers are having to move. With them out of the line of sight, how can we keep the sound in the right place for the audience?

This video is a great summary of many of the key challenges in the industry and works well for beginners and those who just need to keep up.

Watch now!
Speakers

John Mailhot
Systems Architect for IP Convergence,
Imagine Communications
Eric Gsell
Staff Engineer,
Dolby Laboratories
Linda Gedemer, PhD
Technical Director, VR Audio Evangelist
Source Sound VR
Yvonne Thomas
Strategic Technologist
Digital TV Group

Video: How speakers and sound systems work: Fundamentals, plus Broadcast and Cinema Implementations

Many of us know how speakers work, but when it comes to phased arrays or object audio we can lose our footing. Wherever you are on that spectrum, this dive into speakers and sound systems will be beneficial.

Ken Hunold from Dolby Laboratories starts this talk with a short history of sound in both film and TV unveiling the surprising facts that film reverted from stereo back to mono around the 1950s and TV stayed mono right up until the 80s. We follow this history up to now with the latest immersive sound systems and multi-channel sound in broadcasting.

Whilst the basics of speakers are fairly widely known, Ken starts by looking at how a basic speaker is set up and the different shapes and versions of speakers and their enclosures, then looks at column speakers and line arrays.

Multichannel home audio continues to offer many options for speaker positioning and speaker type including bouncing audio off the ceilings, so Ken explores these options and compares them including the relatively recent sound bars.

Cinema sound has always been critical to the effect of cinema and foundational to the motivation for people to come together and watch films away from their TVs. There have long been many speakers in cinemas and Ken charts how this has changed as immersive audio has arrived and enabled an illusion of infinite speakers with sound all around.

In the live entertainment space, sound is, again, different: the scale is often much bigger and the acoustics very different. Ken talks about the challenges of delivering sound to so many people, keeping the sound even throughout the auditorium and dealing with the delay of the relatively slow-moving sound waves. The talk wraps up with questions and answers.
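The delay problem Ken describes comes straight from physics: sound covers only about 343 metres per second in air (at around 20 °C), so listeners far from the stage hear everything late, and delay towers must be time-aligned accordingly. A back-of-envelope sketch:

```python
# Back-of-envelope arrival delay for a listener at a given distance from
# the main PA. 343 m/s is an approximation for air at room temperature.
SPEED_OF_SOUND = 343.0  # metres per second

def arrival_delay_ms(distance_m):
    """Milliseconds the direct sound takes to reach a listener."""
    return distance_m / SPEED_OF_SOUND * 1000.0

print(round(arrival_delay_ms(34.3)))   # → 100 ms at ~34 m
print(round(arrival_delay_ms(100.0)))  # → 292 ms at 100 m
```

Nearly a third of a second at 100 metres explains why delay towers in large venues are deliberately delayed, so their output arrives together with, not ahead of, the sound from the stage.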

Watch now!

Speakers

Ken Hunold
Sr. Broadcast Services Manager, Customer Engineering
Dolby Laboratories, Inc.

Video: Intro to 4K Video & HDR

With all the talk of IP, you’d be wrong to think SDI is dead. 12G for 4K is alive and well in many places, so there’s plenty of appetite to understand how it works and how to diagnose problems.

In this double-header, Steve Holmes from Tektronix takes us through the ins and outs of HDR and also SDI for HDR at the SMPTE SF section.

Steve starts with his eye on the SMPTE standards for UHD SDI video, looking at video resolutions and seeing that a UHD picture can be made up of 4 HD pictures, which gives rise to two well-known formats: 'Quad split' and '2SI' (2 Sample Interleave).
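To make the difference concrete, here is a toy sketch of how a single UHD (3840x2160) sample might map to one of the four HD sub-images under each scheme. The sub-image numbering is illustrative only; the exact link ordering is defined in the SMPTE standards.

```python
# Toy mapping of a UHD sample at (x, y) to one of four HD sub-images.
def quad_split_subimage(x, y):
    """Quad split: 0=top-left, 1=top-right, 2=bottom-left, 3=bottom-right."""
    return (1 if x >= 1920 else 0) + (2 if y >= 1080 else 0)

def two_si_subimage(x, y):
    """2SI: alternate samples horizontally and vertically go to each link,
    so every sub-image is a full-frame picture at quarter resolution."""
    return (y % 2) * 2 + (x % 2)

# The same UHD pixel lands on different links in the two schemes:
print(quad_split_subimage(2000, 100))  # → 1 (top-right quadrant)
print(two_si_subimage(2000, 100))      # → 0 (even column, even row)
```

This is also why a 2SI link failure degrades the whole picture gently, while losing a quad-split link blanks an entire quadrant.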

Colour is the next focus, with a discussion of the different colour spaces that UHD is delivered in (spoiler: they're all in use), what these look like on the vectorscope, and a look at the different primaries. Finishing up with a roundup and a look at inter-link timing, there's a short break before hitting the next topic: HDR.

Watch now!

High Dynamic Range is an important technology which is still gaining adoption and is often provided with 4K programmes. Steve defines the two places HDR is important: in the acquisition and in the display of the video, then provides a handy lookup table of terms such as HDR, WCG, PQ, HDR10, DMCVT and more.

Steve gives us a primer on what HDR is in terms of brightness, measured in 'nits', how these levels relate to real life and how we talk about them with respect to displays. We then look at HDR on the waveform monitor and at features of waveform monitors, such as false colour, which allow engineers to visualise and check HDR.
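The curve that lets a waveform monitor label HDR signal levels in nits is the PQ EOTF from SMPTE ST 2084. Its published constants can be expressed in a few lines of Python:

```python
# PQ (SMPTE ST 2084) EOTF: maps a non-linear signal value in [0, 1]
# to absolute display luminance in nits (cd/m^2). Constants are the
# exact rational values from the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf_nits(signal):
    """Luminance in nits for a PQ code value in [0, 1]."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

print(round(pq_eotf_nits(1.0)))  # → 10000 nits at full signal
print(round(pq_eotf_nits(0.0)))  # → 0
```

Because the curve spends most of its code values on the dark end, where the eye is most sensitive, tools like false colour are so useful: they map these perceptually spaced levels back to something an engineer can read at a glance.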

The topic of gamma, EOTFs and colour spaces comes up next and is well explained, building on what came earlier. Before the final demo and Q&A, Steve talks about the different ways to grade pictures when working in HDR.

A great intro to the topics at hand – just like Steve’s last one: Uncompressed Video over IP & PTP Timing

Watch now!

Speakers

Steve Holmes
Senior Applications Engineer,
Tektronix