Video: An introduction to Biological Compression

The search for better codecs is everlasting, so it’s no surprise that with AI’s recent advances we now see codecs based on AI/machine learning. The AI approach not only frees the maths from, say, upscaling with a fixed algorithm, letting the encoder do it however it sees fit, but also gives it a holistic view of the image.

Considering the image as a whole whilst encoding allows the encoder to better apportion bitrate and detail to the areas that need it, whereas other codecs have trouble breaking out of the procedural ‘one block at a time’ mode which tends to treat each macroblock separately.
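As a toy illustration of content-aware bitrate allocation (not Deep Render’s actual method), the sketch below spends more of a bit budget on high-activity blocks than a uniform per-macroblock split would; the variance figures and log-variance weighting are invented for the example.

```python
import numpy as np

def allocate_bits(block_variances, total_bits):
    """Split a bit budget across image blocks in proportion to their
    activity (log-variance), rather than giving every block an equal
    share as a naive block-by-block encoder might."""
    activity = np.log1p(block_variances)
    weights = activity / activity.sum()
    return np.round(weights * total_bits).astype(int)

# Four blocks: flat sky, texture, flat wall, detailed foliage.
variances = np.array([0.1, 5.0, 0.2, 12.0])
bits = allocate_bits(variances, 1000)
```

With these numbers the detailed foliage block receives far more bits than the flat sky, which is the behaviour a holistic encoder is after.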

Christian Besenbruch, co-founder of Deep Render AI, gives us examples of his company’s ‘biological compression’ codec against the latest BPG codec, an HEVC-based still-image codec which delivers smaller images than JPEG.

Watch now!
Speaker

Christian Besenbruch
Co-Founder,
Deep Render AI

Video: IP Test and Measurement for ST 2110 Systems

The transition to IP-based transport for video, audio and data continues. Early adopters have already demonstrated the operational and commercial benefits of COTS IP infrastructure, and the SMPTE ST 2110 suite of video-over-IP standards is now maturing. However, configuring and troubleshooting IP systems requires a completely new skill set: broadcast engineers need to gain an understanding of the technology and the new techniques required to monitor these signals.

In this video Kevin Salvidge from Leader shows what test and measurement tools you need to ensure you continue to deliver the same quality of service that can be achieved with SDI systems.

Kevin looks at the main differences between traditional and IP systems, which stem as much from the move from synchronous to asynchronous infrastructure as from the way you measure how well the system is working.

The following topics are covered:

  • Frame Check Sequence (FCS), Cyclic Redundancy Check (CRC)
  • Packet jitter measurement (avoiding buffer underrun)
  • Monitoring ST 2022-7 path delay between the two feeds
  • PTP synchronisation (offset and delay graphs, synchronisation accuracy)
  • Checking that video, audio and ANC signals are synchronised with PTP and RTP timing measurement
  • Packet Header Information looking at MAC, IP, UDP, RTP as well as the payload
  • SFP Information (10/25 Gb, multimode / single mode etc.)
  • IP Event Log e.g. Grand Master change
  • Hybrid IP and SDI Video and Audio Test and Measurement
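Two of the checks above can be sketched in Python. The first recomputes an Ethernet-style CRC-32 (the algorithm behind the Frame Check Sequence) using `zlib.crc32`; the second derives PTP offset and path delay from the four standard Sync/Delay_Req timestamps. Both are simplified illustrations, not code from a real analyser.

```python
import zlib

def fcs_ok(frame: bytes, received_fcs: int) -> bool:
    """Recompute CRC-32 over the frame body and compare it with the
    FCS value carried at the end of the Ethernet frame."""
    return (zlib.crc32(frame) & 0xFFFFFFFF) == received_fcs

def ptp_offset_and_delay(t1, t2, t3, t4):
    """Classic PTP maths:
    t1: master sends Sync,      t2: slave receives it,
    t3: slave sends Delay_Req,  t4: master receives it.
    Assumes a symmetric path, as PTP itself does."""
    offset = ((t2 - t1) - (t4 - t3)) / 2
    delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, delay
```

A monitoring tool plots `offset` and `delay` over time; a flat offset near zero means the device is locked to the Grand Master.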

You can see the slides here.

Watch now!

Speaker

Kevin Salvidge
European Regional Development Manager
Leader

Video: Understanding esports production

Esports is here to stay and brings a new dimension to big events, combining the usual challenges of producing and broadcasting events at scale with less usual ones such as non-standard resolutions and frame rates. This session from the IBC 2019 conference looks at the reality of bringing such events to life.

The talk starts with a brief introduction to some Esports-only terms before heading into the discussion, starting with Simon Eicher, who talks about his switch towards typical broadcast tools for Esports, which has helped drive better production values and storytelling. Maxwell Trauss from Riot Games explains how they incubated a group of great producers and were able to keep production values high by having them work on shows remotely worldwide.

Blizzard uses the technique of producing a clean ‘world feed’ which is shared worldwide for each region to regionalise with graphics and language before broadcasting it. In terms of creating better storytelling, Blizzard have their own software which interprets the game data and presents it in a more consumable way to the production staff.

Observers are the people who control the in-game cameras, and a producer can call out to any one of them. The panel discusses how separating the players, the observers and the crowd allows them to change the delay between what’s happening in the game and each of these groups seeing it. At the beginning of an event, this creates the opportunity to move the crowd backwards in time so that players don’t get tipped off; similarly, the players can be isolated from the observers for the same effect. By the end of the game, however, the delays have been changed to bring everyone back into present time for a tense finale.
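The delay trick boils down to a simple timing calculation. The sketch below assumes (hypothetically) that each group starts with its own delay and that the production ramps all delays down to zero as the match progresses; the starting figures are invented for illustration.

```python
def ramped_delay(initial_delay_s: float, progress: float) -> float:
    """Linearly shrink a group's delay as the match progresses
    (progress runs from 0.0 at the start to 1.0 at the finale),
    so every group converges on real time by the end."""
    return initial_delay_s * (1.0 - progress)

def visible_game_time(now_s: float, delay_s: float) -> float:
    """Which moment of the game a group sees at wall-clock time now_s."""
    return now_s - delay_s

# Hypothetical starting delays: players see live, observers and
# the crowd sit further behind so nobody can tip the players off.
start_delays = {"players": 0.0, "observers": 5.0, "crowd": 30.0}
```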

Corey Smith from Blizzard explains their cloud setup, in which graphics are added to clean feeds in the cloud, ultimately leading to a single clean feed coming out of the venue. ESL, on the other hand, choose to create their streams locally.

Ryan Chaply from Twitch explains their engagement models, some of which reward viewers simply for watching. Twitch’s real-time chat also changes the way productions are made because the producers have direct feedback from the viewers. This leads to day-by-day tweaks to the format: a production may stop doing a certain thing by day three if it’s not well received; conversely, when something is a hit, they can capitalise on it.

Ryan also talks about what they are weighing up in deciding when to start using UHD. Riot’s Maxwell raises the question of whether fans really want 4K at the moment; while acknowledging it’s an inevitability, he asks whether the priority is actually having more and better stats.

The panel finishes with a look to the future: the continued adoption of broadcast practices in Esports, timing in the cloud, dealing with end-to-end metadata, and a video giving a taste of an Esports event.

Watch now!
Speakers

Simon Eicher
Executive Producer, Director of Broadcast, eSports Services,
ESL
Ryan Chaply
Senior Esports Program Manager,
Twitch
Corey Smith
Director, Live Operations Broadcast Technology Group,
Blizzard
Maxwell Trauss
Broadcast Architect,
Riot Games
Jens Fischer
Global Esport Specialist and Account Manager D.A.CH,
EVS

Video: A Standard for Video QoE Metrics

This video covers a standard in progress, under the CTA standards body, for quality of experience (QoE) metrics such as rebuffering time. The goal of the group is to come up with a standard set of player events, metrics and terminology around QoE in streaming. Even ‘concurrent viewers’ isn’t easy to define: if a user is paused, are they concurrently viewing the video? Buffer underruns are variously called rebuffering, stalling or waiting. The group is intentionally focussing on what viewers actually see and experience: QoS is a measurement of how well the platform is performing, which is not necessarily the same as what viewers are experiencing.

The standard works at different levels. There are player properties and events, which are standardised ways of signalling that certain things are happening, and Session Metrics, which can then feed into Aggregate Metrics. The first set of metrics includes things such as playback failure percentage, average playback stalled rate, average startup time and playback rate, with the aim of setting a baseline and starting to get feedback from companies as they implement these seemingly simple metrics.
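A minimal sketch of how Session Metrics might roll up into Aggregate Metrics, using invented field names rather than the working group’s actual terminology:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Session:
    # Per-session ("Session Metrics") figures; names are hypothetical.
    startup_time_s: float
    stalled_time_s: float
    watch_time_s: float
    failed: bool

def aggregate_metrics(sessions):
    """Roll per-session figures up into aggregates in the spirit of the
    first release: failure percentage, average startup time and
    average playback stalled rate."""
    ok = [s for s in sessions if not s.failed]
    return {
        "playback_failure_pct": 100.0 * sum(s.failed for s in sessions) / len(sessions),
        "avg_startup_time_s": mean(s.startup_time_s for s in ok),
        "avg_playback_stalled_rate": mean(
            s.stalled_time_s / s.watch_time_s for s in ok),
    }
```

Even this toy version shows why agreed definitions matter: whether a failed session counts towards average startup time, for instance, changes the numbers.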

This first release can be found on github.

Watch now!
Speaker

Steve Heffernan
Co-Founder, Head of Product,
Mux