Video: ATSC 3.0 OTA Meets OTT

ATSC 3.0 has taken the bold move of merging RF-delivered services with internet-delivered services. Branded as ‘NextGen TV’, it embodies the idea that viewers shouldn’t need to know which path their service arrives by, a welcome shift from the days of needing to select the right input on your TV. We’ve covered the technical details of ATSC 3.0 here before, but today we’re looking at the practical side of delivering such a service.

In this Streaming Media video, Nadine Krefetz hosts a conversation with Madeleine Noland from ATSC, Todd Achilles from Evoca TV, Jim DeChant from News-Press & Gazette Broadcasting as well as Sassan Pejhan from ATEME. They start by highlighting that one reason ATSC 3.0 was developed to succeed ATSC 1.0 is that it opens up the possibility of delivering HDR and/or UHD along with Dolby Atmos.


Given ATSC 3.0 uses the same MPEG DASH delivery that online streaming services use, one question is why use ATSC 3.0 at all. The benefit of the broadcast medium is in the name: there’s extreme efficiency in reaching thousands or millions of people with one transmitter, which ATSC 3.0 uses to its advantage. In ATSC 3.0’s case, transmitters typically reach 40 miles. The panel discusses the way in which you can split up your bandwidth to deliver different services with different levels of robustness. Doing this means you can have a service that targets reception on mobile devices whilst keeping a high-bandwidth, more delicately modulated channel for your main service intended for delivery to the home.

Not unlike the existing technologies used by satellite and cable providers such as Sky Q in the UK, an internet connection can be used to deliver user-specific adverts, an important monetisation option needed to keep in step with the streaming services ATSC 3.0 can work in tandem with. Madeleine explains that ATSC has created APIs for apps to query TV-specific functions, such as whether the TV is on or off, but these are the only ways in which app development for ATSC 3.0 differs from other web-based app development.

Finishing up the conversation, the panel discusses the similarities and differences to 5G.

Watch now!
Speakers

Madeleine Noland
President,
Advanced Television Systems Committee Inc.
Todd Achilles
CEO,
Evoca TV
Jim DeChant
VP Technology,
News-Press & Gazette Broadcasting
Sassan Pejhan
VP of Emerging Technologies,
ATEME
Moderator: Nadine Krefetz
Contributing Editor,
Streaming Media Magazine

Video: CMAF with ByteRange – A Unified & Efficient Solution for Low Latency Streaming

Apple’s LL-HLS protocol is the most recent technology offering to deliver low-latency streams of just 2 or 3 seconds to the viewer. Before that, CMAF, which is built on MPEG DASH, also enabled low-latency streaming. This panel with Ateme, Akamai and THEOplayer asks how both work and what their differences are, and also maps out a way to deliver both at once, covering the topic from the perspectives of the encoder manufacturer, the CDN and the player client.

We start with ATEME’s Mickaël Raulet who outlines CMAF, starting with its inception in 2016 with Microsoft and Apple. CMAF was published in 2018 and most recently received detailed guidelines for low-latency best practice in 2020 from the DASH Industry Forum. He outlines that the idea of CMAF is to build on DASH to find a single way of delivering both DASH and HLS using one set of media. The idea here is to minimise hits on the cache as well as storage. Harnessing the ISO BMFF, CMAF adds the ability to break segments into fragments, opening up the promise of low-latency delivery.


Mickaël discusses the methods of getting hold of these short fragments. If you store the fragments separately, then you double your storage as 4 fragments make up a whole segment. So it’s better to have all the fragments written as a segment. We see that Byterange requests are the way forward whereby the client asks the server to start delivering a file from a certain number of bytes into the file. We can even request this ahead of time, using a preload hint, so that the server can push this data when it’s ready.
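
As a sketch of how such a byterange request might be built, the snippet below constructs the HTTP `Range` header a player could send to pull one fragment out of a stored segment. The function name and byte offsets are illustrative, not from any spec; an open-ended range (no length) is the form that pairs naturally with a preload hint, where the server pushes bytes as they become ready.

```python
def byterange_header(offset, length=None):
    """Build an HTTP Range header value addressing a CMAF fragment
    stored inside a larger segment file.

    offset: byte position of the fragment within the segment.
    length: fragment size in bytes; None produces an open-ended
            range ("give me everything from offset onwards"),
            useful with a preload hint so the server can keep
            pushing data as it is written.
    """
    if length is None:
        return f"bytes={offset}-"
    # HTTP ranges are inclusive, hence the -1 on the end position.
    return f"bytes={offset}-{offset + length - 1}"
```

For example, a player that knows a fragment starts 250,000 bytes into the segment and is 250,000 bytes long would send `Range: bytes=250000-499999`.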

Next we hear from Akamai’s Will Law who examines how Apple’s LL-HLS protocol can work within the CDN to provide either CMAF or LL-HLS from the same media files. He uses the example of a 4-second segment with four one-second parts. A standard-latency player would want to download the whole 4-second segment whereas an LL-HLS player would want the parts. DASH has similar requirements, so Will focusses on how to bring all of these requirements down into the minimum set of files needed, which he calls a ‘common cache footprint’, using CMAF.

He shows how byterange requests work, how to structure them and explains that, to help with bandwidth estimation, the server will wait until the whole of the byterange is available before it sends any data, thus allowing the client to download at wire speed. Moreover, a single request can deliver the rest of the segment, meaning 7 requests get collapsed into 1 or 2, which is an important saving for CDNs working at scale. It is possible to use longer GOPs for a 4-second video clip than for 1-second parts, but for this technique to work, it’s important to maintain the same structure within the large 4-second clip as in the 1-second parts.
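
A hedged sketch of this request-collapsing idea: fetch the first part on its own to start playback quickly, then issue a single open-ended range for the remainder of the segment. The helper name and part sizes are invented for illustration; the real mechanics live in the player and CDN.

```python
def collapse_requests(part_sizes):
    """Collapse one-request-per-part fetching into at most two
    Range requests: one for the first part, one open-ended range
    for everything that follows in the same segment."""
    if not part_sizes:
        return []
    first = f"bytes=0-{part_sizes[0] - 1}"
    if len(part_sizes) == 1:
        return [first]
    # Everything after the first part arrives via one open-ended
    # range, delivered chunk by chunk as the encoder produces it.
    rest = f"bytes={part_sizes[0]}-"
    return [first, rest]
```

With four 500,000-byte parts this turns four (or, with playlists, seven) requests into two, which is exactly the kind of saving that matters at CDN scale.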

THEOplayer’s Pieter-Jan Speelmans takes the floor next, explaining his view from the player end of the chain. He discusses support for LL-HLS across different platforms such as Android, Android TV, Roku etc. and concludes that there is, perhaps surprisingly, fairly wide support for Apple’s LL-HLS protocol. Pieter-Jan spends some time building on Will’s discussion about reducing request numbers: in browsers, CORS checks can cause extra requests to be needed when using byterange requests. For implementing ABR, it’s important to understand how close you are to the available bandwidth. Pieter-Jan says that you shouldn’t only use the download time to determine throughput, but also metadata from the player, to get as exact an estimate as possible. We also hear about dealing with subtitles, which can need to be on screen longer than the duration of any of the parts or even of the segment. These need to be adapted so that they are shown repeatedly and each chunk contains the correct information. This can lead to flashing on re-display so, as with many things in modern players, it needs to be carefully and intentionally dealt with to ensure the correct user experience.
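
One way to picture why raw download time misleads: in a low-latency stream the connection spends much of its life idle, waiting for the next part to exist, so dividing bytes by wall-clock time badly understates link capacity. The sketch below is an assumption-laden illustration (not THEOplayer’s actual algorithm) of an EWMA throughput estimate fed only with “busy” time, i.e. the time bytes were actually arriving.

```python
def update_bandwidth_estimate(prev_estimate, bytes_received, busy_seconds, alpha=0.2):
    """Exponentially weighted moving average of throughput, in bits/s.

    busy_seconds must cover only the time data was actually flowing
    (the burst), not the idle time spent waiting for the encoder to
    finish the next part -- including idle time would drag the
    estimate far below the real link capacity.
    """
    sample = (bytes_received * 8) / busy_seconds  # bits per second
    if prev_estimate is None:
        return sample
    # Blend the new sample with history to smooth out noisy bursts.
    return alpha * sample + (1 - alpha) * prev_estimate
```

In practice the “busy” intervals would come from player metadata or transport-level timing, which is the extra information Pieter-Jan argues for.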

The last part of the video is a Q&A which covers:

  • Use of HTTP/2 and QUIC/HTTP/3
  • Dynamic Ad Insertion for low latency
  • The importance of playlist blocking
  • Player synchronisation with playback rate adjustment
  • Player analytics
  • DRM insertion problems at low latency

Watch now!
Speakers

Will Law
Chief Architect, Edge Technology Group,
Akamai
Mickaël Raulet
CTO,
ATEME
Pieter-Jan Speelmans
CTO & Founder,
THEOplayer
Video: Meeting the Multi-Platform, Multi-Device Challenge

OTT has changed over the last decade, going from a technical marvel to a massive market in its own right with significant reach and technical complexity. There are now many ways to ‘go to market’ and get your content in front of your viewers. Managing the strategy, the preparation & delivery of content, as well as the player ecosystem, is a big challenge under discussion by this Streaming Media panel of experts: Ian Nock from Fairmile West, Remi Beaudouin from Ateme, Pluto TV’s Tom Schultz and Jeff Allen from ShortsTV.

Introduced by moderator Ben Schwarz, Jeff launches straight into a much-needed list of definitions. Video on demand, VOD, is well understood, but its subgenres are simultaneously similar and important to differentiate. AVOD means advertising-funded, SVOD is subscription-funded and TVOD, not mentioned in the video, is transactional VOD, otherwise called Pay TV. As Jeff shows next, if you have an SVOD channel on someone else’s platform such as Amazon Prime, your strategy may be different, so calling this out separately is useful. A new model has appeared called FAST, which stands for ‘Free Ad-Supported TV’: a linear service that is streamed with dynamic ad insertion. To be clear, this is not the same as AVOD, since AVOD implies choosing each and every show you want to watch; FAST simulates the feel of a traditional linear TV channel. Lastly, Jeff calls out the usefulness and uniqueness of the social platforms, which are rarely a major source of income for larger companies but can form an important part in curating a following and leading viewers to your services.


Jeff finishes up by explaining some of the differences in strategy for launching in these different ways. For instance, for a traditional linear channel you would want to make sure you have a large amount of new material, but for an ad-supported channel on another platform you may be much more likely to hold back content. FAST channels are typically more experimental and niche-branded. Jeff looks at real examples from the History Channel, MTV and AMC before walking through the thinking for his own fictional service.

Next up is Ian Nock who is Chair of the Ultra HD Forum’s interoperability working group, looking at how to launch a service with next-generation features such as HDR, UHD or high frame rates. He outlines the importance of identifying your customers because by doing that, you can understand the likely device population in your market, their average network performance and the prevalence of software versions. These are all big factors in understanding how you might be able to deliver your content and the technologies you can choose from to do so. For UHD, codec choice is an important part of delivery as well as the display format such as HDR10, HDR10+ etc. Ian also talks about needing a ‘content factory’ to seamlessly transcode assets into and out of next-generation formats, remembering that for each UHD/HDR viewer, you’re still likely to have 10 who need SDR. Ian finishes off by discussing the delivery of higher frame rates and the importance of next-generation audio.
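
To illustrate the kind of decision a ‘content factory’ automates, here is a hypothetical capability lookup. The dictionary keys and the three tiers are invented for the example; a real system would consult a device database and a far richer capability model.

```python
def choose_format(device):
    """Pick a delivery format from a device-capability dict.

    Keys ('hevc', 'uhd', 'hdr10') are illustrative placeholders for
    whatever a real device database exposes. The fallback tier is
    AVC/SDR, reflecting that most viewers still need SDR.
    """
    if device.get("hevc") and device.get("uhd") and device.get("hdr10"):
        return ("HEVC", "UHD", "HDR10")
    if device.get("hevc"):
        # Device can decode HEVC but not display next-gen formats.
        return ("HEVC", "HD", "SDR")
    return ("AVC", "HD", "SDR")
```

The point of the sketch is the shape of the decision, not the specific tiers: knowing your device population up front tells you which branches of this function will actually be exercised in your market.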

Wrapping up the video is Ateme’s Remi, raising discussion points on the continuing need for balance between active and passive TV, the lack of customisation of TV services, increasing sensitivities on the part of both the customer and streaming providers around sharing analytics, and the need to find a way to make streaming more environmentally friendly. Lastly, Tom talks about how Pluto TV is a service which is very much based on data and, though privacy is upheld as very important, decisions are very quantitative. He’s seen that, over the past year, usage patterns have changed, for instance the move from mobiles to second screens (i.e. tablets). Delivering DRM to many different platforms is a challenge, but he’s focused on ensuring there is zero friction for customers. Since it’s an AVOD service, it’s vitally important to use the analytics to identify problems, to ensure channel changes are fast and to have end-to-end playback traceability.

Watch now!
Speakers

Tom Schultz
Director of Engineering – Native Apps,
Pluto TV
Ian Nock
Founder & Principal Consultant,
Fairmile West
Jeff Allen
President,
ShortsTV
Remi Beaudouin
Chief Strategy Officer,
ATEME
Moderator: Ben Schwarz
CTO,
Innovation Consulting

Video: Examining the OTT Technology Stack

This video looks at the whole streaming stack, asking what’s now, what trends are coming to the fore and how are things going to be done better in the future? Whatever part of the stack you’re optimising, it’s vital to have a way to measure the QoE (Quality of Experience) of the viewer. In most workflows, there is a lot of work done to implement redundancy so that the viewer sees no impact despite problems happening upstream.

The Streaming Video Alliance’s Jason Thibeault digs deeper with Harmonic’s Thierry Fautier, Brenton Ough from Touchstream, SSIMWAVE’s Hojatollah Yeganeh and Damien Lucas from Ateme.

Talking about codecs, Thierry makes the point that only 7% of devices can currently support AV1 and, with 10 billion devices in the world supporting AVC, he sees a lot of benefit in continuing to optimise AVC rather than waiting for VVC support to be commonplace. When asked to identify trends in the marketplace, moving to the cloud was identified as a big influence, driving not only the ability to scale but also the functions themselves. Gone are the days, Brenton says, when vendors ‘lift and shift’ into the cloud. Rather, the products are becoming cloud-native, which is a vital step to enable functions and products that take full advantage of the cloud, such as being able to swap the order of functions in a workflow. Just-in-time packaging is cited as one example.

Examining the OTT Technology Stack from Streaming Video Alliance on Vimeo.

Other changes are that server-side ad insertion (SSAI) is a lot better in the cloud and sub-partitioning of viewers, where you deliver different ads to different people, is more practical. Real-time access to CDN data, allowing near-immediate feedback into your streaming process, is also a game-changer that is increasingly available.

Open Caching is discussed by the panel as a vital step forward and one of many areas where standardisation is desperately needed. ISPs are fed up, we hear, with each service bringing its own caching box; it’s time that ISPs took a cloud-based approach to their infrastructure and enabled multi-use servers, potentially containerised, to ease this ‘bring your own box’ mentality and to take back control of their internal infrastructure.

HDR gets a brief mention in light of the Euro soccer championships currently on air and the Japan Olympics soon to be. Thierry says 38% of Euro viewership is over OTT and HDR is increasingly common, though SDR is still in the majority. HDR is more complex than just upping the resolution and requires much more care over the screen on which it’s watched. This makes adopting HDR more difficult, which may be one reason that adoption is not yet higher.

The discussion ends with a Q&A after talking about uses for ‘edge’ processing, which the panel agrees is a really important part of cloud delivery. Processing API requests at the edge, doing SSAI and applying content blackouts are examples of where the lower-latency response of edge compute works really well in the workflow.

Watch now!
Speakers

Thierry Fautier
VP Video Strategy,
Harmonic Inc.
Damien Lucas
CTO,
Ateme
Hojatollah Yeganeh
Research Team Lead,
SSIMWAVE
Brenton Ough
CEO & Co-Founder,
Touchstream
Moderator: Jason Thibeault
Executive Director,
Streaming Video Alliance