Webinar: Low-Latency CMAF vs. WebRTC

CMAF brings low-latency streams of less than 4 seconds into the realm of possibility; WebRTC pushes that below a second. But which is the right technology for you?

Date: June 12th 2019 Time: 11am PST / 2pm EST / 19:00 BST

CMAF represents an evolution of the tried and tested technologies HLS and DASH. With massive scalability and built upon the well-worn tenets of HTTP, Netflix and a whole industry were born and are thriving on these still-evolving technologies. The push to reduce latency further and further has resulted in CMAF, which can deliver streams with five to ten times lower latency.

WebRTC is a Google-backed streaming protocol in the traditional sense of streaming: it pushes a stream to you, as opposed to the HLS-style methods of making small files available for download and reassembly into a stream. One benefit of this is extremely low latency of 1 second or less. Used widely by Google Hangouts and Facebook Messenger, WebRTC is increasingly an option for more broadcast-style streaming services, from live sports and music to gaming and gambling.

Both have advantages and drawbacks, so Wowza’s Barry Owen and Anne Balistreri are here to help navigate the ins and outs of both technologies, plus answer your questions.

Register now!

Speakers

Barry Owen
VP, Professional Services,
Wowza
Anne Balistreri
Product Marketing Manager,
Wowza

Video: An Overview of the ISO Base Media File Format

ISO BMFF is a standardised MPEG media container developed from Apple’s QuickTime and is the basis for cutting-edge low-latency streaming as much as for tried and trusted MP4 video files. Here we look into why we have it, what it’s used for and how it works.

ISO BMFF provides a structure to place around timed media streams whilst accommodating the metadata we need for professional workflows. Key to its continued utility is its extensible nature, allowing additional capabilities, such as new codecs and metadata types, to be added as they are developed.
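That structure is built from nested “boxes”, each carrying its own size and four-character type, which is what makes the format so easy to extend: a parser can skip any box type it doesn’t recognise. A minimal sketch of reading the top-level boxes (simplified: it ignores the 64-bit `largesize` and `uuid` extensions the standard also defines):

```python
import struct

def parse_top_level_boxes(data: bytes):
    """Return (type, size) for each top-level ISO BMFF box in `data`.

    Each box starts with a 4-byte big-endian size followed by a
    4-byte ASCII type; the size includes the 8-byte header itself.
    """
    offset = 0
    boxes = []
    while offset + 8 <= len(data):
        size, box_type = struct.unpack_from(">I4s", data, offset)
        boxes.append((box_type.decode("ascii"), size))
        offset += size  # unknown box types are skipped the same way
    return boxes

# A minimal hand-built buffer: an 'ftyp' box (major brand + minor
# version) followed by an empty 'moov' box.
ftyp = struct.pack(">I4s", 16, b"ftyp") + b"isom" + struct.pack(">I", 512)
moov = struct.pack(">I4s", 8, b"moov")
print(parse_top_level_boxes(ftyp + moov))  # [('ftyp', 16), ('moov', 8)]
```

The same size-plus-type convention applies recursively inside container boxes such as `moov`, which is how new metadata can ride alongside old without breaking existing players.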

ATSC 3.0’s streaming mechanism MMT is based on ISO BMFF, as is the low-latency streaming format CMAF, which shows that despite being over 18 years old, the ISO BMFF container is still highly relevant.

Thomas Stockhammer is the Director of Technical Standards at Qualcomm. He explains the container format’s structure and origin before showing why it’s ideal for CMAF’s low-latency streaming use case, finishing with a look at immersive media in ISO BMFF.

Watch now!

Speaker

Thomas Stockhammer
Director Technical Standards,
Qualcomm

Video: Sub-Second Live Streaming: Changing How Online Audiences Experience Live Events

There are two main modern approaches to low-latency live streaming. One is CMAF, which uses fragmented MP4s to allow frame-by-frame delivery of chunks of data. Similar to HLS, this is becoming a common ‘next step’ for companies already using HLS. Keeping the chunk size down reduces latency, but it remains doubtful whether sub-second streaming is practical in real-world situations.

Steve Miller-Jones from Limelight explains the WebRTC solution to this problem. Being a protocol which streams from the source to the destination, it is capable of sub-second latency too, and seems a better fit. Limelight differentiates itself by offering a scalable WebRTC streaming service with Adaptive Bitrate (ABR). ABR is traditionally not available with WebRTC, and Steve Miller-Jones uses this as an example of where Limelight is helping the technology achieve its true potential.

Comparing and contrasting Limelight’s solution with HLS and CMAF, we can see the benefits of WebRTC and that it’s equally capable of supporting features like encryption, geoblocking and the like.

Ultimately, how much latency matters to you and how much scalability you require may be the biggest factors in deciding which way to go with your sub-second live streaming.

Watch now!

Speaker

Steve Miller-Jones
VP Product Strategy,
Limelight Networks

Video: Using CMAF to Cut Costs, Simplify Workflows & Reduce Latency

There are two ways to stream video online: either pushing from the server to the device, as with WebRTC, MPEG transport streams and similar technologies, or allowing the receiving device to request chunks of the stream, which is how the majority of internet streaming is done using HLS and similar formats.

Chunk-based streaming is generally seen as the more scalable of the two methods but suffers extra latency because the player buffers several chunks, each of which typically represents between 1 and 10 seconds of video.
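A back-of-the-envelope sketch makes the relationship clear. Assuming, purely for illustration, a player that buffers three chunks before starting playback (a common rule of thumb for HLS deployments), latency scales directly with chunk duration:

```python
def startup_latency(chunk_duration_s: float, buffered_chunks: int = 3) -> float:
    """Rough lower bound on glass-to-glass latency from buffering alone.

    Illustrative model only: real players also add encode, network and
    render delay, but the buffered-media term usually dominates.
    """
    return chunk_duration_s * buffered_chunks

print(startup_latency(6.0))   # traditional HLS with 6 s segments -> 18.0
print(startup_latency(0.5))   # 500 ms CMAF chunks -> 1.5
```

This is why shrinking the unit of delivery, rather than speeding up the network, is the lever that low-latency CMAF pulls.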

CMAF is one technology here to change that by allowing players to buffer less video. How does it achieve this? And, perhaps more importantly, can it really cut costs? Iraj Sodagar from NexTreams is here to explain in this talk from Streaming Media West 2018.

Iraj covers:

  • A brief history of CMAF (Common Media Application Format)
  • The core technologies (ISO BMFF, Codecs, captions etc.)
  • Media Data Object (Chunks, Fragments, Segments)
  • Different ways of video delivery
  • Switching Sets (for ABR)
  • Content Protection
  • CTA WAVE project
  • WAVE content specifications
  • Live Linear Content with WAVE & CMAF
  • Low-latency CMAF usage
  • HTTP 1.1 Chunked Transfer Encoding
  • MPEG DASH
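The last two items fit together: HTTP 1.1 Chunked Transfer Encoding is what lets a server begin sending a CMAF segment while the encoder is still producing it, since each chunk is framed with its own length and the total body size is never declared up front. A minimal sketch of that framing (the function name is illustrative; real origin servers do this at the HTTP layer):

```python
def chunked_encode(parts):
    """Frame byte strings as an HTTP/1.1 chunked transfer body.

    Each chunk is sent as '<hex length>\r\n<data>\r\n'; a zero-length
    chunk terminates the body. Because no Content-Length is needed,
    CMAF chunks can be pushed to the player as soon as they exist.
    """
    body = b""
    for part in parts:
        body += format(len(part), "x").encode("ascii") + b"\r\n" + part + b"\r\n"
    return body + b"0\r\n\r\n"  # zero-length chunk ends the stream

# e.g. a moof/mdat pair from a CMAF segment, sent as two HTTP chunks
print(chunked_encode([b"moof...", b"mdat..."]))
```

In a real deployment the player requests the segment as soon as it is advertised and receives these chunks progressively, which is how CMAF keeps the buffered-media latency down to fractions of a second per chunk.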

Watch now!

Speaker

Iraj Sodagar
Independent Consultant
Multimedia System Architect, NexTreams