Video: Edge Compute and the CDN

Requests to servers are returned in only tens of milliseconds, which is hardly any time to wait. But those round trips quickly add up to users waiting seconds for their player to find out what video it wants, get it and finally start showing it. We all know that time is money when it comes to people ‘pressing play’, so reducing this startup time matters.

Anime streaming service Crunchyroll set out to reduce its startup time. Michael Dale, VP of Engineering there, sits down with Heather Chamberlin Mellish from AWS to describe how they used AWS to optimise the communications needed to establish a streaming session.

Named Katana, the project looked at the 12+ requests between third parties and the player itself, all of which were needed to start the session. Advertising companies need to be consulted, streaming manifest files need to reference chunks from multiple CDNs, SSAI and metrics were handled by third-party vendors, and the service is protected with DRM. These are just some of the factors behind the many round trips needed before playback could begin.

This talk provides an overview and a little bit of a ‘behind the scenes’ of a blog post which also covers this project.

Key to success was deploying on AWS Lambda@Edge, a service which allows you to run code within AWS’s CloudFront. If your code is written in Python or JavaScript, this allows it to run at the edge server closest to the user. For Crunchyroll’s global audience this is particularly useful, as it avoids having to set up infrastructure in every one of the AWS regions while still removing much of the round-trip time. Michael explains that, although Lambda is often viewed as an ephemeral service, when it’s not in use it can be suspended and resumed later, allowing it to maintain state for a player.

Michael explains the ways in which Katana has achieved success. Many of the third-party services have been brought into Lambda@Edge and AWS. DRM and advertising are still third-party, but doing most of the work at the edge, and pre-emptively returning information such as manifests, has removed many requests. The video breaks down their use of GraphQL and how multi-CDN and SSAI workflows have been implemented.
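One way GraphQL helps collapse round trips is that a single query can ask for everything the player needs to start a session. The sketch below is illustrative only: the schema and field names are invented, not Crunchyroll’s actual API.

```javascript
// One GraphQL query replaces several REST round trips: the player asks
// for the manifest, DRM licence location and ad schedule in a single
// request. Field names are hypothetical.
const startSessionQuery = `
  query StartSession($mediaId: ID!) {
    playback(mediaId: $mediaId) {
      manifestUrl      # multi-CDN manifest, chosen server-side
      drmLicenseUrl    # DRM stays third-party, but is resolved here
      adBreaks { offsetSeconds durationSeconds }   # SSAI schedule
    }
  }
`;

// Build the single HTTP request a player would send to start playback.
function buildStartSessionRequest(mediaId) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      query: startSessionQuery,
      variables: { mediaId },
    }),
  };
}
```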

Watch now!

Michael Dale
VP Engineering, Crunchyroll
Heather Chamberlin Mellish
Principal Edge Go To Market Specialist, Amazon Web Services

Video: Layer 4 in the CDN

Caching is a critical element of streaming video delivery infrastructure, but with the proliferation of streaming services, managing caching is complex and problematic. Open Caching is an initiative by the Streaming Video Alliance to bring this under control, giving ISPs and service providers a standard way to operate.

By caching objects as close to the viewer as possible, you reduce round-trip times, which lowers latency and can improve playback. More importantly, moving the point at which content is distributed closer to the customer allows you to reduce your bandwidth costs and create a more efficient delivery chain.
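A toy sketch of the idea, assuming a simple LRU cache at the edge: a hit serves the object locally, and the round trip back to origin only happens on a miss. This is illustrative only, not how any production open cache is implemented.

```javascript
// Minimal LRU edge cache: objects served from here skip the trip back
// to origin entirely, which is where the latency and bandwidth savings
// come from.
class EdgeCache {
  constructor(maxEntries) {
    this.maxEntries = maxEntries;
    this.entries = new Map(); // Map preserves insertion order, giving easy LRU
  }

  get(key) {
    if (!this.entries.has(key)) return undefined;
    const value = this.entries.get(key);
    // Re-insert to mark this entry as most recently used.
    this.entries.delete(key);
    this.entries.set(key, value);
    return value;
  }

  set(key, value) {
    if (this.entries.has(key)) {
      this.entries.delete(key);
    } else if (this.entries.size >= this.maxEntries) {
      // Evict the least-recently-used entry (first key in the Map).
      this.entries.delete(this.entries.keys().next().value);
    }
    this.entries.set(key, value);
  }
}

// Check the cache first; only fall back to origin on a miss.
function fetchSegment(cache, url, fetchFromOrigin) {
  const cached = cache.get(url);
  if (cached !== undefined) return { body: cached, fromCache: true };
  const body = fetchFromOrigin(url);
  cache.set(url, body);
  return { body, fromCache: false };
}
```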

This video sees Disney Streaming Services, Viasat and StackPath discussing Open Caching with Jason Thibeault, Executive Director of the Streaming Video Alliance. Eric Klein from Disney explains that one driver for Open Caching comes from content producers, who find it hard to scale delivery of content in a consistent manner across many different networks. Standardising the interfaces will help remove this barrier to scale. Alongside the drive from content producers are the needs of the network operators, who are interested in moving caching on to their networks, which reduces back-and-forth traffic and can help cope with peaks.

Dan Newman from Viasat builds on these points, looking at the edge storage project: an extension of the original Open Caching concept which moves caching to the edge of the network. The idea stretches to putting caching directly into the home. One use of this, he explains, is caching UHD content which would otherwise be too big to deliver over lower-bandwidth links.

Josh Chesarek from StackPath says that their interest in the Open Caching initiative is to get consistency and interoperability between CDNs. The Open Caching group is looking at creating standard APIs for capacity, configuration and so on. Eric also underlines the interest in interoperability shown by the close work they are doing with the IETF to find better standards on which to base their work.

Looking at the test results, the average bitrate increases by 10% when using Open Caching, along with a 20-40% improvement in rebuffer ratio, which shows viewers are seeing an improved experience. Viasat have combined multicast ABR with Open Caching. This shows there’s certainly promise behind the ongoing work. The panel finishes by looking at what’s next for the project and for CDN optimisation.

Watch now!

Eric Klein
Director, CDN Technology, Disney Streaming Services
Dan Newman
Product Manager, Viasat
Josh Chesarek
VP, Sales Engineering & Support, StackPath
Jason Thibeault
Executive Director, Streaming Video Alliance

Video: State of the Streaming Market

Streaming Media commissioned an extra mid-year update to their ‘State of the Streaming Market’ survey in order to understand how the landscape has changed due to COVID-19. With a survey already carried out once this year, this special Autumn edition captures the rapid changes we’ve been seeing.

Tim Siglin talks us through the results of the survey ahead of the full report being published. Since the last round of questions, the split between live and OTT in the businesses that responded has swung around 5% in favour of live content. The survey indicates that 65% of streaming infrastructure will be software-defined within 24 months, with some respondents adopting a hybrid approach initially.

Tim also unveils a striking graphic showing that 56% of respondents see the internet as their company’s main way of transporting video over IP, dwarfing the other answers. The biggest of those is CDN at 25%, covering delivery to a CDN by dedicated links or internet links within the cloud.

Zixi is part of the RIST Forum and the SRT Alliance, which indicates they understand the importance of multi-protocol workflows. We see the streaming industry is of the same opinion, with more than two-thirds expecting to be using multiple protocols over the next twelve months.

Looking at the benefits of moving to the cloud, flexibility comes first, with cost savings third and supporting a virtualised workforce fifth. Tim mentions surprise at seeing a remote workforce only at number five, but suggests that without the pandemic it would not have entered the top five at all. This seems quite reasonable: whatever your motivation for starting to use the cloud, flexibility is nearly always going to be one of the key benefits.

Reliability was ranked number two among the benefits of moving to the cloud. The reasons people chose it were fairly evenly split, with the exception of uptime at 39%; Quality of Service, Quality of Experience and cost all came in around 20%.

Tim Siglin and Gordon Brooks discuss how 5G will impact the industry. Gordon gives a business-to-business example of how they are currently helping a broadcaster contribute into the cloud and then deliver to an end-point, all with low latency. He sees these links as some of the first to ‘go 5G’. In terms of the survey, people see ‘in venue delivery’ as half as likely to be useful for video streaming as distribution to the consumer or general distribution. Tim finishes by saying that although 5G could well be impactful for streaming, we need to see how much of the hype the operators actually live up to before planning too many projects around it.

Watch now!

Tim Siglin
Founding Executive Director
Gordon Brooks
Moderator: Eric Schumacher-Rasmussen
Editor, Streaming Media

Video: Low Latency Live Streaming At Scale

Low latency can be a differentiator for a live streaming service, or just a way to ensure you’re not beaten to the punch by social media or broadcast TV. Either way, it’s seen as increasingly important for live streaming to be punctual, breaking from the past where latencies of thirty to sixty seconds were not uncommon. As the industry has matured and connectivity has gained enough capacity for video, simply getting motion on the screen isn’t enough anymore.

Steve Heffernan from MUX takes us through the thinking on how we can deliver low-latency video both into the cloud and out to viewers. He starts by talking about the use cases for sub-second latency – anything with interaction or conversations – and how that differs from low-latency streaming, which is one-to-many, potentially very large-scale distribution. If you’re on a video call with ten people, you need sub-second latency or the conversation will suffer. But when distributing to thousands or millions of people, the extra rebuffering risk of operating sub-second isn’t worth it, and usually three seconds is perfectly fine.

Steve talks through the low-latency delivery chain, starting with the camera and encoder then looking at the contribution protocol. RTMP is still often the only option, but increasingly it’s possible to use WebRTC or SRT, the latter usually being the best for streaming contribution. Once the video has hit the streaming infrastructure, be that in the cloud or otherwise, it’s time to look at how to build the manifest and send the video out. Steve talks us through the options of Low-Latency HLS (LHLS), CMAF DASH and Apple’s LL-HLS. Do note that since the talk, Apple has removed the requirement for HTTP/2 push.
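To give a flavour of Apple’s LL-HLS approach, here is a minimal sketch of generating a media playlist that advertises the in-progress segment as sub-second parts. The tags (EXT-X-PART, EXT-X-PART-INF, EXT-X-SERVER-CONTROL) come from Apple’s LL-HLS specification; the URIs, durations and tuning values are made up for illustration.

```javascript
// Build a simplified LL-HLS media playlist. Completed segments are
// listed with #EXTINF as usual; the segment currently being produced
// is exposed as short parts (#EXT-X-PART) so players can fetch video
// without waiting for the whole segment to finish.
function buildLlHlsPlaylist(completeSegments, parts) {
  const lines = [
    '#EXTM3U',
    '#EXT-X-VERSION:9',
    '#EXT-X-TARGETDURATION:4',
    // PART-TARGET announces the nominal part duration.
    '#EXT-X-PART-INF:PART-TARGET=1.0',
    // CAN-BLOCK-RELOAD=YES enables blocking playlist reloads;
    // PART-HOLD-BACK tells players how far from live to position.
    '#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=3.0',
  ];
  for (const seg of completeSegments) {
    lines.push(`#EXTINF:${seg.duration.toFixed(3)},`, seg.uri);
  }
  for (const part of parts) {
    lines.push(`#EXT-X-PART:DURATION=${part.duration.toFixed(3)},URI="${part.uri}"`);
  }
  return lines.join('\n') + '\n';
}
```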

The talk finishes with Steve looking at the players. If you don’t get the player’s logic right, you can start off much farther behind than necessary. This is becoming less of a problem now, as players are starting to ‘bend time’ by speeding up and slowing down to bring their latency within a certain target range. But this only underlines the importance of the quality of your player implementation.
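The ‘bend time’ behaviour can be sketched as a simple rate controller: compare measured latency with the target and nudge the playback rate accordingly. The thresholds and rates below are illustrative only; real players tune these values carefully and ramp them gradually.

```javascript
// Decide a playback rate from the gap between measured latency and the
// target. Small deviations are left alone; larger ones are corrected
// by playing slightly fast or slow rather than stalling or seeking.
function choosePlaybackRate(currentLatencySeconds, targetLatencySeconds) {
  const error = currentLatencySeconds - targetLatencySeconds;
  if (error > 0.5) return 1.1;  // too far behind live: speed up slightly
  if (error < -0.5) return 0.9; // too close to the live edge: slow down
  return 1.0;                   // within the target window: play normally
}

// In a browser you would apply this to the media element on each tick,
// e.g. video.playbackRate = choosePlaybackRate(latency, target);
```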

Watch now!

Steve Heffernan
Founder & Head of Product, MUX
Creator of video.js