Video: CDNs: Delivering a Seamless and Engaging Viewing Experience

This video brings together broadcasters, telcos and CDNs to talk about the challenges of delivering a perfect streaming experience to large audiences. Eric Klein from Disney+ addresses the issues alongside Fastly’s Gonzalo de la Vega, Jorge Hernandez from Telefonica and Adriaan Bloem from Shahid, with Robert Ambrose moderating.

Eric starts by talking from the perspective of Disney+. Robert asks whether scaling up quickly enough to meet Disney+’s extreme growth has been a challenge. Eric replies that scale is built by having multiple routes to market using multiple CDNs, so the main challenge is making sure they can quickly move into each new market as it is announced. Before launching, they do a lot of research to work out which bitrates are likely to be streamed, and on which devices, for that market, and will consider offering ABR ladders to match. They work with ISPs and CDNs using Open Caching. Eric has spoken previously about Open Caching, a specification from the Streaming Video Alliance to standardise the API between CDNs and ISPs. Disney+ currently uses 7-8 different providers and never relies on only one method to get content to the CDN. Eric and his team have built their own equipment to manage cache fill.

Adriaan serves the MENA market and whilst the Gulf is fairly easy to address, north Africa is very difficult as internet bandwidths are low and telcos don’t peer except in Marseille. Adriaan sees streaming in Europe and North America as ‘a commodity’ since, relatively speaking, it’s so much easier than in north Africa. Shahid has had to build its own CDN to reach its markets but, because it doesn’t compete with the telcos the way commercial CDNs do, it finds it relatively easy to strike the deals the CDN needs. Shahid has a very large library, so getting assets in the right place can be difficult. They see an irony here: their AVOD service is very popular, so its most-watched content gets many hits and is well cached. Their SVOD content has a very long tail, meaning that despite viewers paying for the service, they risk getting a worse experience because most of that content isn’t cached.

Jorge presents his view as both a streaming provider, Movistar, and a telco, Telefonica, which serves Spain and South America. With over 100 PoPs, Telefonica delivers a lot of streaming over IPTV infrastructure as well as over the open internet. They have their own CDN, TCDN, which carries most of their traffic, bursting to commercial CDNs when necessary. Telefonica also supports Open Caching.

Eric explains that the benefit of Open Caching is that, because certain markets are hard to reach, you need a variety of approaches to get to them. This means a lot of different companies are involved, but to have stability in your platform you need to interface with them all in the same way. With Open Caching, one purge command can be sent to everyone at once. For Adriaan, this is “almost like a dream”: he has six different dashboards and is living through the antithesis of Open Caching. He says it can be very difficult to track the different failovers on the CDNs and react.
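The idea of one command reaching every delivery partner can be sketched in a few lines. This is purely illustrative, assuming a hypothetical common interface in the spirit of Open Caching; the class and method names here are not the actual Streaming Video Alliance API.

```python
# Illustrative sketch: a single purge fanned out to several delivery
# partners through one common interface, rather than six dashboards.

class CachingProvider:
    """One CDN or ISP cache exposing a standardised purge endpoint."""
    def __init__(self, name):
        self.name = name
        self.purged = []

    def purge(self, content_id):
        # In reality this would be an authenticated HTTP call to the
        # provider's control interface; here we just record it.
        self.purged.append(content_id)
        return True

def purge_everywhere(providers, content_id):
    """Send the same purge command to every provider at once."""
    return {p.name: p.purge(content_id) for p in providers}

providers = [CachingProvider(n) for n in ("cdn-a", "cdn-b", "isp-cache")]
results = purge_everywhere(providers, "/vod/episode-42/master.m3u8")
print(results)  # every provider reports success
```

The point is the uniform interface: the publisher writes one integration, not one per partner.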

Gonzalo points out how far CDNs like Fastly have come. Recently they had 48 hours’ notice to enable resources for one million concurrent viewers, which is the size of the entire Fastly CDN only some years ago. Fastly are happy to be part of customers’ multi-CDN solutions and, when their customers do live video, recommend that they use more than one CDN simply for protection against major problems. Thinking about live video, Eric says that everything at Disney+ is designed ‘live first’ because if it works for live, it will work for VoD.

The panel finishes by answering questions from the audience.

Watch now!
Free registration required


Eric Klein
Director, Media Distribution, CDN Technology, Disney+
Jorge Hernandez
Head of CDN Development and Deployment, Telefonica
Adriaan Bloem
Head of Infrastructure, Shahid
Gonzalo de la Vega
VP Strategic Projects, Fastly
Robert Ambrose
Co-Founder and Research Director,
Caretta Research

Video: Transforming the Distribution and Economics of Internet Video

Replacing CDNs in streaming would need a fundamental change in the way we store and access video on the internet, but this is just what Eluvio’s technology offers along with in-built authentication, authorisation and DRM. There’s a lot to unpack about this distributed ‘content fabric’ built on an Ethereum-protocol blockchain.

Fortunately, Eluvio co-founder Michelle Munson is here to explain how this decentralised technology improves on the status quo and to show us what it’s being used for. Today’s streaming technology is based on preparing, packaging, transcoding and pushing data out through CDNs to viewers at home. Whilst this works, it doesn’t necessarily deliver consistent, low delay and, as we saw when Netflix and Facebook reduced their streaming bitrates at the beginning of the pandemic, it can be quite a burden on networks.

This content fabric, Michelle explains, is a different approach in which video is stored natively over the internet, creating a ‘software substrate’. The result doesn’t use traditional transcoding services, CDNs or databases. Rather, we end up with a decentralised data distribution and storage protocol delivering just-in-time packaging. The content fabric is split into four layers: one deals with metadata, another contains code which controls the transformation and delivery of media, a third ‘contract’ layer controls access and proves content, and a final layer holds the media itself. This contract layer is based on the Ethereum technology which runs the cryptocurrency of the same name. The fabric is a ledger, with content versioned within the ledger history.

Michelle points out that with blockchain contracts baked into all the media data, there is inherent access control at all parts of the network. Viewers only need an Ethereum-style ‘ticket’ to watch content directly; their access is view-only and, whilst it passes through the data and code layers, there is no extra infrastructure to build on top of your streaming infrastructure. Each person can even have their own individually-watermarked version, as delivered in Eluvio’s work with MGM on the online premiere of the recent Bill and Ted film.

Eluvio currently has a group of globally-deployed hubs in internet exchange sites which operate the fabric and contain media shards and blobs of code which can operate on the media to provide just-in-time delivery, with the ability to create slices and overlays inherent in the delivery mechanism. When a player wants access to video, it issues a request with its authorisation information. This hits the fabric, which responds to drive the output. Because of the layer of code, the inputs and outputs of the system are industry standard, with manipulation done internally.

Before finishing by talking about the technology’s use by MGM and other customers, Michelle summarises the capabilities: it simplifies workflows and can deliver a consistently low, global time to first byte, with VoD and live workflows interchangeable. Whilst Michelle asserts that previous distribution protocols have failed at scale, Eluvio’s fabric can scale without the significant burden of file I/O.

Watch now!

Michelle Munson
CEO and Founder, Eluvio

Video: The Future of Online Video

There are few people who should build their own CDN, contends Steve Miller-Jones from Limelight Networks. If you want to send a parcel, you use a parcel delivery service. So if you want to stream video, use a content delivery network tuned for video. This video looks at the benefits of using CDNs.

John Porterfield welcomes Steve to the YouTube channel JP’sChalkTalks, starting with a basic outline of CDNs. Steve explains that the aim of the CDN is to re-deliver the same content as many times as possible by itself, without having to go back to a central store, or even back to the publisher, to get the video chunk that’s been requested. If your player is a few seconds behind that of someone else who lives in the same geography, then the CDN should be able to deliver you those same chunks almost instantly from somewhere geographically close to you.
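The behaviour Steve describes can be shown with a toy edge cache: the first request for a chunk goes back to origin, and every later request for the same chunk from the same region is served locally. This is an illustrative sketch, not any particular CDN’s implementation.

```python
# A toy edge cache: only a cache miss triggers a fetch from origin.

class EdgeCache:
    def __init__(self, origin):
        self.origin = origin       # callable that fetches a chunk from origin
        self.store = {}            # chunk URL -> cached bytes
        self.origin_fetches = 0    # how often we had to go upstream

    def get(self, chunk_url):
        if chunk_url not in self.store:
            self.store[chunk_url] = self.origin(chunk_url)
            self.origin_fetches += 1
        return self.store[chunk_url]

origin = lambda url: f"bytes-of-{url}"
edge = EdgeCache(origin)

# Two nearby viewers, a few seconds apart, request the same chunk:
edge.get("/live/seg_1001.m4s")
edge.get("/live/seg_1001.m4s")
print(edge.origin_fetches)  # 1 — the second viewer never touched origin
```

For a popular live stream, thousands of viewers in one region collapse into a single origin fetch per chunk.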

Steve explains that, in the Limelight State of Online Video 2020 Annual Report, rebuffering remains the main frustration with streaming services, holding at approximately 44% as a global average for the last three years. Contrary to popular belief, the older generation is more tolerant of rebuffering than younger viewers.

As well as maintaining a steady feed, low latency remains important. Limelight is able to deliver CMAF down to around 3 seconds’ latency, or WebRTC with sub-second latency. To go along with this sub-second video streaming, Limelight also offers sub-second data sharing between players, which Steve explains is an important feature allowing services to develop interactivity, quizzes, community engagement and many other business cases.

Lastly, Steve outlines the importance of edge computing as a future growth area for CDNs. The first iteration of cloud computing succeeded by taking computing into central locations and away from individual businesses. This worked well for many for financial reasons, because it freed organisations from managing some aspects of their own infrastructure and enabled services to scale. However, the logic of what happened next was always run in that one central place. If you’re in Australia and the cloud location is in the EU, that’s a long wait for the result of each decision that needs to be made. Edge computing allows small pieces of logic to live in the part of a CDN closest to you. The majority of a service’s infrastructure may well remain in the US, but some of the CDN, be it CloudFront, Limelight or another, will be in Australia itself, so pushing as much of your service as you can to the edge results in significant improvements in speed and latency.
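The latency argument can be made concrete with a rough sketch: run each decision at the PoP with the lowest round-trip time to the viewer rather than always at one central region. The PoP names and latency figures below are made up for illustration.

```python
# Hypothetical round-trip times (ms) from a viewer in Australia to
# candidate locations where a piece of request logic could run.
POPS = {"us-east": 210, "eu-central": 290, "sydney": 12}

def nearest_pop(pops):
    """Choose the location with the lowest round-trip time."""
    return min(pops, key=pops.get)

central = POPS["eu-central"]        # every decision made centrally in the EU
edge = POPS[nearest_pop(POPS)]      # the same decision made at the closest edge
print(nearest_pop(POPS), central - edge)  # 'sydney', 278 ms saved per round trip
```

Multiply that saving by the number of round trips a session makes and the case for moving logic to the edge is clear.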

Watch now!

Steve Miller-Jones
VP Strategy & Industry,
Limelight Networks
John Porterfield
Technology Evangelist,
JP’sChalkTalks YouTube Channel

Video: CDNs: Building a Better Video Experience

With European CDN spend estimated to reach $7bn by 2023, an increase of $1.2bn in only three years, it’s clear there is no relenting in the march towards IP. In fact, that’s a guiding principle of the BBC’s transmission strategy, as we hear from this panel which brings together three broadcasters, beIN, Globo and the BBC, to discuss how they’re using CDNs at the moment and their priorities for the future.

Carlos Octavio introduces Globo’s massive scale of programming for Brazil and Latin America. Producing 26,000 hours of content annually, they aim to differentiate themselves as much by the technology of their offerings as by the content, and this thirst for differentiation drives their CDN strategy. Brazil is a massive country, so covering the footprint is hard. Octavio explains that they have created their own CDN to support Globo Play, based on 4 tiers running from their two super PoPs in Rio and São Paulo down to edge caches. Octavio shows that they are able to achieve the same response times as the major CDN companies in the region. For overflow capacity, Globo uses a multi-CDN approach.

Bhavesh Patel talks about the sports and news output of beIN, both bursty in nature. Whilst traffic for sporting events can be forecast, with news this is often not possible. This, plus the wide variability of customers’ home bandwidth, drives the choice of which CDNs to partner with. Over the next twelve months, Bhavesh explains, beIN’s focus will move to bringing down latency on their system as a whole, not on a service-by-service level. They also expect to continue modifying their ABR ladders to follow viewers as they shift from second screens to 60-inch TVs.

The BBC’s approach to distribution is explained by Paul Tweedy. Whilst the BBC is still well known as a linear, public broadcaster, it has been using online distribution for 25 years and continues to innovate in that space. Two important aspects of their strategy are being on as many devices as practical and ensuring the quality of the online experience meets, or is comparable to, the linear services. The BBC has been using multiple CDNs for many years now; what changes is the balance and what they use CDNs for. They cover a lot of sport, explains Paul, which leads to short-term scaling difficulties, but long-term scaling difficulties are equally on his mind due to what the BBC calls the ‘glide path to IP’. This is the acknowledgement that, at some point, it won’t be financially viable to run transmitters and IP will be the wiser way to use the licence fee on which the BBC depends. Doing this, clearly, will demand IP delivery at many times the current scale. Yesterday’s article on multicast ABR is one way in which this may be mitigated and fits into a multi-CDN strategy.

Looking at today’s streaming services, Paul and his colleagues aim to get analytics from every player on every device wherever possible. Big data techniques are used to understand these logs along with server-side, client-to-edge and edge-to-origin logs. This information, along with sports schedules, feeds capacity planning, though many news events are much less easy to plan for. It’s these unplanned, high-peak events which drive the BBC’s build-up of internal monitoring tools to help them understand what is working well under load and what’s starting to feel the strain, so they can take action to ensure quality is maintained even through these times of intense interest. The BBC manage their capacity with their own CDN, called BIDI, which provides for baseline needs and allows an easier-to-forecast budget. Multiple third-party CDNs are then key to providing the variable and peak capacity needed.
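That capacity model, an in-house CDN carrying a forecastable baseline with anything above it spilling over to third parties, can be sketched as follows. The function name, partner names and numbers are all hypothetical, chosen only to illustrate the fill-then-overflow logic.

```python
# Sketch of baseline-plus-burst capacity allocation: fill the in-house
# CDN first, then spread the overflow across third-party partners.

def allocate(demand_gbps, own_capacity_gbps, partners):
    """Return how demand is split between the in-house CDN and partners."""
    own = min(demand_gbps, own_capacity_gbps)
    overflow = demand_gbps - own
    allocation = {"in-house": own}
    for name, capacity in partners.items():
        take = min(overflow, capacity)
        allocation[name] = take
        overflow -= take
    return allocation

# A peak event exceeding the in-house baseline by 300 Gbps:
print(allocate(900, 600, {"cdn-a": 200, "cdn-b": 400}))
# {'in-house': 600, 'cdn-a': 200, 'cdn-b': 100}
```

The baseline stays predictable and budgetable; only the overflow is exposed to third-party pricing.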

As we head into the Q&A, Limelight’s Steve Miller-Jones outlines the company’s strengths, including their focus on adding abilities on top of a ‘typical’ CDN: for instance, running applications on the CDN, which is particularly useful as part of edge compute, and their ability to run WebRTC at scale, which not many CDNs are built to do. The Q&A sees the broadcasters outlining what they particularly look for in a CDN and how they leverage AI. Globo anticipate using AI to help predict traffic demand, beIN see it providing automated highlights, whilst the BBC see it enabling easier access to their deep archives.

Watch now!
Free registration

Carlos Octavio
Head of Architecture and Analytics, Globo
Bhavesh Patel
Global Digital Director, beIN
Paul Tweedy
Lead Architect, Online Technology Group,
BBC Design + Engineering
Steve Miller-Jones
Vice President of Product Strategy,
Limelight Networks