When Comcast and Netflix signed a peering agreement last month that lets the two networks connect directly to each other, it also meant that Netflix would be paying Comcast to deliver its online video. From Comcast's perspective, the deal helps pay for the network upgrades it believes are necessary to handle online video delivery at Netflix's scale.
For others it represents a questionable abuse of monopoly power by Comcast. But outside of how much Netflix might be paying, there’s a fundamental question that this fight brought to the fore, namely: how will we deliver online video in the future? Whether it’s concerns over peering, or HBO Go crashing during the finale of True Detective, the problems of streaming video are becoming more important to consumers as they pay for over-the-top content (like Netflix) and yet, still experience sub-par streams.
One solution to this challenge is transparent caching: operators buy software that manages and caches popular content inside their networks automatically, without anyone having to decide manually what gets cached. A group of content providers, network gear makers and ISPs is evaluating transparent caching as part of an unannounced standards group seeking to improve online video delivery overall.
Video is king
This is a bigger issue than many might imagine, even though most television isn't consumed over broadband networks. Netflix alone represents about a third of web traffic during prime time, yet only about a third of U.S. homes count themselves as customers (it has 33.4 million U.S. subscribers). Add to that the fact that most consumers still get their television via a pay TV subscription, and it becomes clear that if networks are already struggling to deliver video today, the future will not be kind to cord cutters or to companies delivering IPTV.
Or at least, that's what ISPs want us to think. ISPs and the vendors that sell to them, such as Cisco and Sandvine, spend a lot of time discussing the threat that online video poses to their networks. Cynics might argue that this fear is manufactured as a way to implement bandwidth caps and to justify raising the price of bandwidth (perhaps to help offset the loss of video as part of the beloved triple-play package), but online video does represent a huge change in how TV is delivered.
Getting television on demand requires a one-to-one stream, as opposed to broadcast or old cable TV platforms where the transmission was one-to-many. The old way was far more efficient, but it also meant that people had to watch what was on when it was on. Television executives are slowly coming to the realization that linear television just isn’t going to cut it for their subscribers anymore. That’s why Cisco predicts that video will comprise about 80 to 90 percent of all online traffic by 2017.
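The efficiency gap between the two delivery models is easy to quantify. Here is a back-of-the-envelope sketch — the bitrate and viewer counts are illustrative assumptions, not figures from this article:

```python
# Rough comparison of one-to-many (broadcast) versus one-to-one
# (on-demand unicast) video delivery. All numbers are illustrative.

STREAM_MBPS = 5          # assumed bitrate of a single HD stream
VIEWERS = 1_000_000      # assumed concurrent viewers of one show

def broadcast_mbps(stream_mbps: int) -> int:
    """One copy of the stream serves every viewer tuned to the channel."""
    return stream_mbps

def unicast_mbps(stream_mbps: int, viewers: int) -> int:
    """On-demand delivery sends a separate copy to each viewer."""
    return stream_mbps * viewers

if __name__ == "__main__":
    print(f"broadcast total: {broadcast_mbps(STREAM_MBPS)} Mbps")
    print(f"unicast total:   {unicast_mbps(STREAM_MBPS, VIEWERS):,} Mbps")
```

Under these toy numbers, serving a million on-demand viewers costs a million times the aggregate bandwidth of broadcasting the same show, which is why the shift away from linear TV alarms network operators.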
So if video is king, can cache keep up?
Given the challenge of providing one-to-one streams over current network resources, and the conflict of interest between ISPs that sell pay TV services to consumers and the over-the-top providers that compete with those services, what can be done? For more than a decade, content delivery networks have offered caching services, storing popular web content on servers in the ISP network or near end users, but even that model is breaking down.
Big content companies don't want to pay a middleman to deliver their content when they are delivering so much of it. That's one reason for the Netflix and Comcast showdown. But vendors from Cisco to startups like PeerApp and Qwilt are experimenting with an alternative called transparent caching. Unlike a traditional cache, where the content lives outside the operator's network and is managed through a manual process of purging and refreshing at set intervals, a transparent cache is deployed inside the ISP's network and uses software to decide what to cache without human intervention.
Transparent caching relies on software that tracks videos as they move over an ISP's pipes and caches copies of popular content, so a user request doesn't have to travel all the way back across the network: chopping the distance a packet travels cuts down on cost and latency. Caching lowers costs for operators and can also cut costs for content providers, because they no longer need as large a contract with middle-mile companies such as Level 3 or XO Communications.
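The core loop of a transparent cache — watch requests go by, count which objects are popular, and start serving the hot ones from local storage — can be sketched as a toy model. This is a minimal illustration of the idea only; real products from vendors like PeerApp, Qwilt, or Cisco are far more sophisticated, and the class and parameter names here are hypothetical:

```python
from collections import Counter, OrderedDict

class TransparentCache:
    """Toy transparent cache: an object is cached locally once it has
    been requested often enough, and the least recently used entry is
    evicted when the cache is full."""

    def __init__(self, hot_threshold: int = 3, capacity: int = 100):
        self.hot_threshold = hot_threshold   # requests before an object is "hot"
        self.capacity = capacity             # max number of cached objects
        self.requests = Counter()            # per-URL popularity tracking
        self.store = OrderedDict()           # url -> content, in LRU order

    def fetch(self, url, origin_fetch):
        """Serve from the local cache on a hit; otherwise go back to the
        origin, and cache the object once it has proven popular."""
        self.requests[url] += 1
        if url in self.store:
            self.store.move_to_end(url)      # mark as most recently used
            return self.store[url], "HIT"
        content = origin_fetch(url)          # traverse the network to the origin
        if self.requests[url] >= self.hot_threshold:
            if len(self.store) >= self.capacity:
                self.store.popitem(last=False)   # evict least recently used
            self.store[url] = content
        return content, "MISS"
```

With `hot_threshold=2`, the first two requests for a video miss and go back to the origin, but the second one seeds the cache, so every subsequent request is served from inside the ISP's network — which is exactly the effect that cuts cost and latency.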
Dan Rayburn has published an excellent explainer of transparent caching, and PeerApp offers a simple diagram of the technology.
Companies like PeerApp are offering ISPs software that lets them build transparent caching into their networks. Cisco also has a product, as does Qwilt, a company that has signed deals with Mediacom and a few other large ISPs. Each provider differs in where its caching software sits and how it handles content. Alon Maor, the CEO of Qwilt, explained that his startup and others are working on a standard for transparent caching with the IEEE that could make the technology an open alternative to existing caches.
I've discussed this idea with ISPs and they are interested, as are some content providers. Yet the problem of delivering online video over the internet is not one that transparent caching alone will solve. Sources tell me that HBO, Viacom, Comcast, Cisco, Limelight and others are trying to create a standards group devoted to online video delivery, with working groups focused on a number of different areas. So far the effort is secretive, and no one wants to go on the record to discuss it, but the problems it aims to tackle could include digital rights management, network neutrality and technical challenges.
Transparent caching is one solution the group is considering, and the goal is to trial and demonstrate some form of transparent caching standard in the second half of this year. It's unclear whether anything will come of it, especially given that the two largest over-the-top providers in the country, Netflix and Google, are so far not involved in the discussions.
But given the challenges of delivering video, something may have to give. And while direct peering arrangements are one solution, those content providers still have to send their bits over the network to their own caches or to the end user — a set of streams that will only increase the load on the network as time goes on. It’s clear that having more content stored at the edge is the way to go — be it in Netflix or Google-specific caching servers or via some kind of transparent caching.
With these challenges becoming evident outside of the networking community, it's quite possible that, after years of debate and fear-mongering about the threat video poses to the network, we might actually see something done in 2014.