How the core of the internet has changed from data to content

A fundamental shift from data to content has changed the infrastructure of the internet, and the change isn't over yet. That's the view of Craig Labovitz, cofounder of DeepField, speaking on Wednesday at GigaOM's Structure:Europe conference. And if anyone should know about the "plumbing" of the internet's core, it's Labovitz: he has studied it for more than 15 years.

Through 2008, Labovitz says, the "core" of the internet meant 10 to 15 companies handling the long-haul backbones. These companies bore most of the cost of the core at that time, but they also provided most of the value and delivered the majority of internet data. That started to change in 2006, however, thanks to an explosion of content. "By 2010 Google went from 1 percent of traffic to 6 percent due to the YouTube purchase," he noted, and that growth forced changes to the core: Content shifted to data centers, which required massive investments.

What was the impact? In 2009, about 150 companies accounted for 50 percent or more of internet traffic. By 2013, just 50 companies delivered more than half of all internet traffic, and they did so using a smaller number of data centers and content delivery networks (CDNs). There's now an incredible density of data coming and going through the core. Need an example? Netflix is it, said Labovitz, accounting for 35 percent (and growing) of U.S. web traffic.

Sure, there's plenty of data still racing through the pipes of the internet's plumbing. But a specific type of data, content, is what's changing how those pipes are arranged, as content providers look to establish a more localized presence closer to your home network.

Check out the rest of our Structure:Europe 2013 coverage here.