Mo’ data mo’ problems: The future of the data center

The amount of data processed by companies big and small increases every day – and data centers have a hard time keeping up. Not only is scaling the physical infrastructure costly, it also consumes vast amounts of energy. A solution was discussed at Structure 08.

Structure 08: Making Money on the Stack

Cloud computing isn’t as nebulous as its name implies. Thanks to virtualization, one can separate the storage from the servers and the servers from the software—but it’s also about bandwidth. Much of the value lies in moving data from the hardware to the end user. To that end, Google has automated its network and is using structured metadata to track how much it costs the company to move a bit or byte from one geographic area to another, according to Vijay Gill, manager of engineering at Google.
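Gill didn’t describe Google’s internal system, but the idea of tagging traffic with per-route cost metadata can be sketched simply. The region names and rates below are invented for illustration:

```python
# Illustrative sketch only: track the cost of moving data between regions
# using a per-route rate table. All names and rates here are hypothetical,
# not Google's actual figures.
COST_PER_GB = {  # USD per gigabyte, invented for illustration
    ("us-east", "us-west"): 0.02,
    ("us-east", "eu-west"): 0.05,
    ("eu-west", "ap-south"): 0.08,
}

def transfer_cost(src: str, dst: str, gigabytes: float) -> float:
    """Look up the per-route rate and compute the cost of one transfer."""
    if src == dst:
        return 0.0  # intra-region moves assumed free in this sketch
    # assume rates are symmetric, so check both orderings of the route
    rate = COST_PER_GB.get((src, dst)) or COST_PER_GB.get((dst, src))
    if rate is None:
        raise KeyError(f"no rate recorded for route {src} -> {dst}")
    return round(rate * gigabytes, 4)

print(transfer_cost("us-east", "eu-west", 100))  # 100 GB across the Atlantic
```

With a table like this, each byte moved can be attributed a dollar cost, which is what makes cost-based pricing possible in the first place.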

That allows Google to charge users based not just on compute cycles but on the actual cost of moving the data around the world. Gill talked about establishing an auction model for pricing that would reflect those costs. The importance of bandwidth was also highlighted by Lane Patterson, chief technologist of Equinix, who said that a cloud provider that owns its own bandwidth might achieve a competitive advantage.
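Gill didn’t specify the auction mechanism, but one common design for pricing a scarce resource like bandwidth is a second-price (Vickrey-style) auction, where the highest bidder wins but pays the runner-up’s bid. A minimal sketch, with made-up bidders and prices:

```python
def second_price_auction(bids: dict) -> tuple:
    """bids maps bidder name -> offered price per unit of bandwidth.
    The highest bidder wins but pays the second-highest bid, which
    encourages bidders to reveal their true valuation."""
    if len(bids) < 2:
        raise ValueError("need at least two bids to clear an auction")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    clearing_price = ranked[1][1]  # runner-up's bid sets the price
    return winner, clearing_price

# Hypothetical bidders offering $/GB for a bandwidth slot:
print(second_price_auction({"A": 0.09, "B": 0.07, "C": 0.05}))
```

Here bidder A wins but pays B’s price of $0.07, so the clearing price tracks what the market would actually bear for moving the data.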

As cloud computing unfolds it won’t do so only in the U.S., said Dr. Jay Subrahmonia, director of advanced customer solutions for IBM, who pointed out that developing countries are adopting it because of the speed and flexibility cloud computing offers. Figuring out how to price and value that speed and flexibility is the next big step.