The webscale effect: 6 big shifts in computing, courtesy of the world’s largest web companies

In less than a month, Gigaom will host its seventh annual Structure conference examining the future of computing. Looking back to the first event in 2008, it's striking how much has changed. The concepts of the day, such as software as a service, cloud computing and big data, have been validated and assimilated. The new discussions are around ever-evolving techniques and architectures for consuming those services, and how to deliver them at ever-greater scales.

We owe this fast maturation, in large part, to the rise of companies such as Google(s goog), Facebook(s fb), Amazon(s amzn) and Twitter(s twtr), which have grown immensely since then and have had to address the problems that come with that size — in terms of users, data, and new types of devices and networks. To get a sense of this evolution, watch this video from 2009, featuring a panel of operations executives from sites including Facebook, Google, Microsoft and, um, MySpace.


This year’s event will feature Urs Hölzle (Google), Jay Parikh (Facebook), Werner Vogels (Amazon), Raffi Krikorian (Twitter), and more talking about their latest challenges and how they’re solving them. But first, here’s a look at six innovations those companies and their peers have already inspired and that are now making their way into the mainstream.

Cloud architectures and APIs

Service-oriented architectures and APIs aren’t new ideas, but they’ve been infused with new blood thanks to web companies and cloud computing. In the name of automation, flexibility and scalability, web companies like Facebook and Twitter have mastered the art of building platforms composed of loosely coupled services. Startups and developers can access pretty much anything — from artificial intelligence to databases — as cloud services, and Netflix(s nflx) has shown what’s possible when you push Amazon Web Services to its limits.
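The loose coupling those companies mastered can be sketched in miniature. In the toy example below, a "feed" service talks to a "profile" service only through a JSON contract, never through its internals; the service names, fields and in-memory datastore are all hypothetical stand-ins, not any real company's API.

```python
import json

# Hypothetical sketch: two loosely coupled services that expose only a
# JSON contract to each other, hiding storage and implementation details.

def profile_service(request):
    """Returns user profile data; callers never see the datastore."""
    users = {"u1": {"name": "Ada"}}  # stand-in for a real database
    return json.dumps(users.get(request["user_id"], {}))

def feed_service(request):
    """Builds a feed by calling the profile service only via its API."""
    profile = json.loads(profile_service({"user_id": request["user_id"]}))
    return json.dumps({"greeting": f"Hello, {profile.get('name', 'guest')}"})

print(feed_service({"user_id": "u1"}))  # {"greeting": "Hello, Ada"}
```

Because the feed service depends only on the contract, the profile service can be rewritten, resharded or moved to another datastore without touching its consumers, which is the property that makes these platforms automatable and scalable.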


Amazon CTO Werner Vogels at Structure 2011. (c) Pinar Ozger

New data architectures

The big data discussion has moved quickly from one about scale and unstructured data to one about building multi-tiered data architectures that assume scale and unstructured data. Web companies developed stream-processing engines and NoSQL data stores that are now prevalent and able to take the load off of relational databases and batch engines such as MapReduce. Now, pretty much anybody willing to learn the technology can get the right latency for the right data rather than taking a one-size-fits-all approach.
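The streaming-versus-batch trade-off can be shown with a few lines of code. This is a deliberately tiny sketch, not any particular engine: the same made-up click events are counted once incrementally (the streaming style, which can answer queries after every event) and once over the whole dataset (the batch style, which is simpler but waits for all the data).

```python
from collections import Counter

# Hypothetical click events; in practice these would arrive on an
# unbounded queue (e.g. a message broker) rather than in a list.
events = [{"page": "home"}, {"page": "search"}, {"page": "home"}]

# Streaming style: update state one event at a time, low query latency.
live_counts = Counter()
for event in events:
    live_counts[event["page"]] += 1

# Batch style: recompute over the full dataset, higher latency.
batch_counts = Counter(e["page"] for e in events)

assert live_counts == batch_counts  # same answer, different latency profile
```

A multi-tiered architecture in the article's sense runs both: the streaming pass serves fresh approximate answers while the batch pass periodically rebuilds the authoritative view.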


Amazon Web Services’ Ryan Waite discussing stream processing at Structure Data 2014.

The programmable data center

Cloud computing and virtualization changed the way companies think about their servers, but web companies have taken that automation to a new level so they can manage many thousands of servers with relatively small teams. They’ve created software to turn entire data centers into big, easily deployable and re-deployable pools of compute and storage. The latest battleground has been in networking, where companies like Google helped spur along software-defined networking and the idea that switches can be dumb boxes controlled by software.
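The core pattern behind that automation is declaring a desired state as data and letting software converge reality toward it. The sketch below is a hypothetical, minimal reconciler; the role names and counts are invented for illustration, but the shape is what lets small teams manage enormous fleets.

```python
# Hypothetical sketch of the "programmable data center" idea: desired
# state is declared as data, and a reconciler computes the actions
# needed to make the running fleet match it.

desired = {"web": 3, "cache": 2}           # role -> server count we want
running = {"web": 1, "cache": 2, "db": 1}  # what is actually deployed

def reconcile(desired, running):
    """Return (action, role, count) tuples that converge running to desired."""
    actions = []
    for role, want in desired.items():
        have = running.get(role, 0)
        if want > have:
            actions.append(("start", role, want - have))
        elif want < have:
            actions.append(("stop", role, have - want))
    for role in running:
        if role not in desired:
            actions.append(("stop", role, running[role]))
    return actions

print(reconcile(desired, running))
# [('start', 'web', 2), ('stop', 'db', 1)]
```

Run in a loop against live inventory, the same logic re-deploys pools of compute after failures or demand shifts without a human deciding which machine does what.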


Dante Malagrinò (Embrane), Jonathan Heiliger (North Bridge Venture Partners / ex-Facebook) and Martin Casado (VMware / then-Nicira) talking SDN at Structure 2012. (c) Pinar Ozger

The clean data center

Driven more by a desire to cut costs than to please Greenpeace, web companies have shown what’s possible when data centers are built with energy efficiency as a top priority. Facebook and eBay(s ebay) have been particularly open about their data center strategies, sharing design plans and tools to measure everything from water usage to excess database queries in order to cut down on resources. Beyond building design, the web is also largely behind the push to cut consumption at the chip level with low-power ARM servers.
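The headline metric in these efficiency efforts is power usage effectiveness (PUE): total facility power divided by power delivered to the IT equipment itself, so 1.0 would mean zero overhead. The arithmetic below uses made-up kilowatt figures purely to illustrate the ratio.

```python
# Power usage effectiveness (PUE): total facility power / IT equipment
# power. The kW figures here are invented for illustration only.

def pue(total_facility_kw, it_equipment_kw):
    return total_facility_kw / it_equipment_kw

# A conventional facility where half the power goes to cooling and
# distribution overhead, vs. a highly optimized webscale facility.
print(pue(2000, 1000))  # 2.0
print(pue(1070, 1000))  # 1.07
```

Shaving the ratio from 2.0 toward the low 1.1s is exactly the kind of bottom-line win the open designs above are chasing, since the savings compound across every watt of server load.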


eBay data center VP Dean Nelson at Structure 2013. (c) Pinar Ozger

The open source explosion

The more tools web companies and startups build to deal with their unique scalability and performance issues, the more of those tools get released as open source. This is partly to improve the technologies, and partly to help recruit engineering talent that likes working on open-source projects. The results include popular databases, programming tools and even operating systems. In the past few years, the open-source movement has extended to hardware, with designs for servers, storage, switches and even CDN appliances.


Former Facebook VP Frank Frankovsky talking open source hardware at Structure:Europe 2013. (c) Anna Gordon/Gigaom

Programming for a mobile world

The rise of smartphones and tablets has changed more than UI design. Cutting-edge companies are figuring out ways to build apps that minimize data usage and balance computing loads across the network. They’re accounting for security concerns that arise when the network has been blown apart by the number of devices accessing it and the countless access points they use. And they’re realizing that the major market for all this might be billions of people in emerging markets where smartphones and mobile networks are the web.
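One widely used technique for minimizing mobile data usage is the HTTP conditional request: the client revalidates a cached response with an `If-None-Match` header, and the server answers `304 Not Modified` with no body when nothing has changed. The sketch below is a toy stand-in for a server, not a real framework; the feed payload is invented for the example.

```python
import hashlib

# Hedged sketch of HTTP conditional GET: resend the payload only when
# its ETag (here, a content hash) differs from the client's cached one.

def serve(resource, if_none_match=None):
    """Toy server handler: returns (status, etag, body)."""
    etag = hashlib.sha1(resource).hexdigest()
    if if_none_match == etag:
        return 304, etag, b""       # cache still valid: no body sent
    return 200, etag, resource      # full payload plus fresh ETag

body = b'{"feed": [1, 2, 3]}'
status1, etag, payload1 = serve(body)        # first fetch: full download
status2, _, payload2 = serve(body, etag)     # revalidation: headers only

assert (status1, status2) == (200, 304)
assert payload2 == b""
```

On a metered 2G or 3G connection in the emerging markets the article mentions, turning repeat fetches into empty 304 responses is often the difference between a usable app and an unaffordable one.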


Microsoft CEO Satya Nadella talking cloud, web and devices at Structure 2013. (c) Pinar Ozger