Do you have any idea what your incessant status updates require Facebook to do on the back end? The social network supports 100 million photo uploads each day, along with as many as 18,000 comments.
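To put that scale in perspective, here is a quick back-of-the-envelope calculation of what 100 million uploads a day works out to as a sustained average rate (the daily total is from the article; the rest is arithmetic):

```python
# Average upload rate implied by 100 million photo uploads per day.
UPLOADS_PER_DAY = 100_000_000
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

avg_uploads_per_second = UPLOADS_PER_DAY / SECONDS_PER_DAY
print(f"{avg_uploads_per_second:,.0f} uploads/second on average")
# Real traffic is bursty, so peak load sits well above this average.
```

That is roughly 1,157 uploads every second, around the clock, before counting comments, videos, email, or chat.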
Those photos and status updates, not to mention videos, emails, and chats, prompted the social network to spend the last two years developing a specialized data center and server design that it shared today. The company wrote in its press release that in 2009 it realized it needed its own data center, where it could experiment with cheaper servers, more energy-efficient designs, and other engineering tweaks that would help it contain costs and reduce the environmental impact of its operations:
"Back in early 2009, we came to a crossroads for our infrastructure. At that point, we were leasing space in data centers designed for general purpose computing — meaning we were getting average efficiency at average cost. Facebook was growing rapidly — the site had 150 million active users at the beginning of 2009, a number that increased to more than 350 million by the end of that year.

"We didn't have hundreds of millions of dollars to throw at the problem. We experimented with optimizing our existing leased facilities, but concluded that we wanted to build a data center of our own."
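One way to make "average efficiency at average cost" concrete is power usage effectiveness (PUE), the standard data-center efficiency metric: total facility power divided by the power actually delivered to the IT equipment, with 1.0 as the unreachable ideal. The numbers below are illustrative assumptions for a leased facility versus a purpose-built one, not figures from Facebook:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT power (1.0 is ideal)."""
    return total_facility_kw / it_equipment_kw

# Hypothetical figures for illustration only.
leased_pue = pue(total_facility_kw=1900.0, it_equipment_kw=1000.0)  # generic leased space
custom_pue = pue(total_facility_kw=1100.0, it_equipment_kw=1000.0)  # optimized custom build

# For the same 1,000 kW of servers, overhead power (cooling, power
# conversion, lighting) differs sharply between the two designs.
overhead_saved_kw = (leased_pue - custom_pue) * 1000.0
print(f"leased PUE {leased_pue:.2f} vs custom PUE {custom_pue:.2f}")
print(f"overhead saved: {overhead_saved_kw:.0f} kW for the same IT load")
```

Under these assumed figures, the custom facility spends 800 kW less on overhead for the same server load, which is exactly the kind of operating-cost and environmental win a purpose-built data center is after.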
So this is what webscale looks like, and because Facebook is sharing its innovations with the world, other companies can take advantage of lower building and operating costs for their own webscale data centers, and eventually improve upon them.