Report: The importance of benchmarking clouds

Our library of 1,700 research reports is available only to our subscribers, but we occasionally release one for our wider audience to benefit from. This is one such report. If you would like access to our entire library, please subscribe here. Subscribers will have access to our 2017 editorial calendar, archived reports, and video coverage from our 2016 and 2017 events.
The importance of benchmarking clouds by Paul Miller:
For most businesses, the debate about whether to embrace the cloud is over. It is now a question of tactics — how, when, and what kind? Cloud computing increasingly forms an integral part of enterprise IT strategy, but the wide variation in enterprise requirements ensures plenty of scope for very different cloud services to coexist.
Today’s enterprise cloud deployments will typically be hybridized, with applications and workloads running in a mix of different cloud environments. The rationale for those deployment decisions is based on a number of different considerations, including geography, certification, service level agreements, price, and performance.
To read the full report, click here.

Google’s Spanner: A database that knows what time it is

Google, which is notoriously secretive about technology advances, has opened up the vault and spit out a research paper on its Spanner database. And like other Google innovations, this one is hot. It’s a database that scales to millions of machines and trillions of rows.
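The "knows what time it is" in the headline refers to the paper's central idea, which Google calls TrueTime: instead of a single clock reading, the system exposes an interval of uncertainty around the true time, and a transaction waits out that interval before its commit timestamp becomes visible, so timestamp order matches real-time order. A minimal sketch of that idea follows; the names (`tt_now`, `tt_after`, `commit`) and the uncertainty bound are illustrative assumptions, not Google's actual API.

```python
import time

# Assumed clock-uncertainty bound (the paper reports values on the
# order of a few milliseconds; 7 ms here is purely illustrative).
EPSILON = 0.007


def tt_now():
    """Return (earliest, latest) bounds on the true current time."""
    t = time.time()
    return (t - EPSILON, t + EPSILON)


def tt_after(t):
    """True once t is guaranteed to be in the past."""
    earliest, _ = tt_now()
    return earliest > t


def commit():
    """Pick a commit timestamp, then wait out the uncertainty
    ("commit wait") before making it visible, so that timestamp
    order agrees with real-time order across machines."""
    _, latest = tt_now()
    ts = latest
    while not tt_after(ts):
        time.sleep(EPSILON / 10)
    return ts


ts1 = commit()
ts2 = commit()
assert ts1 < ts2  # a later commit always gets a strictly later timestamp
```

The commit-wait loop is what lets independent machines agree on ordering without talking to each other: by the time a timestamp is revealed, every correctly synchronized clock in the system already reads past it.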

Rapid Progress vs. Sustainable Growth: Finding Balance in the Cloud

Last week’s Open Compute project announcement sparked lots of conversation, much of which still echoes around the server and data center communities. At first glance, a tiny team inside Facebook appears to have upset the data center status quo with the project, and done so in a remarkably short period of time (according to its website, the project was started just a little over a year ago). But what does that mean for far slower initiatives, such as those developing the industry standards and specifications I wrote about last week?
Speaking in a video, Facebook Hardware Design Director Frank Frankovsky said: “Eighteen months [is] the normal length of time to do a custom power supply project,” but in that same amount of time, the team at Facebook specified, designed, procured and built a server, then deployed the design throughout a data center.
By comparison, National Institute of Standards and Technology (NIST) cloud executive Dawn Leaf last week announced that the U.S. government agency expects to publish only a first draft of its Cloud Computing Technology Roadmap within the current fiscal year. And the IEEE Standards Association‘s Cloud Computing working groups will move at a similar pace, according to their IEEE chair, David Bernstein. Two new working groups were announced earlier this month, and members are currently being sought; Bernstein estimates that hundreds of individuals and organizations may receive a seat at the table. Objectives will only be defined in each group once its membership is agreed, and Bernstein currently expects a “detailed definition of outputs” later this year. The first concrete deliverables are expected to appear in 2012.
All this raises the following question: When small teams consistently progress from concept to real-world execution faster than traditional, standards-making groups can even deliver early drafts of discussion documents, should the consensus-based process of building standards change? Amazon, after all, didn’t form an industry-spanning working group to design the solution that became Amazon Web Services (AWS); the company just went and did it. Because the solution was useful and affordable, it in effect became a de facto standard for others (such as Eucalyptus) to emulate and interoperate with. Google researchers didn’t seek buy-in for BigTable, and Doug Cutting showed no inclination to workshop the design of Hadoop outside his employer at the time, Yahoo. Even at NASA, where one might expect the culture to embrace a wide-ranging process of consensus building, an internal requirement for cloud computing instead led to the procurement of development services from Anso Labs. In each case, internal solutions to internal problems ended up being useful enough for others to take, learn from and build upon.
Bernstein welcomes company-specific efforts like Open Compute; but while the project is “great work,” he stresses that “it is not a standard.” “How,” he asks, “could I procure one? How could I require one [in an Invitation to Tender]?” And it’s here, perhaps, that the slow-but-steady process of agreeing standards still has an important part to play. Individuals and small teams in Facebook, Google, Amazon and their equivalents might have the creative spark and drive to create remarkable innovations, but wider deployment — especially the mission-critical type in governments, regulated industries and large enterprises — requires more than the momentary enthusiasm of a bright developer or the fleeting alignment of corporate interest with a particular development project. Real, lasting deployment still seems to require the standards-making process and its painstaking attempt to ensure that every eventuality is considered, every conflict resolved, every doubt clarified, every use case documented.
Open Compute is just one example of technological innovation, generously open-sourced and shared with the wider community. The standards-making process follows more slowly behind, and will eventually capture the essence of these innovations in a form that others can trust, depend upon and refer to for years to come — long after the Open Compute team at Facebook has dispersed in search of new challenges and opportunities.

Question of the week

Do you wait for formal standards before deploying new technologies in your business?