The challenges of shaping the future of the cloud

Enterprise IT departments struggle when it comes to selecting the best cloud solutions for their needs. All too often, those departments want to mix and match components from different vendors, but the current poor state of interoperability in the cloud market makes this challenging. The Open Data Center Alliance (ODCA) hopes to change that.

This week the consortium, which includes BMW, Lockheed Martin and AT&T, released a wide-ranging set of eight use cases that describe the components required in various enterprise cloud deployments. The aim? Make it easier to compare and contrast commercial solutions and increase the level of technical interoperability between clouds. The hope is that data and services can move from one cloud provider to another with minimal difficulty, and the consortium is calling upon members to proactively adopt these use cases in their internal planning and procurement processes.

The cases, available as separate documents, cover four broad themes:

  • Secure federation: Two documents offer a model (PDF) for a set of security requirements and propose a classification system (PDF) in which cloud products from virtual machines to network infrastructure can be categorized as achieving platinum, gold, silver or bronze levels of security protection.
  • Automation: The alliance proposes mechanisms for policy-based management (PDF) to control the use of network resources by different virtual machines, along with a model for standardizing the tools (PDF) used to control the virtual machines offered by different cloud providers.
  • Common management and policy: The alliance offers a document (PDF) that begins the potentially invaluable work of identifying some of the major regulators (such as the SEC in the U.S.) that affect different industries and geographies. The document does not yet go as far as mapping the regulations themselves and their implications, and that work needs to happen before the real value becomes apparent.
  • Transparency: Reporting and provision of metrics are increasingly important, both to enterprise customers and to the wider world. Three alliance use cases provide models for consistent reporting and comparison of carbon footprints (PDF), the cost and capability of individual IaaS offerings and the description of a cloud provider’s service catalog (PDF); a minimal sketch of how a buyer might compare such catalog entries appears just after this list.
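
To make that comparison concrete, here is a minimal, hypothetical sketch of how a buyer might record provider catalog entries annotated with ODCA-style bronze-to-platinum security levels alongside reported cost and carbon metrics, then shortlist the offerings that meet a required assurance level and budget. The field names, levels and figures are illustrative assumptions of mine, not the alliance’s published schema.

    # Hypothetical sketch only: field names, levels and figures are illustrative
    # assumptions, not the ODCA's published schema.
    from dataclasses import dataclass
    from enum import IntEnum


    class SecurityLevel(IntEnum):
        """Ordered assurance tiers, mirroring the bronze-to-platinum idea."""
        BRONZE = 1
        SILVER = 2
        GOLD = 3
        PLATINUM = 4


    @dataclass
    class CatalogEntry:
        """One provider's IaaS offering, as a buyer might record it."""
        provider: str
        offering: str
        security_level: SecurityLevel
        price_per_vm_hour: float      # reported cost metric
        kg_co2_per_vm_hour: float     # reported carbon metric


    def shortlist(entries, minimum_level, max_price):
        """Return offerings that meet the required tier and budget,
        cheapest and lowest-carbon first."""
        eligible = [e for e in entries
                    if e.security_level >= minimum_level
                    and e.price_per_vm_hour <= max_price]
        return sorted(eligible,
                      key=lambda e: (e.price_per_vm_hour, e.kg_co2_per_vm_hour))


    if __name__ == "__main__":
        catalog = [
            CatalogEntry("ProviderA", "std-vm", SecurityLevel.SILVER, 0.10, 0.30),
            CatalogEntry("ProviderB", "std-vm", SecurityLevel.GOLD, 0.12, 0.25),
            CatalogEntry("ProviderC", "std-vm", SecurityLevel.BRONZE, 0.08, 0.40),
        ]
        for entry in shortlist(catalog, SecurityLevel.GOLD, max_price=0.15):
            print(entry.provider, entry.security_level.name, entry.price_per_vm_hour)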

The alliance suggests that support for its work will generate billions in new business for the cloud computing industry and will save as much as $25 billion in customers’ deployment costs. Industry players such as EMC and Hitachi Data Systems are listed in the lowest tier of involvement as “adopter members,” though major cloud providers such as Amazon and Rackspace do not appear to have been involved in drafting or commenting upon these documents.

Nevertheless, the alliance rather aggressively “expects” the cloud-computing industry to respond to its documents within six months by providing new technical roadmaps and supporting new deployments. To what extent cloud providers will be willing or able to meet all of the alliance’s demands remains unclear; there’s been little public comment from companies such as HP, IBM and Rackspace, the very providers with which the alliance really needs to engage.

By seeking to describe mainstream use cases, the alliance has taken a reasonable approach to harmonizing the cloud computing market. However, many of these use cases remain weak and short on detail. The Regulatory Framework activity, for example, is currently little more than a list of regulators, and it’s a long way from being a use case that a vendor or buyer could actually use to make decisions.

The alliance is not the first to recognize the need for greater interoperability in the cloud market. NIST and the IEEE are active in this space, and the Distributed Management Task Force (DMTF) has a cloud management standards activity. And while there’s no denying that such attempts at standardization are a positive thing, since they help both buyers and sellers of cloud services, too many competing “standards” run the risk of amounting to no standard at all.

To combat this, the alliance should concentrate on enriching its use cases and then submitting them to the standards bodies and regulators already active in this space. These use cases have the potential to be a valuable addition to those activities; as they stand, though, they lack the detail to justify being used as sticks with which to beat “noncompliant” vendors.

Question of the week

What can the ODCA and others do to successfully standardize the cloud market?

Rapid Progress vs. Sustainable Growth: Finding Balance in the Cloud

Last week’s Open Compute project announcement sparked lots of conversation, much of which still echoes around the server and data center communities. At first glance, a tiny team inside Facebook appears to have upset the data center status quo with the project, and done so in a remarkably short period of time (according to its website, the project was started just a little over a year ago). But what does that mean for far slower initiatives, such as those developing the industry standards and specifications I wrote about last week?
Speaking in a video, Facebook Hardware Design Director Frank Frankovsky said: “Eighteen months [is] the normal length of time to do a custom power supply project,” but in that same amount of time, the team at Facebook specified, designed, procured and built a server, then deployed the design throughout a data center.
By comparison, National Institute of Standards and Technology (NIST) cloud executive Dawn Leaf announced last week that the U.S. government agency expects to publish only a first draft of its Cloud Computing Technology Roadmap within the current fiscal year. And the IEEE Standards Association’s cloud computing working groups will move at a similar pace, according to David Bernstein, who chairs the IEEE effort. Two new working groups were announced earlier this month, and members are currently being sought; Bernstein estimates that hundreds of individuals and organizations may receive a seat at the table. Objectives will be defined only once the groups’ membership is agreed, and Bernstein currently expects “detailed definition of outputs” later this year. The first concrete deliverables are expected to appear in 2012.
All this raises the following question: When small teams consistently progress from concept to real-world execution faster than traditional standards-making groups can even deliver early drafts of discussion documents, should the consensus-based process of building standards change? Amazon, after all, didn’t form an industry-spanning working group to design the solution that became Amazon Web Services (AWS); the company just went and did it. Because the solution was useful and affordable, it in effect became a de facto standard for others (such as Eucalyptus) to emulate and interoperate with. Google researchers didn’t seek buy-in for BigTable, and Doug Cutting showed no inclination to workshop the design of Hadoop outside his employer at the time, Yahoo. Even at NASA, where one might expect the culture to embrace a wide-ranging process of consensus building, an internal requirement for cloud computing instead led to the procurement of development services from Anso Labs. In each case, internal solutions to internal problems ended up being useful enough for others to take, learn from and build upon.
Bernstein welcomes company-specific efforts like Open Compute; but while the project “is great work,” he stresses that “it is not a standard.” “How,” he asks, “could I procure one? How could I require one [in an Invitation to Tender]?” And it’s here, perhaps, that the slow-but-steady process of agreeing on standards still has an important part to play. Individuals and small teams in Facebook, Google, Amazon and their equivalents might have the creative spark and drive to create remarkable innovations, but wider deployment — especially the mission-critical type in governments, regulated industries and large enterprises — requires more than the momentary enthusiasm of a bright developer or the fleeting alignment of corporate interest with a particular development project. Real, lasting deployment still seems to require the standards-making process and its painstaking attempt to ensure that every eventuality is considered, every conflict resolved, every doubt clarified, every use case documented.
Open Compute is just one example of technological innovation, generously open-sourced and shared with the wider community. The standards-making process follows more slowly behind, and will eventually capture the essence of these innovations in a form that others can trust, depend upon and refer to for years to come — long after the Open Compute team at Facebook has dispersed in search of new challenges and opportunities.

Question of the week

Do you wait for formal standards before deploying new technologies in your business?

Will Standardizing the Cloud Cause Clarity or Confusion?

The European Commission (EC), the Institute of Electrical & Electronics Engineers (IEEE) and the U.S. National Institute of Standards & Technology (NIST) have taken an interest in the cloud for some time. Recent announcements suggest that each wants to make clear its own view of the cloud’s potential and how it should be regulated. Each institution also has some interest in standardizing parts of the cloud. But will three sets of priorities and interests, all in play at the same time, leave us with clarity or confusion?
InfoWorld reports this week that the IEEE, which has developed standards for everything from high-speed data transfer to online learning, has chartered two new working groups to look more closely at the cloud. Working Group 2301 addresses existing profiles of cloud-computing standards, enabling “standards-based choices in areas such as application interfaces, portability interfaces, management interfaces, interoperability interfaces, file formats and operation conventions.” The related Working Group 2302 “defines topology, functions and governance for cloud-to-cloud interoperability and federation.” The latter group has more potential to get bogged down in politics, since its work goes beyond purely technological considerations to address issues like registration, trust and compliance.
NIST attracted positive attention way back in 2009 with its attempt to define a “working definition of cloud computing.” Now on Version 15 (Microsoft Word document), that definition has been joined by a program of collaborative activities with which interested agencies, organizations and companies can engage. A meeting in Gaithersburg, Md., later this week will review promising deliverables such as a reference architecture (PDF) for cloud computing and a taxonomy (PDF) that clearly describes the roles and relationships involved in providing and consuming cloud services. Such deliverables could be important in enabling everyone involved in the cloud market to unambiguously describe what they want to procure or provide, and why.
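As a rough illustration of why an agreed taxonomy matters, the sketch below models roles of the broad kind the NIST documents describe (consumer, provider, broker and so on) so that a procurement statement could name the parties involved unambiguously; the class and role names here are my own simplification, not NIST’s normative definitions.

    # Illustrative sketch only: these role names follow the broad shape of the
    # NIST cloud actors, but the modelling is a simplification, not NIST's taxonomy.
    from dataclasses import dataclass, field
    from enum import Enum


    class Role(Enum):
        CONSUMER = "cloud consumer"
        PROVIDER = "cloud provider"
        BROKER = "cloud broker"
        CARRIER = "cloud carrier"
        AUDITOR = "cloud auditor"


    @dataclass
    class Party:
        """An organization and the role it plays in a given arrangement."""
        name: str
        role: Role


    @dataclass
    class Arrangement:
        """Who consumes, who provides, and who sits in between."""
        consumer: Party
        provider: Party
        intermediaries: list = field(default_factory=list)

        def describe(self) -> str:
            chain = " -> ".join(p.name for p in
                                [self.consumer, *self.intermediaries, self.provider])
            return f"{chain} ({self.consumer.role.value} to {self.provider.role.value})"


    if __name__ == "__main__":
        deal = Arrangement(
            consumer=Party("Agency X", Role.CONSUMER),
            provider=Party("Provider Y", Role.PROVIDER),
            intermediaries=[Party("Broker Z", Role.BROKER)],
        )
        print(deal.describe())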
In Europe, meanwhile, the EC is moving toward publication of a European cloud-computing strategy. A meeting in Brussels scheduled for May 23 will be the culmination of an online consultation process that is due to begin later this month. Europe continues to emphasize the importance of privacy and data protection, sometimes seen — unfairly — as being at odds with cloud computing. European Commission Vice President Neelie Kroes this week reiterated 10 principles underpinning ICT’s role “at the heart of both economic and social progress,” the majority of which could be read as actively endorsing a cloud-based approach to enabling that progress.
Individually, each of the above initiatives is taking useful steps to clarify language and practice with respect to the cloud; each seeks to bring some stability to a market where definitions remain fluid. But together, they also face a pair of serious obstacles that could do them — and the cloud — harm:

  • Multiple efforts operating in roughly the same space with different agendas and priorities run the risk of contradicting one another and undermining market confidence in any of their findings. Only constant communication and cross-fertilization between groups can prevent this.
  • The cloud is mature enough for standardized terminology and a shared understanding of key concepts, such as those being prepared by NIST. All three of the above organizations’ efforts, though, must be wary of standardizing too much, too soon. Overly prescriptive measures constrain innovation and lock markets into ways of working that may not prove to be what they need or want. Amazon’s recent announcement of Dedicated Instances, for example, led some to wonder whether the company was still delivering a “real” cloud. This market is still evolving, and we can’t afford to be too dogmatic or prescriptive.

A large number of skilled individuals and organizations are investing significant effort in these three parallel initiatives. If we can ensure that realistic objectives are set, and that all three initiatives communicate openly and regularly with one another and with the wider community, we have a real opportunity to gain sound technical and conceptual foundations on which to build the next generation of the cloud. The alternative — interminable committee processes, power struggles between standardization efforts, and over-zealous attempts to regulate and control — is one that we must all strive to ensure does not come to pass.

Question of the week

Can we standardize cloud computing without stifling innovation and new ideas?

Today in Cleantech

Last week we delved into the world of standards for solar power systems — this week, it’s time for energy storage to get the standards treatment. The IEEE has formed a working group within its P2030 smart grid interoperability effort, aimed at giving makers of energy storage systems guidelines for how their systems should communicate and interact with the grid. As batteries become less expensive, they could be deployed in a smaller-scale, distributed fashion to help firm up local grids. At the same time, alternative storage systems such as flywheels or thermal energy storage (ice-making air conditioning, for example) are likely to be integrated with batteries, inverters and ultracapacitors in ways that could challenge traditional grid controls.

Today in Cleantech

Let’s call this smart grid wonk Friday. After more than a year of work on hundreds of smart grid standards, the National Institute of Standards and Technology has released five “foundational” sets of standards for federal and state regulators. But the five International Electrotechnical Commission (IEC) standards in question don’t touch such hot topics as whether Internet protocol (IP) will be required in smart meters, or whether ZigBee will be favored for home energy networks. Rather, they deal with big utility systems, including data exchange between different utility control stations, transmission and distribution systems and substation automation systems — as well as the critical cybersecurity question of how all those systems will interact. As for real-world application of the standards, the Federal Energy Regulatory Commission has started a rulemaking process on how it could encourage, or perhaps force, compliance by utilities and their smart grid vendors. Stay tuned for more details — though, given the speed at which these things usually unfold, don’t hold your breath.