Updated: As talk of cloud adoption by enterprises becomes more commonplace, focus is shifting from justifying the cloud to identifying best practices and the highest-performing cloud providers. One area with significant activity is performance monitoring, as Geva Perry points out on his blog. A report published at the end of last year by IDC found that respondents ranked performance behind only security and availability as the biggest challenges for cloud adoption.
In order to give buyers visibility into the relative performance levels of different cloud providers, a number of groups have developed tools to measure and compare performance under different scenarios. While individual vendors have begun to provide their own monitoring dashboards (Salesforce (s crm) has its Trust Dashboard while Amazon (s amzn) has its CloudWatch dashboard, for example), buyers are increasingly looking for third-party tools that give a neutral insight into vendor performance.
Some interesting players in the space include:
Supporting Rackspace (s RAX), Amazon EC2, Linode, GoGrid, Slicehost, RimuHosting, and VPS.NET, CloudKick lets users control cloud infrastructure from multiple vendors through a single dashboard that provides both monitoring and management of an infrastructure setup.
Built upon the open-source Libcloud API, which CloudKick helped develop under the guidance of the Apache Software Foundation Incubator, CloudKick’s commercial services range from $99 to $599/month depending on the number of servers. CloudKick also offers customized packages for customers with larger or more specific needs.
Still in beta, CloudSleuth is a cloud performance visualization tool that runs a benchmark Java e-commerce application to measure response time across various cloud providers. It deploys an identical application to each provider and then measures performance from locations across the U.S. and internationally, returning metrics on both response time and availability.
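The core idea of this kind of measurement, repeatedly timing requests against an identical deployed application and rolling the samples up into latency and availability figures, can be sketched with Python's standard library. This is a generic illustration, not CloudSleuth's actual methodology; the function names and sample counts are placeholders.

```python
import time
import urllib.request
from statistics import mean

def probe(url, timeout=10):
    """Time one HTTP GET against a deployed test app; return (seconds, ok)."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()
            ok = 200 <= resp.status < 300
    except Exception:  # network errors, timeouts, HTTP errors all count as a failed probe
        ok = False
    return time.perf_counter() - start, ok

def summarize(samples):
    """From a list of (seconds, ok) probes, compute mean latency of the
    successful requests and availability as a percentage of all probes."""
    latencies = [t for t, ok in samples if ok]
    availability = 100.0 * sum(1 for _, ok in samples if ok) / len(samples)
    return (mean(latencies) if latencies else None), availability
```

Run `probe` on a schedule from several vantage points, one list of samples per provider, and `summarize` yields directly comparable numbers.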
Cloudstone spun out of a research project at UC Berkeley. More an open-source framework for testing cloud performance than a benchmarking service itself, it provides a selection of tools for generating various load levels and measuring performance under those loads across different cloud providers.
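The "various load levels" part of such a framework reduces to driving a workload with controlled concurrency. This thread-pool sketch is a generic illustration of the idea, not Cloudstone's own tooling.

```python
from concurrent.futures import ThreadPoolExecutor

def run_load(task, concurrency, total_calls):
    """Invoke `task` (a zero-argument callable, e.g. one HTTP request)
    `total_calls` times with at most `concurrency` invocations in flight.
    Sweeping `concurrency` upward probes behavior at rising load levels."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        futures = [pool.submit(task) for _ in range(total_calls)]
        return [f.result() for f in futures]
```

Timing each `task` internally (as in the probe sketch above) and repeating the run at concurrency 1, 10, 100, and so on gives a simple load/latency curve per provider.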
ServerDensity is a monitoring product rather than a benchmarking tool; it measures CPU load, memory, processes, disk usage, and network traffic. Using a generic application or workload bundle, however, one could use ServerDensity’s reporting engine to run comparative tests across multiple cloud providers.
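A couple of the metrics a monitoring agent of this kind collects can be read with Python's standard library alone. This is a Unix/Linux-only sketch for illustration, not ServerDensity's agent or its metric names.

```python
import os
import shutil

def basic_metrics(path="/"):
    """Gather a few of the host metrics a monitoring agent might report:
    system load averages and disk usage (Unix/Linux only)."""
    load_1m, load_5m, load_15m = os.getloadavg()
    usage = shutil.disk_usage(path)
    return {
        "load_1m": load_1m,
        "load_5m": load_5m,
        "load_15m": load_15m,
        "disk_used_pct": 100.0 * usage.used / usage.total,
    }
```

Sampling a dict like this on an interval and shipping it to a central reporting engine is, in miniature, what agent-based monitoring products do; memory, process, and network counters require platform-specific sources such as /proc on Linux.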
CloudCMP was developed jointly by Duke University and Microsoft (s msft) Research. The CloudCMP project defined a number of measurement areas and assessed different cloud vendors against those tests, producing both bottom-line performance results and an interesting cost/performance measure for different applications deployed across different providers. So far, the research team has produced a single report covering several cloud providers.
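A cost/performance figure of this general kind comes down to simple arithmetic: price per unit of work rather than raw speed. The function below is an illustrative ratio, not CloudCMP's published formula.

```python
def cost_per_million_requests(hourly_rate_usd, requests_per_second):
    """Dollars to serve one million requests at a sustained throughput,
    given an instance's hourly price: a crude cost/performance ratio."""
    seconds_needed = 1_000_000 / requests_per_second
    return hourly_rate_usd * seconds_needed / 3600
```

For example, a hypothetical instance priced at $0.36/hour sustaining 100 requests per second works out to $1.00 per million requests; a faster but pricier instance can still win on this metric if its throughput rises more than its rate.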
CloudStatus is a VMware (s vmw) company that aims to “provide an independent view of the health and performance of the most popular cloud services on the web.” Currently in beta, CloudStatus thus far measures only Amazon Web Services and Google App Engine (s goog) for both availability and performance.
BitCurrent performed a benchmarking study in late June. The study measured Amazon, Rackspace and Terremark’s (s tmrk) IaaS offerings, and Force.com and Google App Engine at the platform level. As a one-time study, it’s primarily of academic interest: cloud performance changes constantly, and near-real-time status is the only real way to monitor it on an ongoing basis.
Doubt remains as to how cloud performance offerings will fulfill the dual needs of remaining independent, to ensure neutrality, while still building a viable and profitable business. The several university-created tools in this list suggest that most independent performance metrics may well continue to come from researchers.
Ben Kepes is an independent consultant and contributing writer for GigaOM. Please see his disclosure statement in his bio.
Related GigaOM Pro content (sub req’d): Infrastructure Overview, Q2 2010