It’s neither easy nor glamorous — data scientists get all the love — but making sure your Hadoop cluster is properly configured and applications are running optimally is necessary, especially as applications move into production. Here are five tools to help you do it.
The next release of Abiquo’s enterprise cloud management software will integrate with the Chef configuration management tool and sport a customizable user interface. Before moving workloads to the cloud, businesses want the sort of management tools they’re using now in their own data centers.
IBM on Tuesday acquired Platform Computing, a company that made a name for itself in high-performance computing but recently made a splash in the cloud computing and big data spaces. Those areas are likely what put IBM in a buying mood.
The face of high-performance computing is changing. That means new technologies and new names, but also familiar names in new places. Anyone who doesn’t have a cloud computing story to tell, and possibly a big data one too, might start looking really old really quickly.
Hadoop is a very valuable tool, but it’s far from perfect. While Apache, Cloudera, EMC, MapR and Yahoo focus on core architectural issues, a group of vendors is trying to make Hadoop a more fulfilling experience by focusing on business-level concerns such as applications and utilization.
Analyst firm Forrester published an assessment report on private cloud software this week, and Platform Computing, with its ISF software, appears to have the most-complete offering based on Forrester’s criteria. For now.
High-performance computing leader Platform Computing hopes to capitalize on the big data movement by spreading its wings beyond its flagship business of managing clusters and grids into managing MapReduce environments, too. Platform has a solid foundation among leading businesses, especially in the financial services industry.