Think Big Analytics wants to help companies make the most of Hadoop

Think Big Analytics, a big data analytics consulting company, launches publicly Tuesday with $3 million in seed funding. Founded in 2010, Think Big aims to help companies start making the most of their data in the most cost-effective ways.
Former Cisco (s csco) executive Dan Scheinman led the funding round for Think Big, which is based in Mountain View, Calif.; WI Harper Group also participated. The company will use the seed money primarily to grow its data science and data engineering teams.
When Think Big first started, many customers didn’t know what big data was, said Ron Bodkin, Think Big’s CEO and a co-founder. Use cases have evolved since then, he said, and demand for big data analytics has grown.
Think Big has already worked with NetApp and Quantcast, as well as a U.S. telecommunications company. Use cases range from early-stage adoption to drives for better efficiency. Think Big helped one large retailer build a big data architecture that delivers tailored recommendations to customers based on their in-store visits, phone calls and mobile interactions. It guided a different retailer through a migration from a legacy Teradata Corp. appliance to a Hadoop cluster, cutting query times from six hours to four minutes.
Think Big responds to that demand in a few ways. Its employees can help a client prioritize use cases before the client engages a Hadoop vendor such as Cloudera or Hortonworks, Bodkin said. A client can also send as many as 20 developers to a three-day hands-on Think Big course covering Hadoop, MapReduce, Hive, Pig and other topics. Courses for non-developers and executives are also available.
Competition from other companies doesn’t hold back Think Big’s growth as much as customers’ reliance on tried-and-true methods for analyzing data, said Bodkin, who will speak at our Structure:Data conference in March.