Ex-Facebookers at Interana raise $20M for mass-scale event data

Interana, a Menlo Park, California, analytics startup that officially launched in October, has closed a $20 million Series B round. Index Ventures led the round, which also included new investors AME Cloud Ventures, Harris Barton and Cloudera’s Mike Olson. Interana’s existing investors — Battery Ventures, Data Collective and Fuel Capital — from which the company had previously raised $8.2 million, also participated.

The idea behind Interana is to make [company]Facebook[/company]-style event analytics, as well as the same type of data-centric culture, available to all types of companies with lots of event data to analyze. To do this, Interana built an entire system from the ground up, from columnar storage engine to user interface.

A portion of the Interana dashboard.

It was founded by two former Facebook engineers, Bobby Johnson and Lior Abraham, who built some of Facebook’s most popular internal analytics systems, and Ann Johnson. Ann, who is CEO, and Bobby, who is CTO, are married.

Since the company launched, Ann Johnson said in a recent interview, Interana has scored several new customers and received interest from very large and well-known companies. She chalks up the interest to a dearth of products addressing event analytics at scale — like years’ worth of it — and to data volumes that “are growing faster than many people’s intuitions even tell them.”

Ann Johnson

The hope for customers is that they query their data regularly and treat Interana’s product like they treat Google’s search engine by asking lots of small questions throughout the day, she added.

A good analytics practice, Bobby Johnson noted, is to make the interface easy enough that people don’t just look for that one big insight, but rather ask as many questions as they need to, about whatever they need to. “Anything that should be answered by data should be,” he said. That way, people can spend their time worrying about other stuff.

Bobby Johnson

To hear more about Interana’s product, as well as the experience of starting an enterprise software company with your spouse, check out our October podcast interview with Ann Johnson.

[soundcloud url="https://api.soundcloud.com/tracks/171307349" params="color=ff5500&auto_play=false&hide_related=false&show_comments=true&show_user=true&show_reposts=false" width="100%" height="166" iframe="true" /]

Hands on with Watson Analytics: Pretty useful when it’s working

Last month, [company]IBM[/company] made available the beta version of its Watson Analytics data analysis service, an offering first announced in September. It’s one of IBM’s few recent forays into anything resembling consumer software, and it’s supposed to make it easy for anyone to analyze data, relying on natural language processing (thus the Watson branding) to drive the query experience.

When the servers running Watson Analytics are working, it actually delivers on that goal.

Analytic power to the people

Because I was impressed that IBM decided to launch a cloud service using the freemium business model — and carrying the Watson branding, no less — I wanted to see firsthand how well Watson Analytics works. So I uploaded a CSV file including data from Crunchbase on all companies categorized as “big data,” and I got to work.

Seems like a good starting point.

Choose one and get results. The little icon in the bottom left corner makes it easy to change chart type. Notice the various insights included in the bar at the top. Some are more useful than others.

But which companies have raised the most money? Cloudera by a long shot.

I know Cloudera had a huge investment round in 2014. I wonder how that skews the results for 2014, so I filter it out.

And, voila! For what it’s worth, Cloudera also skews funding totals however you sort them — by year founded, city, month of funding, you name it.
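
The outlier filter those screenshots walk through is easy to reproduce outside the tool. Here's a minimal sketch in plain Python, with made-up funding rows standing in for the Crunchbase export (the amounts are illustrative, not the real data):

```python
# Toy rows standing in for the Crunchbase CSV: (company, year, amount in $M).
rows = [
    ("Cloudera", 2014, 900),
    ("MongoDB", 2014, 80),
    ("Databricks", 2014, 33),
    ("MongoDB", 2013, 150),
]

def totals_by_year(rows, exclude=()):
    """Sum funding per year, optionally filtering out outlier companies."""
    totals = {}
    for company, year, amount in rows:
        if company in exclude:
            continue  # drop the outlier before aggregating
        totals[year] = totals.get(year, 0) + amount
    return totals

print(totals_by_year(rows))                        # one company dominates 2014
print(totals_by_year(rows, exclude={"Cloudera"}))  # totals without the outlier
```

Excluding the outlier before summing is all the “filter it out” step above amounts to.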

Watson Analytics also includes tools for building dashboards and for predictive analysis. The latter could be particularly useful, although that might depend on the dataset. I analyzed Crunchbase data to try to determine what factors are most predictive of a company’s operating status (whether it has shut down, has been acquired or is still running), and the results were pretty obvious (if you can’t read the image, it lists “last funding” as a big predictor).

If I have one big complaint about Watson Analytics, it’s that it’s still a bit buggy — the tool to download charts as images doesn’t seem to work, for example, and I had to reload multiple pages because of server errors. I’d be pretty upset if I were using the paid version, which allows for more storage and larger files, and experienced the same issues. Adding variables to a view without starting over could be easier, too.

Regarding the cloud connection, I rather like what [company]Tableau[/company] did with its public version by pairing a locally hosted application with cloud-based storage. If you’re not going to ensure a consistent backend, it seems better to guarantee some level of performance by relying on the user’s machine.

All in all, though, Watson Analytics seems like a good start to a mass-market analytics service. The natural language aspect makes it at least as intuitive as other services I’ve used (a list that includes DataHero, Tableau Public and Google Fusion Tables, among others) and it’s easy enough to run and visualize simple analyses. But Watson Analytics plays in a crowded space that includes the aforementioned products, as well as Microsoft Excel and Power BI, and Salesforce Wave.

If IBM can work out some of the kinks and add some more business-friendly features — such as the upcoming abilities to refine datasets and connect to data sources — it could be onto something. Depending on how demand for mass-market analytics tools shapes up, there could be plenty of business to go around for everyone, or a couple companies that master the user experience could own the space.

UK hotspot startup Purple WiFi raises $5M

Purple WiFi is a wireless hotspot company that doesn’t own any hotspots. Instead it has built a virtual network of business Wi-Fi access points available to the public for free – as long as they’re willing to log in with their social media credentials.

Now the previously self-funded U.K. startup has raised its first outside money, announcing Wednesday a $5 million round led by former Tesco CEO Terry Leahy and the William Currie Group with participation from Juno Capital. Purple said it would use the funds to hire more staff and expand outside of the U.K.

Purple basically offers a trade-off between businesses and their customers: Cafes, stores, restaurants, hotels and even museums will give their customers free Wi-Fi access with minimal login fuss in exchange for analytics and the right to market their wares to those same customers. Businesses that sign up use their existing Wi-Fi routers, tying them to Purple’s managed service in the cloud, which handles logins, tracks customer data and hosts Purple’s marketing platform.

From a consumer’s point of view, when you enter a Purple business you use whatever social media credentials you choose to log in to the Purple WiFi portal, and then you get free wireless internet access. That idea isn’t new. [company]Facebook[/company] and [company]Google[/company] are trying to build virtual Wi-Fi networks using their user IDs as the keys to unlock public hotspots, and in exchange they get access to valuable consumer data.
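
None of Purple’s actual endpoints or APIs appear in this article, so the handshake can only be sketched in outline. Here is a hypothetical Python sketch of the captive-portal flow described above; every function, field and URL is invented for illustration:

```python
# Hypothetical sketch of a Purple-style captive portal handshake.
# Nothing here comes from Purple's documentation; it only shows the shape
# of the trade: social identity in, free Wi-Fi out, analytics for the venue.

def social_login(network, credentials):
    # Stand-in for the real OAuth dance with Facebook/Twitter/Google/etc.
    return {"network": network, "user_id": credentials["id"]}

def captive_portal_flow(user, venue_router):
    # 1. The router intercepts the first request and redirects to the portal.
    portal_url = venue_router["portal"]
    # 2. The user picks a social network and authenticates there.
    identity = social_login(user["network"], user["credentials"])
    # 3. The cloud service records the identity for the venue's analytics...
    record = {"venue": venue_router["venue"], "who": identity}
    # 4. ...and tells the router to open internet access for this device.
    return {"access": "granted", "logged": record, "portal": portal_url}

session = captive_portal_flow(
    {"network": "twitter", "credentials": {"id": "alice"}},
    {"portal": "https://portal.example", "venue": "cafe-42"},
)
print(session["access"])
```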

Purple is a bit more consumer friendly on the credentials side, though, as it doesn’t tie you down to a specific social network. Right now it supports Facebook, [company]Twitter[/company], Google and Instagram as well as China’s Weibo and Russia’s VKontakte.

 

Big data needs a product like Microsoft Access

The trend toward self-service in business analytics has been good for the big data industry. But for the user-oriented paradigm to take deep root, the industry needs to change its approach. The information worker and data analytics worlds need a big data product akin to Microsoft Access.

Access itself was never especially safe or appropriate for production database applications. But business users were able to use it to stretch their imaginations. By having a tool that could build working database application prototypes, users were able to take ownership of what databases could do and how they could be used. Access allowed business users to experiment with databases and implement something in a relatively short period of time. Access provided everything necessary to create a database system that could almost work in a mission-critical capacity.

While lauding a tool that facilitates incomplete success may seem absurd, such a tool is essential to getting successful systems built. Access allowed users to actualize the systems that they wanted, and those systems that passed subsequent peer review created user demand and demonstrated efficacy. This situation was no cul-de-sac, as many such systems were eventually re-implemented by specialists using more professional tools. Without Access, arguably, those systems would not have been implemented at all.

A user tool helps production tools

Access heralded the beginning of self-service data management and, perhaps ironically, it gave rise to widespread adoption of client/server databases in the enterprise. In order for Hadoop and other big data analytics technologies to see the same sort of adoption, we need a tool like Access that can serve as a catalyst, allowing business users to model concretely the kinds of big data systems that they need.

Such a product, call it “Big Access”, would connect to cloud data sources, spreadsheets, enterprise data sources, log files, and perhaps certain machine data beyond those log files. Big Access would also provide functionality for data quality, data blending, and data shaping. It would provide basic data visualization capabilities, though it would leave the fancy stuff to tools that already cover the visualization space.

Big Access would also provide predictive analytics functionality. The amount of explicit effort required to build a predictive model on existing data in Big Access would actually be quite small. Big Access would build such models transparently, in the background, such that it could offer the ability for the business user to run predictive queries on a whim.
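
Big Access is hypothetical, so any code can only illustrate the idea rather than an actual API. A toy Python sketch of the “model built transparently, queried explicitly” split; the class, its method names, and the nearest-centroid model are all invented for illustration:

```python
from collections import defaultdict

class BigAccess:
    """Toy stand-in for the hypothetical Big Access: the model is (re)built
    quietly whenever data changes, so predictive queries feel instant."""

    def __init__(self):
        self._rows = []
        self._model = None  # rebuilt lazily, "in the background"

    def insert(self, features, label):
        self._rows.append((features, label))
        self._model = None  # mark model stale; a real tool would retrain async

    def _train(self):
        # Trivial model: per-label mean of each feature (nearest centroid).
        sums, counts = {}, defaultdict(int)
        for feats, label in self._rows:
            sums.setdefault(label, [0.0] * len(feats))
            for i, v in enumerate(feats):
                sums[label][i] += v
            counts[label] += 1
        self._model = {lab: [s / counts[lab] for s in vals]
                       for lab, vals in sums.items()}

    def predict_query(self, features):
        # The user just asks; training happens transparently if needed.
        if self._model is None:
            self._train()
        def dist(centroid):
            return sum((a - b) ** 2 for a, b in zip(features, centroid))
        return min(self._model, key=lambda lab: dist(self._model[lab]))

db = BigAccess()
db.insert([1.0, 1.0], "operating")
db.insert([0.0, 0.0], "closed")
print(db.predict_query([0.9, 0.8]))  # nearest centroid wins
```

The point is the ergonomics, not the model: the user never asks for training, only for predictions.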

Beyond bits and pieces

We have tools that fulfill some of these capabilities already. But current products are task-driven; they have a specific purpose and are used explicitly for that purpose. Conversely, Big Access would provide functionality that business users don’t necessarily realize they need. Big Access would determine from context which analytics capabilities were required and would be most useful. It would then make those capabilities available to the user, without burdening the user with a manifest of what the necessary underlying technologies were.

Big Access could run on top of Hadoop. Big Access could run on top of Apache Spark. It could also run on top of Spark Streaming and Spark’s MLlib and even on top of Spark SQL or Hive or Pig. You get the idea. Big Access wouldn’t provide innovative big data technology. It would provide innovation in the usability of existing big data technology.

Developers need it too

Big Access would be programmable. In Java. In Python. In C#. In JavaScript. No programming would be required but custom code would be accommodated. Big Access would be query-able using SQL and could be integrated into mainstream programming environments as if it were a relational database. In fact, a Big Access database, developed by a business user, and deployed to a company server, could immediately be integrated into a line-of-business application by any enterprise developer.
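
The “deploy to a server, query like a relational database” workflow already exists in miniature with SQLite. A sketch using Python’s standard-library sqlite3 as an analogy for how a developer might consume a Big Access database (the table schema and figures here are illustrative):

```python
import sqlite3

# A business user builds a small database with a desktop tool; once deployed,
# any developer can hit it with plain SQL -- no special big data API needed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE companies (name TEXT, category TEXT, raised_m REAL)")
conn.executemany(
    "INSERT INTO companies VALUES (?, ?, ?)",
    [("Cloudera", "big data", 1040.0),
     ("Interana", "big data", 28.2),
     ("Purple WiFi", "networking", 5.0)],
)

# The line-of-business application only needs standard SQL.
results = conn.execute(
    "SELECT name, raised_m FROM companies "
    "WHERE category = 'big data' ORDER BY raised_m DESC"
).fetchall()
print(results)
```

That relational front door is exactly what would let an enterprise developer treat a user-built Big Access database like any other data source.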

Of course, the same developers could integrate their applications with Hadoop today almost as easily, but many developers don’t realize this. A simple desktop tool that deployed the database to a company server would in fact be more approachable to many developers than would a Hadoop cluster. After the Big Access database was migrated to full-fledged Hadoop, the application could be migrated to Hadoop as well. In this way, Big Access would provide an on-ramp to big data technology for business users and enterprise developers alike.

Enable individuals, win the enterprise

When users can work with products in relative privacy, a greater intimacy between those users and the products can emerge. For example, this is why so much data work gets done in Excel even when, technologically, it is not always the best tool for the job. This is also why people use search engines. And, in fact, this is why so many users have worked with Microsoft Access itself.

Big Access would provide a bridge to users. Some, including entrepreneurs and technologists, may view that as mere fit and finish. But the absence of a tool like Big Access is holding back broader success for big data technology and, ultimately, for those same entrepreneurs and technologists.

If we want data and analytics to be as essential to information workers as documents, spreadsheets, presentations, email, and search are today, then we need big data tools to be as ubiquitous, approachable, and commonplace as search engines and office suite applications. We are not there yet. We need to be there. Perhaps 2015 will be the year.

Report: Over half of the mobile devices activated Christmas week run iOS

The first thing many people do after receiving a new tablet or smartphone for Christmas is to immediately go to the app stores to download software and games. Based on those downloads, it looks like Apple had a big Christmas, according to app analytics firm Flurry, which looks at those downloads to determine which brands of smartphones and tablets were the most popular gifts.

According to Flurry’s data, 51.3 percent of mobile devices activated in the week leading up to and including Christmas were from Apple, compared to 17.7 percent from Samsung and 5.8 percent from Microsoft.

flurry christmas screenshot

Flurry provides analytics for more than 600,000 apps, so although its data isn’t quite as accurate as device activations reported by Google and Apple, it’s likely to be pretty close. Still — there’s the possibility the apps Flurry tracks are more likely to be downloaded by iOS users than Android users, for instance. It’s also important to keep in mind that although Flurry’s stats are global, December 25th is not an important gift-giving day in most parts of the world, so the data will be skewed towards Western markets.

The data suggested that large phones were significantly more popular this Christmas than in years past, most likely due to both the popularity of the iPhone 6 Plus and rising screen sizes. Last year, Flurry estimated that 4 percent of new devices had screens between 5.0 and 6.9 inches. This year, 13 percent of new devices activated on Christmas can be categorized as a “phablet,” with their growth largely coming at the expense of tablets.

The Flurry report also marks a good showing for Microsoft and its line of Lumia phones running Windows Phone, coming in third behind Apple and Samsung with 5.8 percent of total devices activated in the week leading up to Christmas. That’s significantly higher than the 2.7 percent worldwide shipment market share for Windows Phone estimated by International Data Corporation earlier this month.

Flurry estimated that app downloads are 2.5 times higher on Christmas than on an average day in December.

Twitter now lets you track your tweet’s impact on the go

Twitter’s analytics tool, which tells you how many people have seen your tweets, among other things, is now on iOS. When you’re out and about, if you’re dying to see whether your latest quip went viral you can view it in the Twitter mobile app. On your tweet’s detail page, you’ll see a “view tweet activity” button. Following it shows you how people engaged with your tweet, the percentage that favorited it compared to overall views, the total tweet impressions, and number of profile clicks it generated.

Tableau CEO says the company’s biggest challenge now is talent

In an interview with Gigaom on Thursday, Tableau Software CEO Christian Chabot spoke about the company’s $100 million third quarter and the challenges it faces as it continues to grow. He doesn’t expect Salesforce.com’s new analytics service to be one of them.

Tableau hits the $100M revenue mark in third quarter

Tableau continues to grow at a fast pace, hitting the $100 million mark for the first time in the third quarter. Although there are plenty of startups willing to point out its weaknesses, Tableau has lots of room to grow and the money to do it.