The internet of things is becoming the next cloud battleground

Sensors and connected devices are popping up everywhere, and the data they produce has to be processed somewhere. While the simple, time-sensitive work happens locally, the more complex work — predictive analytics, visualizing data in mobile apps, talking to other devices or applications — happens in the cloud. And cloud computing providers are already beginning their fight to house all that data and all those workloads.

As it stands, the internet of things, like the web and mobile economies from which it grew, runs largely on Amazon Web Services. But there’s no guarantee the status quo will remain in place. As part of its broader home-automation plans, for example, Google is already buying up large AWS users such as Nest and Dropcam. Dropcam co-founder and CEO Greg Duffy told me last year that his company runs “the largest inbound streaming service on the entire internet” — bigger even than YouTube. Assuming those companies eventually move onto Google’s infrastructure, AWS will lose both revenue and some banner use cases.

However, the competition — which stepped up to another level in the past week — won’t be won by M&A alone, or even by offering the lowest prices on computing and storage. Cloud providers will also have to prove they’re the most capable platform for handling the particular needs of the internet of things. AWS has a stream-processing service called Kinesis that was surely built at least partly with connected devices in mind. Google’s new Cloud Dataflow service is also designed to process streaming data as it hits the network, and then analyze it more deeply later.
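To make the stream-processing idea concrete, here is a minimal sketch of the kind of record a connected device might push into a service like Kinesis. The device name, stream name, and field layout are illustrative assumptions, and the boto3 ingestion call itself is left in a comment because it requires AWS credentials:

```python
import datetime
import json

def make_record(device_id, metric, value):
    """Serialize one sensor reading into the JSON payload that a
    stream-ingestion call would carry as its data blob."""
    return json.dumps({
        "device_id": device_id,
        "metric": metric,
        "value": value,
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })

# With boto3 installed and credentials configured (stream name is
# hypothetical), the actual ingestion call would look roughly like:
#   kinesis = boto3.client("kinesis")
#   kinesis.put_record(StreamName="sensor-readings",
#                      Data=make_record("thermostat-42", "temp_c", 21.5),
#                      PartitionKey="thermostat-42")

payload = make_record("thermostat-42", "temp_c", 21.5)
print(payload)
```

Kinesis routes records to shards by partition key, so keying on the device ID keeps each device’s readings in order within its shard.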

[soundcloud url="" params="color=ff5500&auto_play=false&hide_related=false&show_comments=true&show_user=true&show_reposts=false" width="100%" height="166" iframe="true" /]

Download This Episode

Subscribe in iTunes

The Structure Show RSS Feed

Microsoft, meanwhile, might actually have the most compelling internet of things service among the three largest cloud providers. It is offering, in limited release, a service called Azure Intelligent Systems Service, which the company claims will let users not just collect, store and process device data, but also connect devices and services and even manage them.

It seems probable that AWS and Google will eventually release similar services of their own. The gains in simplicity and reductions in latency that could result from having everything stored and managed on the same platform, by the same tools, and possibly within the same data center, could be too much to resist. Like many shifts in technology, it’s probably easy enough to get started with a single use case and maybe a single type of device or sensor, but companies might start begging for help as they measure more things with new devices and complexity skyrockets.

I thought about this even more after the connected cities panel I led at our Structure Connect conference earlier this week (the audio from which is included in the podcast embed above, and the video of which is embedded below). We spoke about how cities are dipping their toes into the connected world by doing things such as measuring open parking spots and installing intelligent light bulbs. However, as City of San Jose CIO Vijay Sammeta explained, his team is already looking into lots of new use cases, including vehicle-to-vehicle communications to prevent accidents and joining parking data with economic data to try to bring shoppers to areas that have both available parking and businesses in need of patrons.

From left to right: Vijay Sammeta, CIO, City of San Jose; Maciej Kranz, VP of Corporate Technology Group, Cisco; Charlie Catlett, Senior Computer Scientist, Argonne National Laboratory and University of Chicago; Derrick Harris, Senior Writer, Gigaom. Credit: Jakub Mosur

Charlie Catlett, a senior computer scientist at Argonne National Laboratory outside Chicago, described a project his institution is about to deploy that will measure air quality and other environmental factors. One goal of the project is to correlate that data with public health data in order to help determine how external factors can affect health in different parts of Chicago. A 40-sensor deployment is rolling out in November, but Catlett said he has a verbal commitment from the city to deploy 1,000 of them.

Cloud providers have been fighting fiercely for lucrative government accounts for years, over everything from city-government email to large private-cloud environments for the CIA. One big factor fueling the competition is that so many governments now consider (some by mandate) cloud computing to be a good place to host new and non-critical workloads. As governments large and small get serious about quantifying the infrastructure and even the environments they manage, possibly deploying thousands of nodes at a time, they’re going to start looking for platforms to handle all that data and all that gear, and cloud providers are going to line up to deliver them.

[protected-iframe id="4d382c08865fb970d675dbdfb9436775-14960843-6578147" info="" width="640" height="360" frameborder="0" scrolling="no"]