Getting the real dope on your cloud deployment

Lots of companies can perform cost analysis of cloud-based instances; Krystallize Technologies promises to do more. The Austin, Texas–based startup said its technology delves deeper into what’s going on with cloud workloads and compares how a given job will do across Amazon, IBM SoftLayer, vCloud Air, Microsoft Azure, CenturyLink, etc. to help you decide where to run it.

If it works as advertised, it could be a big leg up for cloud deployers (or would-be cloud deployers) who are discovering that there is no one-size-fits-all cloud. There will be times when a private cloud running large instances is more cost-effective than a public cloud churning a ton of itty-bitty instances. The problem is that most of that is discovered now by trial and error — if it is discovered at all.

Krystallize CloudQOS, on the other hand, enables real capacity planning, according to founder and CEO Clinton France. To get to its CloudQOS index, Krystallize buys the instances in question and then applies its own technology.

“We’re the first to put a workload simulation engine and a performance statistic to measure what’s going on in the cloud,” France said in an interview.

I’d been hearing about Krystallize a bit already from people in the industry who were impressed with the pilot, which has been running for about six months. It also got some early press. One data center specialist was particularly impressed because Krystallize takes a lot of factors into account — the cloud resources used, how the components are integrated and how oversubscribed (or not) the hypervisor is in any given case.

Krystallize measures cloud environments down to a level that reveals the true performance of a cloud instance, this expert said. One example: “Most people don’t know that the internal clocks in cloud instances are not necessarily reality,” said the specialist, who did not want to be named because he works with a lot of the cloud providers.

“You can think of it as the clock on a Star Trek holodeck. The clock in the holodeck can be slower or faster than real time. To solve this, Krystallize adds its own clocking mechanism that runs in the cloud instance to get an apples-to-apples comparison of what gets done over a period of time.”

The company can measure a given workload, plotting calculations per second on one axis and variability on the other, to give a better representation of what a customer can expect to get for its money. It’s great if a given cloud can claim a zillion calculations per second; not so great if it hits that high only once in a while.
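To make that concrete, here is a rough sketch of the measurement idea: time a fixed unit of work against a monotonic timer inside the instance, then report mean throughput alongside its variability. This is purely illustrative Python, not Krystallize’s actual methodology.

```python
# A toy probe in the spirit described above: measure how much fixed work
# completes per second against a monotonic timer, then summarize both the
# mean throughput and its variability. Illustrative only, not
# Krystallize's actual methodology.
import statistics
import time

def one_unit_of_work():
    # Stand-in CPU-bound task; a real probe would simulate the target workload.
    return sum(i * i for i in range(100_000))

samples = []
for _ in range(30):
    start = time.perf_counter()  # monotonic, unlike a guest's wall clock
    one_unit_of_work()
    samples.append(1.0 / (time.perf_counter() - start))  # units per second

mean = statistics.mean(samples)
cv = statistics.stdev(samples) / mean  # coefficient of variation
print(f"throughput: {mean:,.0f} units/sec, variability (CV): {cv:.1%}")
```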

In one analysis, Krystallize ran the same workload on [company]VMware[/company] vCloud Air private cloud, [company]Amazon[/company] Web Services and [company]Google[/company] public clouds. Given the specifics of this application, vCloud Air private showed the best performance for a given number of transactions (about 78,000 calculations per second) with fairly good variability. Google delivered just over 40K calculations with more variability and AWS performed just under 30K calculations but with less variability.

Krystallize private v public

In this case VMware looks best if the application requires all those transactions. But if the application only had to deliver 27,000 transactions or fewer (roughly a third of the 78,000 delivered), VMware as configured would use just a third of the resources allocated. This is why, to properly gauge performance, you must understand both the application’s workload requirements and the cloud’s capabilities, private or public, France said.

The company has been self-funded to date but just landed a $1.2 million seed round from several unidentified angel investors.

France said the beauty is that Krystallize can look at service performance over time and monitor it to make sure the customer gets what she’s paying for. One value is providing a price/performance index; another is that Krystallize can help customers do “cloud pruning”: jettisoning resources that aren’t up to the task and redeploying workloads to resources that will handle them better.

That could help companies rid themselves of “shelfware”: cloud instances that are still up there but not really doing anything.

“This is like going to the vegetable stand and instead of taking the stuff off the top, going in to find the better, fresher fruit that may be at the bottom,” France said.

Remember when machine learning was hard? That’s about to change

A few years ago, there was a shift in the world of machine learning.

Companies such as Skytree and Context Relevant began popping up, promising to make it easier for companies outside of big banks and web giants to run machine learning algorithms, and to do it at a scale congruent with the big data promise they were being pitched. Soon, there were many startups promising bigger, faster, easier machine learning. Machine learning became the new black, baked into untold software packages and services — machine learning for marketing, machine learning for security, machine learning for operations, and on and on and on.

Eventually, deep learning emerged from the shadows and became a newer, shinier version of machine learning. It, too, was very difficult and required serious expertise to do. Until it didn’t. Now, deep learning is the focus of numerous startups, all promising to make it easy for companies and developers of all stripes to deploy.

Joseph Sirosh

But it’s not just startups leading the charge in this democratization of data science — large IT companies are also getting in on the act. In fact, Microsoft now has a corporate vice president of machine learning. His name is Joseph Sirosh, and we spoke with him on this week’s Structure Show podcast. Here are some highlights from that interview, but it’s worth listening to the whole thing for his take on Microsoft’s latest news (including support for R and Python in its Azure ML cloud service) and competition in the cloud computing space.

You can also catch Sirosh — and lots of other machine learning and big data experts and executives — at our Structure Data conference next month in New York. We’ll be highlighting the newest techniques in taking advantage of data, and talking to the people building businesses around them and applying them to solve real-world problems.

[soundcloud url=”https://api.soundcloud.com/tracks/191875439″ params=”color=ff5500&auto_play=false&hide_related=false&show_comments=true&show_user=true&show_reposts=false” width=”100%” height=”166″ iframe=”true” /]

Download This Episode

Subscribe in iTunes

The Structure Show RSS Feed

Why the rise of machine learning and why now

“I think the cloud has transformed [machine learning], the big data revolution has transformed it,” Sirosh said. “But at the end of the day, I think the opportunity that is available now because of the vast amount of data that is being collected from everywhere . . . is what is making machine learning even more attractive. . . . As most of behavior, in many ways, comes online on the internet, the opportunity to use the data generated on interactions on websites and software to tailor customer experiences, to provide better experiences for customers, to also generate new revenue opportunities and save money — all of those become viable and attractive.”

Asked whether all of this is possible without the cloud, Sirosh said it is, but — like most things — it would be a lot more difficult.

“The cloud makes it easy to integrate data, it makes it easy to, in place, do machine learning on top of it, and then you can publish applications on the same cloud,” he said. “And all of this process happens in one place and much faster, and that changes the game quite a bit.”

Deep learning made easy and easier

Sirosh said he began his career in neural networks and actually earned his Ph.D. studying them, so he’s happy to see deep learning emerge as a legitimately useful technology for mainstream users.

“My take on deep learning is actually this,” he explained. “It is a continuing evolution in that field, we just have now gotten to the level where we have identified great algorithmic tricks that allow you to take performance and accuracy to the next level.”

Deep learning is also an area where Microsoft sees a big opportunity to bring its expertise in building easily consumable applications to bear. Azure ML already makes it relatively easy to train deep neural networks using the same types of methods as its researchers do, Sirosh noted, but users can expect even more in the months to come.

“We will also provide fully trained neural networks,” he said. “We have a tremendous amount of data in images and text data and so on inside of Bing. We will use our massive compute power to learn predictive models from this data and offer some of those pre-trained, canned neural networks in the future in the product so that people will find it very easy to use.”
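Nothing like those canned models had shipped at the time, but consuming one would presumably look like calling any hosted scoring API. The sketch below is hypothetical: the endpoint URL, request shape and auth scheme are invented for illustration.

```python
# Hypothetical sketch of scoring an image against a hosted, pre-trained
# model. The URL, payload shape and auth scheme are invented, since the
# product described above hadn't shipped yet.
import json
import urllib.request

def classify_image(image_url, api_key):
    req = urllib.request.Request(
        "https://example-ml-service.net/v1/vision/classify",  # invented URL
        data=json.dumps({"image_url": image_url}).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # e.g. {"label": "cat", "confidence": 0.97}

print(classify_image("https://example.com/photo.jpg", api_key="YOUR-KEY"))
```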

The results of a Microsoft computer vision algorithm it says can outperform humans at some tasks.

How easy can all of this really be?

As long as there are applications that can hide its complexity, Sirosh has a vision for machine learning that’s much broader than even the world of enterprise IT sales.

“Well, we are actually going after a broad audience with something like machine learning,” he said. “We want to make it as simple as possible, even for students in a high school or in college. In my way of thinking about it, if you’re doing statistics in high school, you should be able to use [a] machine learning tool, run R code and statistical analysis on it. And you can teach machine learning and statistical analysis using this tool if you so choose to.”
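The kind of exercise Sirosh describes really is small. As a minimal sketch (plain scikit-learn standing in for the hosted tools he mentions, with made-up data), a student’s first model might be nothing more than this:

```python
# A high-school-sized statistics exercise: fit a line to toy data and use
# it to predict. Plain scikit-learn, standing in for the hosted tools
# described above; the data is made up for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

hours = np.array([[1], [2], [3], [4], [5], [6]])  # hours studied
scores = np.array([52, 58, 66, 70, 79, 84])       # exam scores

model = LinearRegression().fit(hours, scores)
print(f"score ~ {model.coef_[0]:.1f} * hours + {model.intercept_:.1f}")
print("predicted score after 7 hours:", round(model.predict([[7]])[0], 1))
```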

Is Microsoft evolving from an operating system company to a data company?

Not entirely, but Sirosh did suggest that Microsoft sees a shift happening in the IT world and is moving fast to ride the wave.

“I think you should even first ask, ‘How big is the world of data to computing itself?'” he said. “I would say that in the future, a huge part of the value being generated in the field of computing . . . is going to come from data, as opposed to storage and operating systems and basic infrastructure. It’s the data that is most valuable. And if that is where in the computing industry most of the value is going to be generated, well that is one place where Microsoft will generate a lot of its value, as well.”

AWS maintains lead in public cloud, but Azure inches forward

Amazon Web Services continues to dominate public cloud usage across the board, but Microsoft Azure is making strides at least in business accounts, according to a new RightScale survey.

[company]Amazon[/company] cloud adoption leads the pack with 57 percent of respondents reporting use of AWS (up from 54 percent last year), while 12 percent said they run [company]Microsoft[/company] Azure Infrastructure as a Service, up six percentage points from last year’s survey.

Among business or enterprise users, though, AWS still leads with 50 percent, up slightly from 49 percent, while Azure IaaS scored 19 percent, up from 11 percent. [company]Rackspace[/company] and [company]Google[/company] App Engine are the next most popular clouds in this category, while vCloud Air logged 7 percent adoption, down from 18 percent. (Could the rebranding of vCloud Hybrid Services to vCloud Air have been a factor here?)

The Rackspace callout is interesting since the company said Tuesday it will stop breaking out public cloud and private cloud revenue and will report them together. Rackspace is now focusing on private, managed cloud, a move some say shows it is ceding public cloud to the big guys.

RightScale Enterprise Cloud 2014-2015

All of these numbers are based on RightScale’s survey (downloadable here) of 930 cloud users, 24 percent of whom are RightScale customers.

Private cloud boosters won’t like this part: The new numbers show overall adoption of private cloud pretty much holding steady compared to last year. [company]VMware[/company] vSphere virtualized environments led, with 53 percent of enterprise customers reporting that they use them as a private cloud. (Another 13 percent said they use vCloud Director as a cloud.) This echoes last year’s survey, in which many customers equated their virtualized server rooms with private cloud.

While private cloud appears to be in a bit of a swoon, it’s no surprise that Docker usage is hot. Per the survey, that containerization technology, while relatively new, is already used by 13 percent of respondents, while more than a third of the rest (35 percent) said they are planning to implement it.

Rightscale Public Clouds 2014

OpenStack showed the greatest traction this year, with 13 percent adoption, up three percentage points year over year, and it is still garnering big interest from companies whether they use it or not. A full 30 percent of respondents said they were evaluating or interested in using OpenStack over time. Microsoft’s relatively new Azure Pack showed a respectable seven percent usage. Azure Pack, which mirrors Microsoft’s own Azure technology, can run in a company’s own data centers or server rooms to provide an Azure-on-Azure hybrid.

Overall, Santa Barbara, California–based RightScale concluded from its research that cloud adoption is “a given” and hybrid cloud is the preferred mode of adoption. Of course, RightScale offers multi-cloud management tools, so that conclusion works out nicely for it.

RightScale VP of Marketing Kim Weins was our Structure Show guest after last year’s survey and had some interesting insights that might be helpful to compare and contrast. Check out the podcast below.

[soundcloud url=”https://api.soundcloud.com/tracks/143987938?secret_token=s-6kZD6″ params=”color=ff5500&auto_play=false&hide_related=false&show_artwork=true” width=”100%” height=”166″ iframe=”true” /]

Microsoft claims compliance with ISO data privacy standard

Microsoft says its compliance with a data privacy standard set by the International Organization for Standardization (ISO) means customer data in its Azure cloud will be safer from prying eyes.

The ISO/IEC 27018 standard aims to establish “a uniform, international approach to protecting privacy for personal data stored in the cloud,” Microsoft General Counsel and EVP Brad Smith wrote in a blog post.

A third party, the British Standards Institute (BSI), has verified that Microsoft Azure, as well as Office 365 and Dynamics CRM Online, meets the ISO criteria, Smith noted.

Compliance means that the vendor’s customer controls her data and will know what’s happening with that data down the line. It also requires the vendor to implement strong security and restricts how data can be handled on public networks, transportable media, etc. And it means that data will not be used for advertising — which means that [company]Google[/company] is unlikely to climb aboard this particular bandwagon.

This is not an academic exercise for [company]Microsoft[/company], which is fighting a U.S. court order to turn over customer data residing in its Dublin data center to U.S. authorities.

Cloud competitors are likely to call this a PR stunt — a concept that Microsoft is familiar with — but a security expert said ISO/IEC 27018 certification could become a major selling point to privacy-obsessed consumers who balk at the notion that Google, because of its advertising business, uses customer data to sell stuff.

Said this expert, who requested anonymity because he works with both Google and Microsoft: “Google would never agree to this since advertising is everything to them … Personally when I pay someone for a service, I expect my data to be private. When I use a service for free I accept that it is being paid for by sacrificing my privacy.”

For more on Microsoft’s data privacy stance, see Smith’s talk at last year’s Structure show below.

[youtube https://www.youtube.com/watch?v=6ncpPRqAJpc]

Microsoft woos Y Combinator startups with big Azure credits

Microsoft wants to boost its cloud’s profile among startups, so it’s making $500,000 in Azure credits available to Y Combinator-backed companies.

The credits start rolling with the Winter 2015 class and will continue after that, according to this Y Combinator blog post. That can add up to a good number of companies — there were 106 companies in the Spring and Winter 2014 classes, for example.

Cloud credits are ubiquitous — Y Combinator has special hosting offers from [company]Amazon[/company], [company]Google[/company], [company]Rackspace[/company] and now [company]Microsoft[/company], according to Y Combinator president Sam Altman. But $500K is a big number. (Oh, and the startups will also get a three-year Office 365 subscription, “access to Microsoft developer staff,” plus a year of CloudFlare and DataStax services.)

Qualified startups can typically get $1,000 to $15,000 in Amazon Web Services (AWS) credits, and there are other freebies available. Then, in September, things started going a bit haywire: Google started offering $100,000 in Google Cloud Platform credits to qualified startups. Two months later [company]IBM[/company] upped the ante to $120,000 in credit for SoftLayer infrastructure or BlueMix PaaS. Again, all for “qualified” startups.

This is a strategic gambit for Microsoft, which wants to get more young companies — many of which are probably not Windows-focused — to check out Azure. It’s also a way to chip away at [company]Amazon[/company] Web Services’ prodigious lead among startups. AWS is pretty much the default cloud selection for young companies.

This story was updated at 5:24 a.m. PST February 11 to reflect that AWS typically provides qualified startups with up to $15K in promotional funding.

Why opening up its Cosmos big data system would be the right move for Microsoft

There has been a rumor floating around since August (first, and subsequently, reported by Mary Jo Foley at ZDNet) that Microsoft is preparing to release its Cosmos big data system as a service on its Azure cloud platform, likely as an alternative to Hadoop. That would not only be a bold move for Microsoft, but also, probably, a smart one.

We’ll get to that shortly, but first a brief history of Cosmos. It’s Microsoft’s internal big data system, used to store and process data for applications such as Bing, Internet Explorer and email. Cosmos’ batch-computing element is called Dryad, and it’s similar to — although reportedly much more flexible than — the MapReduce framework associated with Hadoop, as well as with early-2000s Google, where MapReduce was invented. Cosmos also features a SQL-like query engine called SCOPE.
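For readers who haven’t written to that model, the canonical MapReduce example is a word count. The plain-Python sketch below shows the two-stage pattern that Dryad reportedly generalizes into arbitrary dataflow graphs; it is a generic illustration, not Cosmos, Dryad or SCOPE code.

```python
# The classic MapReduce word count, in miniature: a map stage that emits
# (key, value) pairs and a reduce stage that aggregates them per key.
# A generic illustration of the model, not Cosmos/Dryad/SCOPE code.
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    # Map: emit a (word, 1) pair for every word in a document.
    return [(word, 1) for word in doc.split()]

def reduce_phase(pairs):
    # Reduce: sum counts per key. A real system partitions this step
    # across many machines, grouped by key.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["the quick brown fox", "the lazy dog", "the fox"]
print(reduce_phase(chain.from_iterable(map_phase(d) for d in docs)))
# {'the': 3, 'quick': 1, 'brown': 1, 'fox': 2, 'lazy': 1, 'dog': 1}
```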

As of early 2011, Microsoft claimed it was storing about 62 petabytes of data in Cosmos. Around that time, Microsoft was also previewing commercial versions of Cosmos/Dryad and pitching them as a better alternative to Hadoop. On the whole, reviews were positive. You can read more about Cosmos here and here.

A graphic showing Cosmos’ place in the application architecture, circa 2011.

Around October 2011, Microsoft began investing heavily in making Hadoop run on Windows, an early — and wise — indication of the company’s move toward embracing both open source technologies and technologies that companies and developers have shown an interest in using. The Dryad work was moved to the Windows high-performance computing product line, where it eventually died. In February 2012, Microsoft went whole hog on Hadoop, announcing a partnership with Hortonworks and forthcoming products for both Windows Server and Azure.

The company has been pretty quiet about Cosmos, Dryad and the whole shebang in the meantime, but in August, ZDNet’s Foley reported on a Microsoft job posting suggesting the company is developing a version of Cosmos for external consumption. In late January, she expanded on the original report with news that Microsoft is recruiting testers for the Cosmos service, and cited improvements to the system’s SQL engine and new storage and computing components.

Microsoft declined to comment about any plans to release any products based on Cosmos.

Microsoft CEO Satya Nadella talks about open source at a Microsoft cloud event. Photo by Jonathan Vanian/Gigaom

If Microsoft is indeed preparing a Cosmos service on Azure, it’s easy to see why. Data processing and analysis is going to be a major driver of IT spending in the coming decades, and smart companies are going to cover all their bases when it comes to serving customers’ needs. Hadoop is just the platform that every cloud provider, database vendor and analytics vendor needs to support because the community is so large and so many workloads already run on it.

But that doesn’t mean Hadoop is necessarily the best technology for every job, especially for cloud providers that want to control every aspect of a new service from the core code up to the user interface. Google’s Compute Engine platform supports Hadoop, but the company all but said “Hadoop is passé” when it rolled out its post-Hadoop Cloud Dataflow service in June. Databricks, a startup based around the Apache Spark technology, works very closely to integrate Spark with the Hadoop ecosystem but is banking its business on a cloud service that’s all about Spark.

If the Apache Storm stream-processing project were as popular as Hadoop is, perhaps Amazon Web Services would have built something around it rather than starting with its own stream-processing technologies, Kinesis and Lambda. Microsoft, in fact, is now touting its own stream-processing engine, called Trill, which already underpins the company’s Azure Stream Analytics service, as well as streaming workloads for the backend systems powering Bing and Halo.
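The bread-and-butter query shape for engines like Storm or Trill is aggregation over time windows. As a toy illustration of the idea (plain Python; Trill itself is a .NET library with a very different API), a tumbling-window count looks like this:

```python
# Toy tumbling-window aggregation: bucket time-ordered events into fixed
# windows and count occurrences per key. Engines like Trill or Storm do
# this at very high throughput, with out-of-order handling; this sketch
# only shows the basic shape of the computation.
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    # events: iterable of (timestamp_ms, key) pairs, assumed time-ordered.
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        windows[ts // window_ms][key] += 1
    return {w * window_ms: dict(counts) for w, counts in sorted(windows.items())}

events = [(100, "click"), (250, "click"), (900, "buy"), (1100, "click")]
print(tumbling_window_counts(events, window_ms=1000))
# {0: {'click': 2, 'buy': 1}, 1000: {'click': 1}}
```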

Comparing Trill to other streaming engines.

We will discuss the business of big data in detail at our Structure Data conference, which takes place March 18 and 19 in New York. Speakers include the CEOs of Hadoop vendors Cloudera, Hortonworks and MapR, as well as executives from Google, Microsoft and Amazon Web Services. And, of course, we will have some of the world’s most-advanced users talking about the tools they use and what they would like to see from the companies selling software.

And new data services, especially among cloud providers, are also about showing off a company’s technological chops, much like their boasts about the number of data centers they own. Engineers like to work on the biggest and best systems around, and developers like to build applications on them. Much as Google has open sourced bits and pieces of its infrastructure in the form of Kubernetes and some Cloud Dataflow libraries, it won’t surprise me if Microsoft decides to open source parts of Cosmos and Trill at some point — perhaps to help drive more interest around its recently open sourced .NET development framework.

There is too much money up for grabs in the cloud computing and big data markets to leave good technology locked up inside a company’s internal towers. As Microsoft, Google and Amazon seek to grab as many cloud workloads as they can and to hire as many talented engineers as they can — in a competitive market that also includes very open-source-friendly companies such as Facebook and Netflix — expect to see a lot more openness about the stuff they’re building, as well as a lot more services based on it.

Microsoft’s big task: Replicating Windows-Office success in cloud

Microsoft turned in pretty good second-quarter results on Monday, but its stock still took a hit after hours and into Tuesday. At the time of posting, Microsoft shares were off 8.6 percent to $42.97 from Monday’s close of $47.01.

Why? After the earnings call, [company]Microsoft[/company] watchers seemed to remember that the company’s cash cows remain good old-fashioned Office and Windows, sales of which aren’t setting the world on fire. Sales in the company’s Commercial Division, which includes those products, missed expectations, logging “just” $10.68 billion for the quarter compared to the $10.94 billion that FactSet analysts had expected, according to Marketwatch.

Another data point: Revenue from OEM versions of Windows — which get pre-loaded on new PCs — fell 13 percent year over year. And Windows volume licensing revenue grew just 3 percent, as CRN pointed out.

Monday’s call was characterized as the end of the honeymoon for Microsoft CEO Satya Nadella, who took the reins in February 2014. I don’t know about that, but there does seem to be a growing realization that replicating the wild success Microsoft had selling tons of copies of Office and Windows — either through volume licenses to big companies or at retail — will be a tough task in the cloud era, where Microsoft was not first out of the gate.

On Monday’s earnings call, Nadella acknowledged that Windows suffered a tough year-to-year comparison, noting: “As expected, the one-time benefit of Windows XP end-of-life PC refresh cycle has tailed off.” He was referring to the fact that Microsoft stopped supporting the popular XP operating system in April 2014, and that publicized deadline probably pulled many PC purchases forward from this year.

Microsoft CFO Amy Hood cited that “comparability issue” as a headwind that will “show itself most directly as weakness in commercial licensing and most specifically as weakness in Office transactional licensing.” Transactional business refers to sales of fully licensed software to run on a PC or server, as opposed to the more incremental subscription Software-as-a-Service (SaaS) model.

Fading glory?

So Office and Windows sales took a hit because they did well last year, but there’s more to this concern than the lingering impact of Windows XP’s demise.

At the heart of Microsoft’s problem is that, for many companies, it is just not the brand it once was. Whereas people of my era grew up relying on Microsoft Office applications, startups and young employees are less likely to use them; Google Apps is more likely their productivity suite. So [company]Google[/company] is strong in small companies, but it has also been aggressive in courting enterprise accounts.

That means that, in this SaaS era, Microsoft faces a formidable name-brand competitor that can compete with it on price. The days of Microsoft being able to demand a premium price are over.

The company is doing lots of smart things to try to remedy the attention deficit among young companies. Its decision to go “freemium” with Power BI is smart, but then again the prospect of free software won’t endear it to Wall Street analysts, a constituency that was thrilled to see Nadella replace Steve Ballmer as CEO.

Sure, Microsoft can claim progress in cloud with Azure infrastructure as a service — but that comes from a small base of users compared to the [company]Amazon[/company] Web Services juggernaut. So when it says its cloud business hit a $5 billion run rate in the second quarter, it’s worth noting, but take it with a grain of salt. Cloud numbers from legacy vendors are fluffy at best, as they typically include a lot of legacy stuff and services thrown in. Last week, [company]IBM[/company] claimed its cloud business hit its $7 billion target, a claim that met much skepticism.

Microsoft will have to keep investing in key new technologies if it’s going to build relevance for modern companies. The goal is to make sure that Power BI or some other Microsoft product becomes as essential to a big class of users as Excel and Word did 20 to 30 years ago. And Nadella has opened up the checkbook — Microsoft has almost $89 billion in cash, after all. Last week, Microsoft acquired Revolution Analytics, the company that backs the R language used by many data scientists.

Big data and machine learning will be key areas — which you can hear more about from Joseph Sirosh, the corporate VP for machine learning, who will speak at Structure Data in New York City in March.

MSFT Price Chart

MSFT Price data by YCharts

Microsoft cites key cloud and mobile segments for good Q2

Microsoft is claiming strong cloud-and-mobile growth in its second-quarter earnings release. The company, which is playing catch-up in cloud, now claims a $5.5 billion run rate there, up from the $4.4 billion it claimed last quarter.

On Monday the company reported diluted earnings per share (EPS) of $0.71 on revenue of $26.5 billion for the period ending December 31, 2014, meeting consensus estimates of $0.71 EPS on revenue of $26.33 billion.

In its release, the company said revenue for its critical cloud segment — which includes Azure as well as Software-as-a-Service products like Office 365 — grew 114 percent year over year. Revenue in the devices and consumer segment grew 8 percent to $12.9 billion.

But [company]Microsoft[/company] still derives a good chunk of its profits from the still-huge-but-not-fast-growing PC market, and there its age is showing. Revenue for Office commercial products and related services fell 1 percent, dinged by the transition to the SaaS-based Office 365 and declining PC sales.

Microsoft is claiming big-time cloud momentum, much as IBM and SAP did last week. But it’s hard to suss out these numbers, since the vendors all glom a lot of different things — including pieces of legacy software and services — into the category.

And Microsoft’s issue is that it’s moving to the cloud from its traditional position of strength in more sluggish on-premises software. It’s a tricky path — as it sells more SaaS software, it sells less of the pricier on-premises stuff. But then again, if it didn’t cannibalize its own business, someone else — [company]Google[/company]? [company]Salesforce.com[/company]? — would eat it anyway.


This story will be updated throughout today’s earnings call with Microsoft CEO Satya Nadella.


[dataset id=”909692″]


[dataset id=”909693″]

Robots embrace Ubuntu as it invades the internet of things

Canonical has revealed what I reckon is its biggest announcement in years: Ubuntu is about to invade the internet of things with a minimal version of the Linux distribution that it hopes will provide a standardized platform for connected devices from drones to home hubs.

“Snappy” Ubuntu Core came out of [company]Canonical[/company]’s mobile efforts (which have yet to go anywhere) and was made available on [company]Amazon[/company] Web Services, [company]Microsoft[/company] Azure and the [company]Google[/company] Cloud Platform at the end of 2014. Now it’s available for smart devices, and Canonical already has players such as the Open Source Robotics Foundation (OSRF), drone outfit Erle Robotics and connected-hub maker NinjaBlocks on board.

From mobile to IoT, via the cloud

Unlike traditional, package-based Ubuntu for servers and desktops, the extensible Core keeps apps and each part of the OS securely isolated from one another, and it allows for “transactional updates”: updates only need to include the difference between the old and new version, allowing for easy upgrading and rolling back if needed. In the cloud, Canonical is pushing Ubuntu Core as ideal for Docker and other containerized apps.
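One simple way to get transactional, rollback-friendly updates is to keep each version intact side by side and switch a single “current” pointer atomically. The Python sketch below shows that idea; it is illustrative only and assumes nothing about Snappy’s real on-disk mechanism.

```python
# A sketch of transactional updates: install each version into its own
# directory and flip a "current" symlink atomically, so an upgrade never
# half-applies and rollback is just re-pointing the link. Illustrative
# only; Snappy's actual on-disk mechanism differs.
import os

def install_version(base, version, files):
    vdir = os.path.join(base, version)
    os.makedirs(vdir, exist_ok=True)
    for name, data in files.items():
        with open(os.path.join(vdir, name), "w") as f:
            f.write(data)

def activate(base, version):
    # Build the new symlink under a temporary name, then rename it over
    # "current". os.replace is atomic on POSIX, so readers always see
    # either the old version or the new one, never a half-written state.
    tmp = os.path.join(base, ".current.tmp")
    os.symlink(version, tmp)
    os.replace(tmp, os.path.join(base, "current"))

base = "/tmp/demo-app"
install_version(base, "v1", {"app.conf": "old settings"})
install_version(base, "v2", {"app.conf": "new settings"})
activate(base, "v2")  # upgrade
activate(base, "v1")  # rollback: the same cheap, atomic operation
```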

Mark Shuttleworth

However, Core’s suitability for the container trend was more or less an accidental bonus while the technology was quietly making its way from Ubuntu Touch to the internet of things, Canonical founder Mark Shuttleworth told me in an interview. According to Shuttleworth, Core’s development began as Canonical grappled with carriers’ annoyance at existing mobile firmware update mechanisms, and as cheap development systems such as Raspberry Pi and Arduino started to take off.

[pullquote person=”Mark Shuttleworth” attribution=”Mark Shuttleworth, Canonical founder” id=”907873″]Let us deliver those updates to your device with the same efficiency as with a phone[/pullquote]”Two years ago we started seeing a lot of what I’d call alpha developers starting to tinker with what at the time people called embedded development,” Shuttleworth said. “We realized there was a very interesting commonality between the work we were doing for mobile — specifically this update mechanism work — and the things you’d want if you were to build a product around one of these boards.”

Canonical had “invested in the container capabilities of the Linux kernel as it happened for the mobile story,” Shuttleworth said, as it was needed to fix security issues on the phone, such as isolating untrusted apps from the address book. “Docker is based on those primitives that we built,” he noted.

Developer push

For makers of connected devices, the same technology means being able to concentrate on the connected app and keeping the device more secure. “[Currently] if you’re going to get an update for that firmware, what you’re getting is a whole blob of kernel and OS and app, and the net effect is you rarely get them, so a lot of devices are vulnerable,” Shuttleworth said. “With Core, you can let us worry about Heartbleed and so on, and let us deliver those updates to your device with the same efficiency as with a phone.”

What’s more, Core for smart devices comes with an app store (which can be white-labeled for brands) that provides developers with a distribution mechanism, and it also opens up the possibility of running different apps from different vendors on connected devices.

Shuttleworth gave the example of a smart lawnmower that could take an add-on spectral camera from a different manufacturer and run that manufacturer’s app:

It’s going from a single-device stodgy world to more cross-pollination between devices from different vendors. Because you have a store, you can see more innovation where people concentrate on the software – they don’t have to build a whole device. Because it’s a common platform, they can deliver that app to many devices.

One of the key benefits of Core is its flexibility. The base Ubuntu Core code is identical across the cloud, connected devices and even the desktop – it supports both ARM and x86. This means device makers can prototype their “Snappy” apps on a PC before running thousands of simulations in the cloud, and it also means old PCs can be easily repurposed as a home storage server or photo booth or what have you.

Early adopters

The OSRF is going to use Ubuntu Core for its new app store, so developers can push updates to their open robots. Erle Robotics is using Core to power its new Erle-Copter open educational drone (pictured above), which will ship in February.

NinjaBlocks’ Ninja Sphere smart home controller

NinjaBlocks is using Core and its app store as the basis for its new Ninja Sphere smart home controller (pictured right).

Shuttleworth said he was intrigued by the possibilities of hubs: “They may be routers or set-top boxes [but] you really want to think of them as extensible. Why can’t a NAS also have facial recognition capabilities; why can’t your Wi-Fi base station also run a more sophisticated firewall?”

The current Raspberry Pi won’t run Ubuntu Core as it uses the older ARMv6 architecture – Core requires ARMv7, though the ODroid-C1 provides a cheap ($35) option in that department. “We decided we wouldn’t go to lower specifications because our Core story is the next generation of devices,” Shuttleworth said.

Speaking of hardware, the Ubuntu founder also hinted that there might be further announcements in connection with the big silicon vendors, with which Canonical already has extensive relationships – “At the silicon level we’re a unifying factor” — though he didn’t want to go into detail just yet. The likes of Intel and Samsung and Qualcomm are all trying to develop their own (infuriatingly disparate) standards for the internet of things, and it would be interesting to see how Canonical can insert itself into this chaotic land-grab, if indeed it can.

Ubuntu’s future

For those wishing to repurpose old PCs, the private cloud storage outfit OwnCloud (already available in the Core app store) provides an interesting test case for the difference between Ubuntu Core and the full-fat Ubuntu. As Shuttleworth tells it, OwnCloud “got bitten” by the traditional package management system on Ubuntu, because that involves different packages for different versions of the OS.

“It came to the question of who’s responsible for an out-of-date, insecure version of OwnCloud,” he said. “We can’t usually give [developers] access rights to the archive to push updates – if something malicious is in there… it can go anywhere. [Now] we can say: ‘OK, there’s just one place you push the latest version of OwnCloud and it goes directly to every device with Snappy.’ If they were to do something malicious, we’d confine that to just the data you’ve already given to OwnCloud.”

So, is Core Ubuntu’s WinCE or the future of the venerable Linux distro? Shuttleworth was adamant that the Debian-package version of Ubuntu “will never go away because it’s the mechanism with which we collaborate amongst ourselves and with Debian” and would be of continued relevance for developers:

The question comes when you look to shipping the software to a device or user – folks are increasingly comfortable with the idea that a more bundled, precise and predictable delivery mechanism is attractive for that. I think there will be millions of people using Snappy, but I don’t think the package-based version will go away. It’s so useful for developers and in many cases for production, but in cases where you have a particular property of very high cost to go fix something if it breaks, the Snappy system is very attractive.

For any given application, it’s clear which would be better.