Why unikernels might kill containers in five years

Sinclair Schuller is the CEO and cofounder of Apprenda, a leader in enterprise Platform as a Service.
Container technologies have received explosive attention in the past year – and rightfully so. Projects like Docker and CoreOS have done a fantastic job at popularizing operating system features that have existed for years by making those features more accessible.
Containers make it easy to package and distribute applications, which has become especially important in cloud-based infrastructure models. Being slimmer than their virtual machine predecessors, containers also offer faster start times and maintain reasonable isolation, ensuring that one application shares infrastructure with another application safely. Containers are also optimized for running many applications on single operating system instances in a safe and compatible way.
So what’s the problem?
Traditional operating systems are monolithic and bulky, even when slimmed down. If you look at the size of a container instance – hundreds of megabytes, if not gigabytes – it becomes obvious there is much more in the instance than just the application being hosted. Having a copy of the OS means that all of that OS’s services and subsystems, whether they are necessary or not, come along for the ride. This bulk conflicts with trends in the broader cloud market, namely the move toward microservices, the need for improved security, and the requirement that everything operate as fast as possible.
Containers’ dependence on traditional OSes could be their demise, leading to the rise of unikernels. Rather than needing an OS to host an application, the unikernel approach allows developers to select just the OS services from a set of libraries that their application needs in order to function. Those libraries are then compiled directly into the application, and the result is the unikernel itself.
The unikernel model removes the need for an OS altogether, allowing the application to run directly on a hypervisor or server hardware. It’s a model where there is no software stack at all. Just the app.
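To make the library-selection idea concrete, here is a sketch of what a MirageOS-style configuration file might look like. The module and function names below follow the Mirage 2.x API of the period and are illustrative, not a definitive recipe:

```ocaml
(* config.ml: declares exactly which OS services this unikernel needs.
   Only the libraries named here get compiled into the final image. *)
open Mirage

(* The application entry point depends on a console device and nothing else:
   no filesystem, no network stack, no process table. *)
let main = foreign "Unikernel.Main" (console @-> job)

let () =
  register "hello" [ main $ default_console ]
```

Running the `mirage` tool against a file like this would then produce a standalone image bootable directly on a hypervisor such as Xen, with no general-purpose OS underneath.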
There are a number of extremely important advantages for unikernels:

  1. Size – Unlike virtual machines or containers, a unikernel carries only what it needs to run its single application. While containers are smaller than VMs, they’re still sizeable, especially if no care is taken to slim the underlying OS image. An application that had an 800MB image could easily come in under 50MB, which makes moving application payloads across networks very practical. In an era where clouds charge for data ingress and egress, this saves not only time but real money.
  2. Speed – Unikernels boot fast. Recent implementations have unikernel instances booting in under 20 milliseconds, meaning a unikernel instance can be started inline to a network request and serve the request immediately. MirageOS, a project led by Anil Madhavapeddy, is working on a new tool named Jitsu that allows clouds to quickly spin unikernels up and down.
  3. Security – A big factor in system security is reducing surface area and complexity, ensuring there aren’t too many ways to attack and compromise the system. Given that unikernels compile only what is necessary into the application, the surface area is very small. Additionally, unikernels tend to be “immutable”: once built, the only way to change one is to rebuild it. No patches or untrackable changes.
  4. Compatibility – Although most unikernel designs have focused on new applications, or on code written for specific stacks capable of compiling to this model, technologies such as Rump Kernels offer the ability to run existing applications as unikernels. Rump kernels work by componentizing various subsystems and drivers of an OS and allowing them to be compiled into the app itself.
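The size point above is visible even within the container world, where most image bulk is the base OS rather than the application. As a rough sketch (the base images are standard Docker Hub names; `myapp` stands in for a hypothetical statically linked binary), compare two Dockerfiles:

```dockerfile
# Dockerfile A: a full Ubuntu userland comes along for the ride,
# adding hundreds of megabytes the application never uses.
FROM ubuntu:14.04
COPY myapp /app/myapp
CMD ["/app/myapp"]
```

```dockerfile
# Dockerfile B: an empty base image; the resulting image is
# essentially just the binary itself, often a few megabytes.
FROM scratch
COPY myapp /myapp
CMD ["/myapp"]
```

A unikernel pushes this logic to its conclusion: not merely a slim userland, but no general-purpose userland at all.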

These four qualities align nicely with the development trend toward microservices, making discrete, portable application instances with breakneck performance a reality. Technologies like Docker and CoreOS have done fantastic work to modernize how we consume infrastructure so microservices can become a reality. However, these services will need to change and evolve to survive the rise of unikernels.
The power and simplicity of unikernels will have a profound impact during the next five years, which at a minimum will complement what we currently call a container, and at a maximum, replace containers altogether. I hope the container industry is ready.

Survey: Experience with Cloud and Containers

Containers are a hot topic in the developer/software engineering community. Gigaom Research is conducting this survey to gauge just how widespread the buzz is and if containers are in fact delivering real business value.

Time’s up! Changing core IT principles

There is a theme gaining ground within IT organizations, and a number of examples support it. This theme will change the way solutions are built, configured, sold and used. Even the ecosystems and ancillary services will change. It also changes how we think about, organize, lead and manage IT organizations. The theme is:

Just because you (IT) can do something does not mean you should.

Ironically, there are plenty of examples in the history of IT where the converse of this principle served IT well. Well, times have changed and so must the principles that govern the IT organization.

Apply it to the customization of applications and you get this:

Just because IT can customize applications to the nth degree does not mean they necessarily should.

The configuration and customization of applications is a great example. Just because IT could customize the heck out of an application, should they have? The argument often made is that customization provides some value, somewhere, either real or (more often) perceived. The reality is that it comes at a cost, and sometimes a very significant and real one.

Making it real

Here is a real example that has played out time and time again. Take application XYZ. It is customized to the nth degree for ACME Company. Preferences are set, not necessarily because they should be, but rather because they could. Fast-forward a year or two. Now it is time to upgrade XYZ. The costs are significantly higher due to the customizations done. It requires more planning, more testing, more work all around. Were those costs justified by the benefit of the customizations? Typically not.

Now it is time to evaluate alternatives for XYZ. ACME builds a requirements document based on XYZ (including the myriad customizations). Once the alternatives are matched against the requirements, the only solution that really fits the need is the incumbent. This approach gives significant weight to the incumbent solution, thereby limiting alternatives.

These examples are not fictitious scenarios. They are very real and have played out in just about every organization I have come across. The lesson here is not that customizations should be avoided. The lesson is to limit customizations to only those that are necessary and provide significant value.

And the lesson goes beyond configurations to understanding IT’s true value, based on what it should and should not do.

Leveraging alternative approaches

Much is written about the value of new methodologies and technologies. Understanding IT’s true core value opportunity is paramount. The value proposition starts with understanding how the business operates. How does it make money? How does it spend money? Where are the opportunities for IT to contribute to these activities?

Every good strategy starts with a firm understanding of the ecosystem of the business: how the company operates and its interactions. A good target, with which many are finding success, sits furthest from core company operations, where it is hardest to explain true business value in business terms. For many, it starts with the data center and moves up the infrastructure stack. For a bit more detail: CIOs are getting out of the data center business.

Preparing for the future today

Is your IT organization ready for today? How prepared are your organization, processes and systems to handle real-time analytics? As companies consider how to engage customers from a mobile platform in real time, the shift from batch-mode to real-time data analytics quickly takes shape. Yet many core systems and infrastructure are nowhere near ready to take on the changing requirements.

Beyond data, are the systems ready to respond to the changing business climate? What is IT’s holistic cloud strategy? Is a DevOps methodology engaged? What about container-based architectures?

These are only a few of the core changes in play today…not in the future. If organizations are to keep up, they need to start making the evolutionary turn now.

You can now store Docker container images in Google Cloud

Google Cloud users can now load their private Docker container images into the search giant’s new Google Container Registry, which Google said Friday is now available in beta. The company noted that the service “is not covered by any SLA or deprecation policy and may be subject to backward-incompatible changes.”

If you are a Google Cloud customer, your Docker container images — which contain all the necessary components for spinning up containers, like the source code and binary files — will be “automatically encrypted before they are written to disk,” according to the Google blog post detailing the registry.

From the blog post:
“Access control: The registry service hosts your private images in Google Cloud Storage under your Google Cloud Platform project. This ensures by default that your private images can only be accessed by members of your project, enabling them to securely push and pull images through the Google Cloud SDK command line. Container host VMs can then access secured images without additional effort.”

Google said that with the container images loaded into the Google cloud and cached in its data centers, users should be able to deploy them to Google Container Engine clusters as well as “Google Compute Engine container-optimized VMs.”

As for pricing, Google said that while the service is in beta, users “will be charged only for the Google Cloud Storage storage and network egress consumed by your Docker images.”

This seems like part of Google’s strategy to hype up its Google Container Engine, which is the managed-service version of the open-source Kubernetes container-management system. Instead of storing your private containers in the Docker Hub or CoreOS’s Enterprise Registry, Google wants users to trust it with holding on to the valuables.

For now, the Google Container Engine only allows users to craft managed clusters within its system and “doesn’t have the ability to span across multiple cloud providers,” said Greg DeMichillie, Google’s director of product management for its cloud platform, during the announcement of the container engine last November.

Google is Adding a Private Registry to its Docker Arsenal

Google, continuing its investment in containers and cluster management, is swiftly building a private Docker registry offering for its customers. Given the importance of security and compliance, enterprises have been reluctant to use publicly accessible Docker repositories. Private registries enable secure and rapid storage and retrieval of Docker images. We will be testing this out in the coming weeks.

Google was one of the first public cloud providers to offer container hosting and cluster management capabilities. It started with Container Optimized VMs, followed by Managed VMs, Kubernetes and finally Google Container Engine (GKE). Despite these improvements, customers still had to store Docker images on the public Docker Hub or create a private registry in one of their VMs.

This process will be eliminated when Google unveils Google Container Registry, hosted on Google Cloud Platform. DevOps teams will be able to pull and push images from the registry on the same infrastructure. Google Container Registry is integrated with Google Accounts. It exposes an HTTP endpoint at gcr.io that is accessible from within its cloud platform or from on-premises infrastructure.

Container images are stored in a Google Cloud Storage bucket. When an image is pushed for the first time, a dedicated bucket is created within the same Google account to store it. Owners and admins of the project can pull and push images, while users with project viewer permission can only pull them. The command-line utility of Google Cloud Platform, gcloud, is updated to support pull and push operations. Images stored in Google Container Registry can be used from Container Optimized VMs, Managed VMs, Kubernetes, and Google Container Engine Pods.
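As a sketch of that push/pull workflow (the command forms follow the `gcloud docker` convention from Google’s documentation of the period; the project and image names are placeholders, and running these requires an authenticated Google Cloud SDK):

```shell
# Tag a locally built image into the gcr.io namespace for your project
docker tag myapp gcr.io/my-project/myapp

# Push it; gcloud wraps the docker client with Google Account credentials
gcloud docker push gcr.io/my-project/myapp

# From any VM in the project (or an authenticated workstation), pull it back
gcloud docker pull gcr.io/my-project/myapp
```

Because authentication rides on the project’s Google Accounts, no separate registry credentials need to be distributed to container host VMs.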

Google Container Registry

Google Container Registry - Source: Gigaom

Source: Gigaom Research

Other vendors serious about Docker and containers are also investing in private registries. CoreOS acquired Quay.io, a hosted private Docker repository company, and Tutum, a Docker hosting platform, also offers a private registry. Docker, Inc. acquired Koality to augment its enterprise hub offering. Koality’s specialty was continuous integration and deployment of containerized applications. By integrating CI/CD with its native registry, Docker, Inc. can attract enterprise customers.

Docker Hub Enterprise (DHE) was announced at DockerCon Europe 2014. DHE delivers workflow capabilities for developers and sysadmins managing a dynamic lifecycle behind the enterprise firewall. DHE is a drop-in solution that allows enterprise developers to focus on creating multi-container distributed applications behind-the-firewall. DHE’s first release comes with an installer, GUI configuration, resumable push/pull of images, flexible storage capability with support for local filesystem, in-memory and Amazon S3.

AWS, IBM, and Microsoft host DHE on their respective public cloud offerings. IBM pledged its support to integrate DHE with SoftLayer and Bluemix, while Microsoft will host DHE natively on Azure. AWS will offer DHE as an appliance through its Test Drive program, and it may eventually be listed in AWS Marketplace. While this seems like just another partnership announcement, there is more to it: Google is conspicuously missing from the list. Google had clear plans to build a complete container platform with a private registry as the cornerstone of its strategy, which led it to opt out of the DHE partnership.

The Gigaom Research Perspective

It is clear that Google has a dual strategy when it comes to containers.

1) Embrace Docker – Google has been running containers for a long time. Instead of exposing its internal toolchain for managing the lifecycle of containers, it decided to support Docker, which has a vibrant community and ecosystem of developers. It then open sourced Kubernetes, a cluster management and orchestration tool that enjoyed huge popularity among Docker users. Meanwhile, Google started adding native Docker support to App Engine and Compute Engine, making it easy for developers to launch and manage containers on its public cloud. Google wants its cloud platform to be the best public cloud to run Docker containers.

2) Monetize Container Building Blocks – Docker is the most successful open source project after Linux. There are over a hundred startups building tools and components around Docker, but it is still not clear how these startups will eventually make money. Docker, Inc. is busy assembling all the key building blocks to make its platform complete for enterprise customers. With early investments made in containers, Google doesn’t want to miss the opportunity of commercializing its intellectual property. While Kubernetes is open source and available on a variety of cloud platforms, Google Container Engine abstracts it further, delivering a simplified experience of deploying and managing clusters. When developers use GKE, they indirectly consume compute, storage, and database services. The container registry is an important step toward supporting technologies such as Rocket and LXD on its platform. It will certainly impact Docker, Inc. and its ecosystem.

Changing the CIO conversation from technology to business

For many years, traditional IT thinking has served the IT function well. Companies have prospered from both the technological advances and consequent business improvements. Historically, the conversation typically centered on some form of technology. It could have been about infrastructure (data centers, servers, storage, network) or applications (language, platform, architectures) or both.

Today, we are seeing a marked shift in the conversations happening with the CIO. Instead of the latest bells and whistles, they increasingly involve business enablement and growth. The changes did not happen overnight. For any IT leader, it takes time to evolve the conversation. Not only does the IT leader need to evolve, but so do their team and fellow business leaders. Almost two years ago, I wrote about the evolution of these relationships in Transforming IT Requires a Three-Legged Race.

Starting the journey

For the vast majority of IT leaders, the process is not an end state but a journey of evolution that has yet to start in earnest. For many I have spoken with, there is interest, but not a clear path to take.

This is where an outside perspective is helpful. It may come from mentors, advisors or peers, but it needs to come from someone who is trusted and objective. This is key, as the change itself will touch the ethos of the IT leader.

The assessment

Taking a holistic assessment of the situation is critical here. It requires a solid review of the IT leadership, organizational ability, process state and technological situational analysis. The context for the assessment is back to the core business strategy and objectives.

Specific areas to change are those that are clearly not strategic or differentiating in support of the company’s strategy and objectives. A significant challenge for IT organizations will be: Just because you can manage it does not mean you should manage it.

Quite often, IT organizations get too far into the weeds and lose sight of the bigger picture. To fellow business leaders, this is often perceived as a disconnect between IT and Line of Business (LoB) leaders. It alienates IT leaders and makes it harder to foster stronger bonds between the two groups.

Never lose sight of the business

It is no longer adequate for the CIO to be the only IT leader familiar with the company’s strategy and objectives. Any IT leader today needs to fully understand the ecosystem of how the company makes and spends money. Without this clarity, the leader lacks the context in which to make healthy, business-centric decisions.

The converse is an IT leader who is well familiar with the business perspective outlined above. This IT leader will gain greater respect among their business colleagues. They will also have the context in which to understand which decisions are most important.

Kicking technology to the curb

So, is IT really getting out of the technology business? No! Rather, think of it as an opportunity to focus on what is important and what is not. What is strategic for the company and what is not? Is moving to a cloud-centric model the most important thing right now? What about shifting to a container-based application architecture? Maybe, maybe not. There is plenty of ripe, low-hanging fruit to be picked. And just as with fruit, the degree of ripeness changes over time. You do not want to pick spoiled fruit, nor do you want to pick it too soon.

One area of great interest these days is in the data center. I wrote about this in detail with CIOs are getting out of the Data Center business. It is not the only area, but it is one of many areas to start evaluating.

The connection between technology divestiture and business

By assessing which areas are not strategic and divesting those areas, IT gains greater focus and the ability to apply resources to more strategic functions. Imagine if those resources were redeployed to provide greater value to the company strategy and business objectives. Divesting non-strategic areas frees IT up to move into other areas and conversations.

By changing the model and using the business as the context, IT changes the tone, tenor and impact it can have for a company. The changes will not happen overnight. The evolution from technology to business discussions takes vision, perseverance, and a strong internal drive toward change.

The upside is a change in culture that is both invigorating and liberating. It is also a model that supports the dynamic changes required for today’s leading organizations.

What’s big in venture capital: Security, security, security

Steve Herrod has seen a lot in the enterprise IT space, having spent 12 years at VMware — the last several as CTO and vice president of R&D — before leaving in 2013 to join venture capital firm General Catalyst Partners. And right now, after seemingly dozens of high-profile cyberattacks in as many months, Herrod has security on his mind. He came on the Structure Show podcast this week to talk about how he’s thinking about that space.

Here are a few choice quotes from the interview (including about Docker and the pace of tech innovation), but it’s definitely worth hearing the whole thing. Herrod offers up a lot more thoughts on cybersecurity, as well as cloud computing, containers and the public appetite for tech IPOs.

[soundcloud url=”https://api.soundcloud.com/tracks/184974081?secret_token=s-DR28d” params=”color=ff5500&auto_play=false&hide_related=false&show_comments=true&show_user=true&show_reposts=false” width=”100%” height=”166″ iframe=”true” /]

Download This Episode

Subscribe in iTunes

The Structure Show RSS Feed

Security as corporate strategy

“For me, it’s the first time that sort of infrastructure issues are coming up at board-of-director meeting for completely non-technology companies,” Herrod said. “Nobody wants to be in the news whatsoever, and the cost of these break-ins is obviously ridiculously high.”

There is no enterprise force field

“I think you should just assume bad things are going to happen, so let’s think about how to mitigate or restrict how bad they can be,” Herrod explained.

“Forget perimeter,” he added. “Let’s think about how do we wrap data, how do we write applications, how do we use identity as the very core, regardless of where we’re accessing things?”


Steve Herrod at Structure 2011.

It’s time to give the cloud its due on security

“If you meet the ops teams and the groups that are there building these clouds, I think they’re far more secure than what’s going on in the private data centers because they have much more-intensive staffs,” Herrod said. “These guys have gone through every audit — they’re the superset of every audit that their customers have to go through, and they see the most-sophisticated attacks and thus have to do a lot of work behind it.”

2014 was the year of the container; 2015 will be a year of awakening

“Last year was the year of Docker awareness. I think it was the most-publicized open source thing since OpenStack,” Herrod said. “… But I think this is the year you see the hype die down and kind of the realities of what it means to use these containers [and] the fighting that will go on between a bunch of different vendors offering the best approach to containers.”

Keeping up with new tools is a full-time job

“I actually see all the time the developer back channels on what is the most-productive toolset or what is the coolest way to build this new type of startup company,” Herrod said. “That travels super-quickly through conferences, through articles, through social networks, and thus I think you get this herd mentality moving to the next new thing very quickly.”

Cheap cloud + open source = a great time for startups


While the rest of the world binges on IoT goodies from CES 2015, we thought we’d focus on (what else?) enterprise-grade infrastructure. This week’s guest, Steve Herrod, was formerly CTO of VMware, and so knows a little something-something about that topic. Now he’s a managing director at General Catalyst, where he’s looking for the next VMwares of the world.

Listen to his take on the wild world of cybersecurity, where it’s pretty clear that the battle has to move beyond fighting yesterday’s threats; why the Hortonworks IPO is so closely watched; and how today’s startups have an embarrassment of riches when it comes to available, inexpensive infrastructure. With Amazon Web Services, Google Cloud Platform, Microsoft Azure and other players offering credits to woo startups, a young company can get up to $100K ($120K in IBM’s case) of plumbing for free. Let me reiterate: Free.

That’s a huge opportunity, although skeptics may point to parallels between cut-price cloud and crack cocaine. But I digress.

In our segment, Derrick Harris talks about how artificial intelligence is finding its way into the chip modules that will control our connected cars and other things, and we hash out the sticky matter of cloud closures and other topics.

Have a listen to the first Structure show of 2015 and let us be the gazillionth to say Happy New Year!


Structure 2012: Steve Herrod – CTO and SVP of R&D, VMware




Hosts: Barb Darrow and Derrick Harris.





The year in tech: Net neutrality, IoT grows up, Uber turns heads

As 2014 draws to a close, the tech world seems a little weary. It was a draining year if you were plugged into social media, with conflicts at home and overseas juxtaposed against the soaring wealth of the San Francisco Bay Area, home to an industry that has become one of the dominant forces in the world. As we inch closer to what will likely be the top of the Third Tech Boom-Bust Cycle since the web changed the world, technology has never been more present in our day-to-day lives, for better or worse.

But for all the conflict that marked the year in tech — a blatant power grab by the company that was actually voted “Worst Company in America,” the uneasiness that FCC Chairman Tom Wheeler might finally reward his old buddies in the cable industry with favorable internet regulation, and a series of public-relations disasters by Uber that left a black mark on the next dominant tech company — there were plenty of bright spots, especially among the areas that Gigaom follows closely.

Big Data has turned into big money and the rise of deep learning and artificial intelligence could transform computing. The cloud is the norm, and the largest companies in tech are going all-in on cloud computing as new startups promise to make complex app development even simpler. The internet of things, a concept we have evangelized for years, went from buzzword-just-around-the-corner to the cornerstone of planning from tech companies big and small heading into 2015.

The king of the hill — Apple — unveiled what could be its next-generation product category breakthrough amid the growing popularity of wearable computers. Microsoft showed that it is at last ready to enter the mobile computing era with the refreshing emergence of Satya Nadella as its third-ever CEO. And Tesla proved that the electric car is alive and well, and just getting started as the vehicle of the 21st century.

I asked our writers to pick the most important, most notable, and most influential developments on their beats in 2014, and here’s what they came up with. We’re looking forward to the holiday break as much as the rest of you are, because 2015 promises to be a landmark year for the tech industry.

Thanks for reading Gigaom, and Happy Holidays.
— Tom Krazit, Executive Editor


Net neutrality twists and turns

Demonstrators protest outside the FCC as the commission is about to meet to receive public comment on proposed open Internet notice of proposed rulemaking and spectrum auctions May 15, 2014 at the FCC headquarters in Washington, DC.


Two events this summer suggested that things were about to get much, much worse for American internet users: the FCC signaled strongly that it would give the go-ahead to ISPs to create “fast lanes” for favored websites, and the business press predicted that telecom giant Comcast faced smooth sailing in its quest to swallow its largest rival, Time Warner Cable.

Then something changed. A popular backlash, egged on by the likes of comedian John Oliver and fanned by four million internet comments, caused the political winds to shift. All of a sudden, President Barack Obama called to implement real “Title II” net neutrality, and the FCC abruptly cooled on both the fast lanes and the Comcast merger. Going into 2015, consumers face an unexpectedly positive outlook for faster internet and real broadband competition.
— Jeff John Roberts, Senior Writer


Surveillance fight ramps up

When it comes to online surveillance, the most significant developments of the year were probably the striking-down by Europe’s top court of the E.U. Data Retention Directive, and the aftermath of that decision. The court said the directive, which forced telecom providers to store metadata about their users’ communications for surveillance purposes, was incompatible with the rights to privacy and the protection of personal data. The U.K. responded with a barely debated “emergency law” (months after the court’s decision) that not only made it possible for the British surveillance regime to continue, but expanded it to take in communication over social networks and more. Sweden, too, doubled down on data retention, leading one ISP there to offer free VPN access to its customers. Australia is now also introducing data retention. Meanwhile, the United Nations has begun condemning the practice on human rights grounds.
— David Meyer, Senior Writer


The resurgence of T-Mobile

The Paramount Theater in Seattle played host to T-Mobile's Uncarrier 5.0 event in June.


If there were a telecom horoscope, 2014 would be the year of T-Mobile. The carrier was the object of other carriers’ desires — with Sprint and Softbank as well as French ISP Iliad angling to buy the company — and it became a symbol for competition in the U.S., with regulators making it clear that they don’t want to see fewer than four nationwide operators.

T-Mobile, however, didn’t just sit idly by while the industry fought over its future. It became a competitive force in its own right, as well as a thorn in the sides of AT&T and Verizon. It launched several unique initiatives, such as an iPhone loaner program. T-Mo even killed — or at least maimed — one of the mobile industry’s sacred cows, announcing that in January 2015 it will start allowing customers to hold onto their unused data each month.

Those moves helped T-Mobile bring in 6.2 million new connections in the first nine months of the year, giving it a total of 52.9 million subscribers and putting it within spitting distance of overtaking Sprint. But most significantly, T-Mobile is changing the mobile industry as a whole. Two years ago, a postpaid smartphone plan was synonymous with a two-year contract. But thanks to T-Mobile, all four major carriers are retreating from contracts and subsidies, and charging customers lower rates as a result.
— Kevin Fitchard, Senior Writer

Apple’s wild year

Apple CEO Tim Cook announces the Apple Watch during an Apple special event.

[company]Apple[/company] is still the world’s most valuable company by market cap, and although it doesn’t sell the most smartphones, it makes the most money. But while Apple is a formidable force, cracks are starting to show. iPad sales are actually decreasing, Apple’s cloud services are still a mess — as evidenced by the embarrassing iCloud hacks — and Android devices just keep getting cheaper and better. But the biggest Apple story this year is actually next year’s story: In September, Apple revealed its vision of wearable computing in the form of the Apple Watch. We still don’t know exactly what it does or how it does it, but one thing’s for sure: it’s going to be a big story in 2015.
— Kif Leswing, Staff Writer

Wearable devices may go mainstream


After hearing for some time how big the wearable device market will eventually be, 2014 gave us reasons to finally believe the future forecasts: The first signs of serious mainstream adoption emerged this year.

Google launched its Android Wear software platform in June and now there are a half-dozen watches that run on it, with more to come. A peek at the Google Play Store shows that the required Android Wear app for these watches has between 500,000 and a million downloads. Pebble had sold 450,000 watches by mid-year and continued to improve its product with Android Wear notifications.

Health tracking devices fitting most budgets have also appeared en masse in retail stores. You can spend $450 on the Withings Activité, which looks like a traditional timepiece, pay $12 for a Pivotal Tracker, or choose from other options between those two prices. Even Apple is getting in on the new market, announcing its Apple Watch in September with a starting price of $350 when it arrives in early 2015.
— Kevin Tofel, Senior Writer

Nadella puts his stamp on Microsoft

Steve Ballmer’s slow-motion departure as Microsoft CEO was announced in August 2013, but the angst-filled search for his successor lasted into 2014. Yup, it took Microsoft’s board six months to find the next CEO right on its own Redmond, Washington campus. Satya Nadella was named the company’s third-ever CEO in February 2014.

Nadella didn’t take long to make his presence felt. In March, he hosted the public debut of Microsoft’s Office for iPad, stressing the company’s plan to support its applications on all devices, even those that don’t run Windows, in effect decoupling Office from Windows.

Nadella also pushed the Azure cloud hard. In a nod to changing realities, the company even deleted “Windows” from its cloud branding so Windows Azure became Microsoft Azure. If anyone doubted that Microsoft Azure would compete head-on with Amazon Web Services in public cloud, they should be sure of it after Nadella’s first year at the helm.
— Barb Darrow, Senior Writer

The deep learning explosion

Google acquired the Jetpac team in August.

The groundwork was laid in 2013, but 2014 was the year of deep learning. There were acquisitions (including DeepMind, MadBits and Jetpac), startups (Skymind, Ersatz Labs, Enlitic, Butterfly Network and MetaMind), and plenty of debate over whether deep learning is the next big thing or just a lot of buzz. Both are probably somewhat true, but what’s undeniable is the pace of change in the field now that some of the world’s largest companies are funding it — last year’s advances were quickly overshadowed, and breakthroughs came from all over the place, sometimes simultaneously.
— Derrick Harris, Senior Writer

The year of the container

Ben Golub, CEO of Docker

If you’re an avid follower of cloud computing, you’ve probably heard about the container and how it can help developers craft applications more easily as well as simplify IT operations. San Francisco–based Docker has been at the forefront of popularizing the container, which is a type of virtualization technology that lets a single Linux operating system kernel run multiple applications without them impacting one another. What made Docker so popular (it landed $40 million in the fall and is supposedly valued at $400 million) is that it makes it simpler to move these virtual shells — each containing parts of an application — across multiple environments like different clouds or even bare-metal servers. While Docker got a lot of attention this year from big tech companies — Google, VMware and IBM are all supporters — one of its partners, CoreOS, recently decided to launch its own take on container technology, dubbed Rocket. Now Docker might have some major competition as CoreOS’s spin on container tech is generating buzz among cloud watchers.
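The packaging described above can be sketched with a short Dockerfile (the file, image and application names here are hypothetical illustrations, not details from the story): it declares an application’s dependencies once, and the resulting image runs unchanged on any host with a compatible container engine.

```dockerfile
# Hypothetical example: a base image supplies the app's userland libraries
FROM python:2.7
# The application and its dependencies are baked into the image
COPY app.py /app/app.py
RUN pip install flask
# The single process the container runs when started
CMD ["python", "/app/app.py"]
```

Built once with `docker build` and started with `docker run`, each resulting container gets its own isolated filesystem and process view while sharing the host’s Linux kernel — which is why containers stay far lighter than full virtual machines.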
— Jonathan Vanian, Staff Writer

A seismic shift

In the history of TV, there will be a special chapter for that fateful week in October 2014 when the unthinkable happened: First, HBO declared that it was going to launch an online service for consumers without cable. The very next day, [company]CBS[/company] actually went ahead and launched its own online subscription service. And hours later, Univision revealed that it wants to launch such an online service as well. That week may well be the beginning of the great unbundling, or at least the week during which TV execs admitted that they can’t keep doing business as usual in the face of a seismic shift in viewing patterns: Just weeks before HBO, Univision and CBS revealed their plans, [company]Netflix[/company] disclosed that its average subscriber already streams 90 minutes every single day.
— Janko Roettgers, Senior Writer

Uber comes of age

In this photo illustration the new smart phone taxi app ‘Uber’ shows how to select a pick up location next to a taxi lane on October 14 in Madrid. Spain then banned Uber in December.

In 2014, Uber went big time. The ridesharing company raised an additional $2.4 billion in venture funding, shattering private company valuation records. It continued its international expansion and picked up the pace, moving into China and India, and expanding its footprint in Europe. It hired Obama’s former campaign manager, David Plouffe, to manage its own public image. Not all was hunky-dory, though. Uber’s elaborate scheme to steal drivers from Lyft became public, along with its threats to dig into journalists’ personal lives. The legal battles have only increased, and Uber fielded lawsuits from cities ranging from Los Angeles to Portland. For better or worse, the company is shaping up to be the Google of this generation.
— Carmel DeAmicis, Staff Writer

First Look Media falters

Investigative reporter Glenn Greenwald speaks at a press conference after accepting the George Polk Award alongside Laura Poitras, Ewen MacAskill and Barton Gellman, for National Security Reporting on April 11 in New York City.

This year has seen a number of fascinating media events, from BuzzFeed’s $50 million financing and Vice Media’s billion-dollar series of deals with old-media players to the recent bombshell news from Gawker founder Nick Denton that he has turned over control of his blog empire to a managing committee. But I think one of the biggest stories of the year has been the launch and subsequent stumbles of First Look Media, the new venture funded by [company]eBay[/company] billionaire Pierre Omidyar.

Although the launch of First Look was announced in late 2013, after the news was leaked to BuzzFeed, the site didn’t even have a name at the time, and it didn’t actually launch until well into 2014, with the introduction of a “magazine” called The Intercept, run by investigative blogger Glenn Greenwald.

At the time, Omidyar said The Intercept would be the first of a series of similar magazines run by different journalists, including one driven by former Rolling Stone political writer Matt Taibbi, called The Racket. Those plans soon hit a speed bump, however, as stories emerged of micromanagement by Omidyar’s executives, and Taibbi eventually left — followed by Intercept editor John Cook, who returned to Gawker after co-writing a piece about the issues at First Look.

The upheaval has led some to wonder whether the company will ever achieve the goals that Omidyar outlined when he announced that he was committing $250 million to it, and whether newer ventures such as former NPR editor Andy Carvin’s social-journalism project — called Reportedly — will be able to rely on the organization for continued support. But projects like Reportedly are at least a sign that there is still life in Omidyar’s vision, even if it has been a bumpy ride so far.
— Mathew Ingram, Senior Writer

Ebook subscriptions take off

“Netflix for ebooks” actually started to look like a viable concept in 2014: [company]Amazon[/company] unveiled its ebook subscription service Kindle Unlimited in July. Meanwhile, startups Scribd and Oyster, which had both launched in 2013, expanded their collections in 2014, nabbing a couple of big-five publishers (HarperCollins, Simon & Schuster) that Amazon hasn’t been able to get. And Macmillan CEO John Sargent praised ebook subscription services as a potential way that publishers can challenge Amazon’s dominance in the ebook market. That means that when you use these services, you’ll actually be able to find some big books you want to read (or, in Scribd’s case, listen to — the service added audiobooks in November). It’s unlikely that all three of these services will survive, but for now they are competing against each other with various holiday deals, so it’s a good time to give one of them a try.
— Laura Owen, News Editor

Google acquires Nest and Dropcam

Nest Thermostat

I wanted to call Apple’s debut of HomeKit the most important news for the internet of things this year, but since it was announced in June we haven’t seen products launch, and won’t until CES in 2015. Instead, I think the biggest news item was Google’s announced acquisition of Nest for $3.2 billion in January and subsequent acquisition of Dropcam for $550 million in June. The Nest deal put a spotlight on the market and convinced entrepreneurs, venture capitalists and big-name consumer brands that there was a there there in the smart home. From that point on, what had been a mishmash of standards and smaller products became the equivalent of big data — something that, suddenly, everyone needed a strategy for. This will mostly affect the consumer market — businesses are playing an entirely different game when it comes to the internet of things — but it had a huge impact.
— Stacey Higginbotham, Senior Writer

Tesla’s gigafactory takes shape

A recently raised spot of land in the Tahoe-Reno Industrial Center.

Tesla first started talking about the idea of building a massive battery factory at the end of 2013, but it wasn’t until 2014 that the company started to take the steps needed to make that crazy idea a reality. Is it really that crazy? Yes: Tesla’s “gigafactory,” which will produce batteries for its third car as well as for the power grid, could more than double the entire world’s lithium ion battery production.

At the start of the year Tesla raised $2 billion to help fund the factory. Later in the year, it secured Panasonic as a crucial partner. During the summer, Tesla CEO Elon Musk started playing up the search for a site — squeezing cities and states for as many incentives as he could get — and in the fall finally settled on Nevada, at a site just outside of Reno, which we scoped out. The factory deal could help transform the gambling backwater that is Reno into a high-tech manufacturing hub.
— Katie Fehrenbacher, Features Editor and Senior Writer

Virtual reality finally gets some love

An attendee wears an Oculus VR Inc. Rift Development Kit 2 headset to play a video game during the E3 Electronic Entertainment Expo in Los Angeles on June 11.

Am I the only one who finds it crazy that this virtual reality revival only began two years ago? That’s when Oculus launched its Kickstarter campaign, spurring an explosion of startups.

But it wasn’t until March of this year that the interest in virtual reality turned into a frenzy. That’s when Facebook acquired Oculus for $2 billion, suddenly redefining it as a field to which even the largest companies should pay attention. Samsung has since released its Gear VR, and Google has Cardboard. They won’t be the last to release headset options. Oculus’ Rift headset hasn’t even been released to consumers yet. That should give you a hint that it will continue to be a top headline for years to come.
— Signe Brewster, Staff Writer

Bitcoin goes boom and bust

Mark Karpeles (2nd R), president of MtGox bitcoin exchange speaks during a press conference in Tokyo on February 28, 2014. The troubled MtGox Bitcoin exchange filed for bankruptcy protection in Japan on February 28, with its chief executive saying it had lost nearly half a billion dollars worth of the digital currency in a possible theft.

Bitcoin started the year on a high — adoption was rising, and so was the price, peaking at $1,023 on January 26. Now a bitcoin is worth around $350. You can blame, in part, the long shadow that the fall and bankruptcy of the MtGox exchange in February has cast over the community. Leaked documents showed that more than 750,000 bitcoin belonging to users were lost, along with 100,000 belonging to the exchange. At the same time, Newsweek claimed to have identified the founder of the cryptocurrency, a report that Dorian S. Nakamoto has vehemently denied. After its bumpy start, the price has continued declining even as the blockchain, the underlying technology behind bitcoin, has started to gain traction in other industries like the internet of things. Expect to see more talk of the blockchain (and perhaps a little less bitcoin) in 2015.
— Biz Carson, Editorial Assistant

Where is enterprise infrastructure headed in 2015?

The enterprise industry is another year older … and hopefully somewhat wiser. Here’s what enterprise watchers should expect to see in 2015.

More cyber attacks

Sadly, this is an easy one. As bad as 2014 has been, and it has been bad, we’re just seeing the tip of the iceberg. Given the steady increase in value going through our systems (credit card numbers, personal information, IP), organized crime and nation-sponsored attacks will continue to rise in quantity and sophistication. The current approaches to security clearly aren’t cutting it, which is why the security space is one of my biggest personal focus areas. (Full disclosure: I’m an investor in Illumio, Menlo Security and ThreatStream, all companies in this space.)

AWS pushes further into the enterprise

Almost every startup that we fund today is using [company]Amazon[/company] Web Services, but it’s interesting to see AWS creep further into the medium and larger companies that dominate IT spending. At this year’s AWS re:Invent, there were lots of compelling enterprise anecdotes, plenty of “all-in” stories, and, most importantly, the arrival of an ecosystem. There were startups and large companies alike announcing integrations with AWS with particular focus on the “-ities” — predictability, manageability, security, and availability. These are good signs of increased adoption in mainstream businesses where it’s now not “if” but “when” a company adopts cloud.


The rise of IaaS competitors

AWS continues with its strong lead, but 2014 also showed that there’s going to be a bigger fight than ever. In particular, [company]Google[/company] Compute Engine and [company]Microsoft[/company] Azure are rapidly improving their services and have the pocketbooks to fight this for the long term. Throw in Rackspace, IBM, vCloud Air, HP and the many other regional or vertically oriented offerings, and it’s going to be a major battle — with customers as the likely winners.

Containers get down to work

The rise of Docker has been one of the true success stories of 2014. However, it has also created a deluge of competitors (CoreOS, Red Hat, Ubuntu) and interesting co-opters (Amazon, Google and VMware). Quite a year for a previously unheralded technology. The rise is real, but I believe that some of the hype will subside in 2015 as some of the real work of making containers usable by enterprises begins in earnest.


Converged/hyper-converged infrastructure grabs the limelight

2014 saw continued excitement over “converged infrastructure,” pre-configured hardware/software bundles that are powerful and easy to adopt. Nutanix, VCE and Cisco UCS get most of the attention, but there’s lots of interesting competition on the way, especially as the latter two vendors update their relationship status to “It’s complicated.” The latest offerings range from VMware’s EVO designs to new products from the big system vendors (Dell and HP are particularly aggressive). And I can personally attest to a slew of startups heading into this converged world with a variety of technologies and approaches.

APIs on the mind

I wrote about “mobile-first infrastructure” earlier this year and continue to think it will drive several longer-term infrastructure changes. In 2015, I think it will manifest itself most as the rise of APIs in enterprise development, as companies both produce and consume APIs like never before. Look for increased conversations, companies and challenges arising over this shift. (Full disclosure: I’m a backer of RunScope, which makes developer tools for this “API economy.”)

Network virtualization gets its legs

There has been much discussion of the arrival of software-defined networking (SDN). However, the term itself has been polluted to a point where it means different things to almost anyone you ask. I prefer the term network virtualization to speak more holistically about the advancement in separating out the logical network from the physical network. Cisco ACI and VMware NSX appear to have the lead, and 2014 saw significant movement from proofs of concept toward significant paid usage. Anecdotally, most of the adoption is in service providers, financial services and tech-heavy IT companies. 2015 should see further progress in the adoption, including by a broader set of consumers.

Big deals for big data

In 2014 there was nonstop talk about big data, analytics, and the opportunities and challenges of each. 2014 funding for companies has been unprecedented, ranging from Intel’s huge bet on Cloudera to substantial private investments in DataStax (driver of Cassandra), Databricks (driver of Spark), Platfora, AltiScale, DataGravity and numerous others. (My company, General Catalyst, invested in AltiScale and DataGravity.) Next year these companies will all focus on revenue — and we’ll see how the public markets respond to at least one Hadoop vendor, as Hortonworks is now a public company.

That’s one person’s cut at developments in enterprise infrastructure for 2015, and I’m sure I’ve omitted others that will be even bigger. That’s what is so fun about this space these days: We’re in a modern-day renaissance driven by the convergence of new technologies, new expectations and new challenges, all of which point toward more and bigger changes happening each year than may have taken place in prior decades. Here’s to the fun ride ahead.

Dr. Steve Herrod is a managing director at General Catalyst Partners and was CTO and senior vice president of R&D at VMware.

Note: This story was updated at 5:24 p.m. PST to correct the reference to Cisco ACI (application-centric infrastructure) not ACE.