The best definition of Docker I’ve heard came from my friend, Brandon Butler, over at Network World.
“Docker is both an open source project and the name of a startup that focuses on Linux Containers. Containers are the idea of running multiple applications on a single host. It’s similar to compute virtualization, but instead of virtualizing a server to create multiple operating systems, containers offer a more lightweight alternative by essentially virtualizing the operating system, allowing multiple workloads to run on a single host.”
Everyone is high on Docker this week, as the company just released the 1.0 version of its product. In conjunction with the release, Docker is hosting an event named DockerCon. The stats are pretty impressive for a company and technology most were not tracking last year: the Docker project has been downloaded (for free) more than 2.75 million times, and more than 460 contributors helped create this version. What’s more, technology companies are jumping on the Docker bandwagon, with Red Hat and Google leading the charge.
Indeed, Google open-sourced a Docker-centric tool called Kubernetes (announced this week) that lets its cloud computing customers automate their resource management, much as Google has done internally for many years. Google sees that Docker’s approach to application deployment and workload management fits well with its existing public cloud offering, and I suspect the company will get some play out of the use of Docker.
Of course, IBM is joining the Docker party as well. IBM said this week that it is working with Docker to further integrate the two technology platforms. It also said it would host Docker Hub, which provides a range of services for distributed applications deployed with Docker, on the SoftLayer platform.
Count on other public cloud providers giving Docker love, if they have not done so already.
Selling lightweight cloud application portability
The value that Docker brings to cloud-based application development is the use of containers. Of course, I’ll give you the same “old computer dude” response that the use of containers is nothing new. However, this pattern of application architecture has great promise. Cloud-based platforms are widely distributed, and the need to move workloads from resource to resource means that lighter-weight approaches are much better fits.
As depicted in Figure 1, Docker differs from virtual machines in that Docker does not leverage a hypervisor, nor does it require a guest operating system packaged with each application in a VM. The virtualization approach is much heavier, typically tens of gigabytes per VM, and thus difficult to move from cloud platform to cloud platform.
Figure 1: Docker is lighter weight than virtual machines, as well as simpler to deploy (Source: Docker).
What’s most interesting here is that there are fewer boxes on the Docker side of things, thus the architecture is much simpler, as well as much lighter weight than traditional virtualization. Because Docker removes the need to colocate an operating system with the application within a separate VM, Docker-enabled applications are much easier to move from cloud platform to cloud platform. This provides the ability to build applications as sets of Docker containers that can be moved from cloud to cloud, or from on-premises to cloud, without a great deal of work. In essence, they become distributed objects that are able to operate within any Docker-compatible platform.
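To make the “application without a guest OS” idea concrete, here is a minimal sketch of how such a container is defined. The image name, package, and application file are hypothetical; the instructions (FROM, RUN, COPY, CMD) are standard Dockerfile directives:

```dockerfile
# Base image: a minimal Linux userland, shared layers across containers on the host
FROM ubuntu:14.04

# Install only the runtime the (hypothetical) application needs
RUN apt-get update && apt-get install -y python

# Copy the application into the image
COPY app.py /opt/app/app.py

# The process the container runs; no guest OS kernel is packaged
CMD ["python", "/opt/app/app.py"]
```

Because the resulting image carries only the application and its userland dependencies, with the host kernel shared, it is typically measured in hundreds of megabytes rather than the tens of gigabytes of a full VM, and it can be rebuilt and run unchanged on any Docker-compatible host with `docker build` and `docker run`.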
Docker the disruptor
So, now that we know what Docker is and does, how can it be disruptive?
- First, many are calling for the death of virtualization, now that Docker is picking up in popularity.
- Second, we’re now questioning approaches to cloud application portability that have been shoved in our faces over the last several years.
- Finally, cloud providers are quickly scrambling to figure out how Docker can either help, or hurt them.
While Docker is clearly an alternative to virtualization, most don’t use virtualization as an approach to application portability. Indeed, virtualization is more of a resource utilization and workload management technology.
However, virtualized workloads have been sold as an approach to cloud-to-cloud portability, with the ability to move VMs from private-to-public, or public-to-public cloud providers, allowing cloud users to run application workloads dynamically on many different cloud platforms.
This is where Docker is a disruptive force. It may offer a better way to distribute application workloads than traditional virtualization. It’s certainly lighter weight, and seems to be much more cost effective. In my opinion, many cloud technology companies that sell virtualization technology have to consider Docker a threat.
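To show what that lighter-weight portability looks like in practice, the standard Docker CLI can already move an image between hosts with no hypervisor involvement. A sketch, assuming a hypothetical image named `myapp` built on one cloud and a hypothetical destination host:

```shell
# On the source host: export the image to a portable tarball
docker save -o myapp.tar myapp

# Copy the tarball to a host on another cloud (scp shown as one option)
scp myapp.tar user@other-cloud-host:/tmp/

# On the destination host: load and run the same image, unmodified
docker load -i /tmp/myapp.tar
docker run myapp
```

Contrast this with exporting, converting, and re-importing a multi-gigabyte VM image between hypervisor formats, and the disruption argument largely makes itself.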
What’s more, many IaaS and PaaS cloud providers have been pushing more proprietary cloud portability approaches for years now, including virtualization, abstraction, and even proprietary containers. To them, Docker will be hugely disruptive because Docker is now widely supported and free, and thus should be the lower-risk option for those looking at cloud-to-cloud portability.
The question being asked right now by those in cloud tech is: Battle on with proprietary technology, embrace Docker, or do both? The appearance of Docker is not only game changing, but could quickly kill emerging businesses, much as other open source plays have replaced proprietary technologies in the past.
The same theme exists in the world of the major cloud providers. Most have quickly embraced Docker, understanding that it has the hearts and minds of most cloud developers, as well as those using clouds. However, the long-term impact on their business will vary, depending upon what they are offering now and how well that meshes with the use of Docker.
As a rule, the larger cloud providers will tend to benefit from Docker, given that they can just add Docker as another service in their services catalog. So, AWS, Google, Red Hat, and a few others will benefit a great deal from the adoption of Docker. Those that are based on virtualization as a core technology will find Docker more of a threat, considering the points made above.
The reality is that technologies such as Docker will keep coming down the line in the world of cloud computing, bringing simple and obvious mechanisms to a rather complex world. Docker is important because it’s a common framework of understanding about the best way to deploy applications that are portable, lightweight, and efficient.
However, what’s good news for the cloud computing community could be bad news for some cloud providers, and those who build and sell cloud technology, if they have already gone down this path with their own approaches to cloud application architecture and portability. For those guys, Docker could be the “death of a thousand cuts,” and they will need to do some quick thinking to stay in the game.