Is Docker production-ready?

According to Avi Cavale, Docker is not only ready: his company already runs thousands of Docker containers in production every week. Cavale is the co-founder and CEO of Shippable, a containerized continuous integration (CI) platform, and he says Docker provides opportunities to radically accelerate how DevOps is "re-engineering the corporation" for IT.

In this interview in our ApacheCon North America series, Cavale looks at how Docker fits into DevOps. He also explains how continuous integration and continuous delivery are changing the way developers work, and why Docker is a particularly good fit for CI/CD applications.

For those not familiar with continuous integration/continuous delivery (CI/CD), how would you describe it? How are these processes changing the way developers create and deliver software?

Continuous Integration (CI) is a development practice that requires developers to integrate code into a shared repository several times a day. Each check-in is then verified by an automated build, allowing teams to detect problems early. Continuous Delivery (CD) is a software engineering approach in which teams produce valuable software in short cycles and ensure that it can be reliably released at any time. Automating these processes improves the efficiency of software development and deployment, helping companies reduce costs and prevent errors.

The most significant changes we’ve seen in how these automated processes have changed the way developers are creating and developing software are:

Developers develop and release in smaller increments since the overhead of merging code into the master branch is almost eliminated with automated CI/CD triggered on every check-in.

Less time is spent trying to determine whose code "broke the build" as each individual check-in is evaluated immediately. As a result, teams have eliminated the practice of running a single integration build once per week and then performing time-consuming investigations to debug which code introduced errors.

Debugging takes less time as developers receive immediate feedback on code they just worked on, while the details are fresh in their minds. Whenever there is a delay in discovering bugs, debugging takes longer because important details may have been forgotten and the developer must ramp up on the code again.

What makes Docker a particularly good fit for CI/CD applications?

Docker is a particularly good fit for CI/CD applications for several reasons:

Docker is great for transient workloads, since you can spin up and shut down a container in seconds—a fraction of the time of provisioning a VM. This makes Docker tailor-made for CI processes such as build/CI validation cycles that initiate each time code is checked in. With Docker, each of these cycles can be initiated on-demand in its own container, executed, and then have the container shut down, freeing the resources. This proves to be a much more efficient model than using VMs for this purpose (Shippable reduced its Dev/Test environment costs by 70 percent when converting from VMs to containers).
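This throwaway-container cycle can be sketched with a couple of Docker CLI commands (a sketch only; the image name "myapp-ci" and the make target are hypothetical, and a Docker host is assumed):

```shell
# Each check-in gets its own throwaway container.
# "myapp-ci" is a hypothetical image with the build toolchain baked in;
# the freshly checked-out code is bind-mounted into it.
docker run --rm -v "$PWD":/src -w /src myapp-ci make test
# --rm deletes the container the moment the build finishes, freeing
# resources, so the next check-in starts from a fresh, identical
# environment in seconds rather than waiting on a VM to provision.
```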

Within a container, all details of an application live together, i.e., both the code and the environment it runs in, making it easy to fully replicate the environment a developer's code will run in. This solves the "but it worked on my machine" problem, where code worked on a developer's local machine but didn't work in one of the subsequent environments, e.g., Test or Prod.

Once an application image is defined, it can easily be stored and reused anywhere that has Docker installed, making it easy to eliminate time-consuming tasks developers often go through to create their environments locally. These images can also be picked up by IT operations for use in deployment activities.
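The idea of a reusable application image can be made concrete with a short Dockerfile (a sketch; the base image, file names, and commands are illustrative, not Shippable's actual setup):

```dockerfile
# Code and environment live together: anyone with Docker installed can
# rebuild the same runtime from this one file.
# (Base image, files, and commands here are illustrative.)
FROM node:0.10
WORKDIR /app
# Install dependencies first so this layer is cached between builds.
COPY package.json .
RUN npm install
# Then add the application code itself.
COPY . .
CMD ["node", "server.js"]
```

Once built (e.g., `docker build -t myapp .`), the resulting image can be pushed to a registry and reused unchanged in Test, in Prod, or on another developer's machine.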

A good CI/CD system will also enable caching of those images, so spinning up an environment for a developer can happen quickly and painlessly for the developer.

Do you see Docker replacing traditional virtualization in the CI/CD space, or is it more about adding new capabilities and speeding up the process?

VMs won’t be eliminated in the short-term, but in the long-term we see container-based virtualization fully replacing VMs.

What role do Apache projects play at Shippable? And how are you and/or Shippable involved with the Apache Foundation?

Shippable has only recently begun getting involved with Apache. Of primary interest for us currently is the Kafka project. We’ve begun evaluating it in an internal POC to see if it is a viable replacement for RabbitMQ, which we run for our messaging service. The driver for this is Kafka’s scalability.

Without giving too much away, what can attendees expect to learn in your ApacheCon talk, "Modern DevOps with Docker in 2015"?

Docker is ready for production: Shippable runs more than 25,000 containers in production per week.

Key elements of the DevOps lifecycle are completely transformed by containers, allowing significant re-engineering of the lifecycle. DevOps is "re-engineering the corporation" for IT, and Docker provides opportunities to radically accelerate this.

With modern DevOps, you should be deploying to production multiple times a day. We’ll explain how Shippable deploys 10-40 times per day.

To learn more about modern DevOps with Docker, attend Cavale's talk at ApacheCon in Austin.

This article is part of the Speaker Interview Series for ApacheCon 2015. ApacheCon North America brings together the open source community to learn about the technologies and projects driving the future of open source and more. The conference takes place in Austin, TX from April 13-16, 2015.