During the week of March 23rd, Docker — the Moby Dick of the container world — enjoyed a series of second birthday parties, with charity ‘open-source-a-thons’ taking place around the world. And with 2336% growth between June and November of last year alone, there was certainly a lot to celebrate. In this interview, Docker COO Steve Francia takes us on a birthday retrospective. We discuss the primary goals for Docker, how it filled a hole in the sector that nobody had even defined, and where the project will go in the future.
Voxxed: What do you think have been the three most pivotal moments in Docker’s journey over the past two years?
Francia: If you look at our progress over the last two years, the first key milestone is inarguably Solomon’s presentation and Docker’s launch at PyCon. When Solomon introduced the notion of Docker containers at the conference, there was an immediate and tangible response. While the idea behind containerization wasn’t new, Docker presented a particular approach centered on developer experience and portability that resonated with developers. The fact that a “dockerized” app could work the same way on a laptop, on staging, on a testing VM in the data center, and in a production environment in the cloud was a demonstration of true portability that was groundbreaking for developers. Developers wanted a simplified set of tooling that worked for all phases of the development lifecycle. Docker became a catalyst for a new way to handle distributed applications.
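That portability can be illustrated with a minimal Dockerfile sketch. The base image, file names, and command here are illustrative assumptions, not details from the interview; the point is that one image definition runs unchanged on any host with Docker installed.

```dockerfile
# Hypothetical minimal Dockerfile (illustrative sketch, not from the interview).
# The image built from this file behaves identically on a laptop, a staging VM,
# or a cloud host, because the app and its runtime are packaged together.
FROM python:2.7            # base image bundling the runtime the app needs
COPY app.py /app/app.py    # copy the (assumed) application code into the image
WORKDIR /app
CMD ["python", "app.py"]   # the command the container runs on any infrastructure
```

Built once with `docker build -t myapp .`, the same image can then be started anywhere with `docker run myapp`.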
The second major milestone was the 1.0 release of Docker, which marked the first “production-ready” version of our software. Up to this point, many organizations had ignored our “do not run in production” warnings, so it was the first time the community had this level of product maturity from Docker. This release represented a level of quality, feature completeness, backward compatibility, and API stability that met enterprise IT standards. Additionally, it was the first time we had provided a full solution for using Docker in production, with complete documentation, training programs, professional services, and enterprise support.
The third biggest milestone was when our go-to-market partners – some of the largest software, software services, and cloud companies – began to make strategic roadmap investments in Docker in Q4 2014. This was a testament to the Docker project and the community that has made Docker the standard to build, ship, and run distributed applications.
What do you hope to see in the Docker ecosystem by its third birthday?
Docker caught the attention of developers by providing a vision of what the future could be. By our third birthday, I would like to see Docker deliver on that promise. We started out with a mission of building tools of mass innovation, and we are beginning to see the fruits of our effort. Today, millions of developers are using Docker, tens of thousands are building tools around it, and hundreds of people are contributing directly to its code base. We are committed to fixing fragmentation for users and to adding tools that make it easier to create distributed applications. In short, Docker is committed to serving the needs of the community through continued innovation.
Docker achieved ‘critical mass’ growth in the tech community at a hyper-rapid pace. What do you think drove this phenomenal uptake of the technology?
With the Internet there emerged a new way of running applications. Applications like Google and Facebook could no longer run on a single machine but required the coordinated efforts of thousands of machines. Even small applications now require the coordination of many machines.
I think the reason Docker’s success appears to have happened so abruptly is that everyone in the developer community had a sense that something was missing, but they couldn’t put their finger on it. Docker provided a desperately needed solution to an entire industry: true portability, plus the tooling and format to separate application concerns from infrastructure, dramatically simplifying the developer workflow. While Docker is relatively new, it is composed of many well-established and stable technologies, which enabled Docker to become stable and trusted at a very rapid pace. These ingredients, in combination with Docker’s approach based on portability, made Docker the catalyst for a new way to handle distributed applications.
What big changes do you see Docker driving in the industry?
One of the biggest changes that Docker is enabling is the shift to distributed, microservices-based applications. Monolithic applications – with all the functionality built in – are being replaced by microservices architectures, which decompose large applications into smaller, more manageable services. Docker containers enable microservices by creating a highly efficient distribution model that allows services to be deployed independently and frequently, instead of requiring synchronized deployments on a fixed schedule. This changes established development practices by putting larger-scale architectures within the reach of smaller development teams.
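The decomposition described here can be sketched as a Docker Compose file in the original v1 format of the era. The service and image names are purely hypothetical, chosen to show how each piece of a former monolith becomes an independently deployable container.

```yaml
# Hypothetical docker-compose.yml (v1 format) sketching a monolith split into
# separately deployable services; all names are illustrative assumptions.
web:
  image: example/web       # front-end service, deployable on its own cadence
  ports:
    - "80:8000"
  links:
    - api                  # talks to the API service over the Docker link
api:
  image: example/api       # business-logic service
  links:
    - db
db:
  image: postgres          # data store runs as its own container
```

Because each service ships as its own image, a team can rebuild and redeploy just `api` without touching `web` or `db` – the independent-deployment property the answer above describes.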
What are the most interesting use cases you’ve seen for Docker?
One of the most common use cases that we have seen is accelerated deployment, enabled by continuous delivery/continuous integration using Docker. Customers use Docker to simplify deployments, accelerate the process, and generally improve the quality and robustness of application delivery. We’ve seen customers reduce code deployment cycles from nine months to less than 15 minutes. Organizations are transformed by massive improvements in productivity, as they feel unshackled from long development cycles and precarious deployments.
How big is the Docker community now?
The Docker community is growing at a rapid pace. We estimate that we have somewhere between three and four million users in the community, with more than 300 million downloads of our technology to date. There are also more than 30,000 tools in the Docker ecosystem and more than 100,000 dockerized applications.
What questions are you most commonly asked about Docker by its user community?
One of the most common questions about Docker is still, “How does a container differ from a VM?” It’s easy to see why this question comes up, as both VMs and containers provide a degree of isolation from other applications and processes running on the same machine; however, that is where the similarities end. It’s interesting that we still get this question two years into the project, because we continue to see many use cases where containers and VMs are completely complementary, as opposed to choosing one approach over the other. Docker provides a model for building, shipping, and running distributed applications in containers that can then run on any infrastructure. It turns out that VMs, just like a developer’s laptop or a cloud, are also good infrastructure choices for Dockerized applications.