Jefferson Kenji Takahashi

Today’s most dynamic enterprises continue to transform their business operations through software innovation — and containers have become an increasingly essential part of that strategy. The momentum is being felt across the developer community and the wider IT space as container solutions optimize the use of data center resources and speed up application development and delivery. According to the 2018 Docker Usage Report by the container monitoring company Sysdig, modern enterprises are packing in 50% more containers per host this year compared to last year.

Docker is a computer program that performs operating-system-level virtualization, also known as “containerization”. It was first released in 2013 and is developed by Docker, Inc.

Docker is used to run software packages called “containers”. Containers are isolated from each other and bundle their own tools, libraries and configuration files; they can communicate with each other through well-defined channels. All containers are run by a single operating system kernel and are thus more lightweight than virtual machines. Containers are created from “images” that specify their precise contents. Images are often created by combining and modifying standard images downloaded from public repositories.
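As an illustrative sketch, such an image is typically defined in a Dockerfile that starts from a standard base image downloaded from a public repository and layers modifications on top of it (the `python:3.11-slim` base and `app.py` file here are hypothetical placeholders, not from any particular project):

```dockerfile
# Start from a standard image pulled from a public repository (Docker Hub)
FROM python:3.11-slim

# Modify it: copy in the application and install its dependencies
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .

# Specify the precise command the container runs when started
CMD ["python", "app.py"]
```

Running `docker build -t myapp .` in the directory containing this file produces an image whose contents are exactly what the Dockerfile specifies.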

Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Containers allow a developer to package up an application with all of the parts it needs, such as libraries and other dependencies, and ship it all out as one package. By doing so, thanks to the container, the developer can rest assured that the application will run on any other Linux machine regardless of any customized settings that machine might have that could differ from the machine used for writing and testing the code.

In a way, Docker is a bit like a virtual machine. But unlike a virtual machine, rather than creating a whole virtual operating system, Docker allows applications to use the same Linux kernel as the system that they’re running on and only requires applications be shipped with things not already running on the host computer. This gives a significant performance boost and reduces the size of the application.
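One quick way to see this kernel sharing in action — a sketch that assumes Docker is installed and can pull the small `alpine` image from Docker Hub — is to ask a container for its kernel version; it reports the host’s kernel, because the container has no kernel of its own:

```shell
# The container reports the host's kernel version, not its own
docker run --rm alpine uname -r

# Compare with the same command run directly on the host
uname -r
```

Both commands print the same kernel release, which is precisely why a container image can stay small: it ships only the userland pieces the application needs.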

Common Use Cases

Now that you have an idea of what Docker containers are, it’s important to understand their potential use cases.

For example, PayPal uses Docker to drive “cost efficiency and enterprise-grade security” for its infrastructure. PayPal runs VMs and containers side by side, and says that containers reduce the number of VMs it needs to run.

Application development: Docker is primarily used to package an application’s code and its dependencies. The same container can be shared from Dev to QA and later to IT, thus bringing portability to the development pipeline.
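The Dev-to-QA-to-IT hand-off described above is typically a build–push–pull cycle through a shared image registry. A hedged sketch, where `registry.example.com/myapp` is a placeholder registry and image name, not a real one:

```shell
# Dev: build the image once and tag it with a version
docker build -t registry.example.com/myapp:1.0 .

# Dev: push it to a registry the whole team can reach
docker push registry.example.com/myapp:1.0

# QA / IT: pull and run the exact same image, byte for byte
docker pull registry.example.com/myapp:1.0
docker run --rm registry.example.com/myapp:1.0
```

Because every stage runs the same immutable image rather than rebuilding from source, the artifact that was tested is the artifact that ships.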

Running microservices applications: Docker lets you run each microservice that makes up an application in its own container. In this way, it enables a distributed architecture.
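For instance, a small two-service application might be described in a Docker Compose file, with each microservice isolated in its own container (the service names and image tags below are illustrative assumptions):

```yaml
# docker-compose.yml — each microservice runs in its own container
services:
  web:
    image: myorg/web:1.0    # hypothetical front-end service image
    ports:
      - "8080:8080"         # expose the web tier to the outside
    depends_on:
      - api                 # start the back end first
  api:
    image: myorg/api:1.0    # hypothetical back-end service image
    expose:
      - "5000"              # reachable only by other services
```

A single `docker compose up` then starts both containers on a shared network, letting `web` reach `api` by its service name — a distributed architecture on one host.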

Advantages of Docker Containers

Docker containers are process-isolated, and don’t require a hardware hypervisor. This means Docker containers are much smaller and require far fewer resources than a VM.

Docker is fast. Very fast. While a VM can take at least a few minutes to boot and be dev-ready, starting a Docker container from a container image takes anywhere from a few milliseconds to, at most, a few seconds.

Containers can be shared across multiple team members, bringing much-needed portability across the development pipeline. This reduces ‘works on my machine’ errors that plague DevOps teams.

The road ahead is quickly changing, for VMs at least, as Docker rises in popularity among major corporations. Speed and efficiency are among the biggest needs of DevOps teams, and Docker is better at providing them than VMs are. While it still hasn’t completely replaced virtual machines in production environments, Docker’s potential is being noticed. That isn’t to say that VMs will soon be gone. Rather, Docker and VMs will co-exist, giving DevOps teams more choices in how to run their cloud-native applications.