Docker is an application build and deployment tool. It is based on the idea that you can package your code together with its dependencies into a deployable unit called a container. Containers have been around for quite some time: some say they were developed by Sun Microsystems and released as part of Solaris 10 in 2005 as Zones, while others argue that BSD jails were the first container technology. For a visual explanation, think of the containers used for intermodal shipping. You put your package (code and dependencies) in the container, ship it by boat or train (laptop or cloud), and when it arrives at its destination it works (runs) just as it did when it was shipped.

Before an industry-standard container existed, loading and unloading cargo ships was extremely time consuming. This was due to the nature of the cargo: items came in different shapes and sizes, so they couldn't be stacked or moved efficiently. Everything was handled piece by piece, an approach known as break-bulk shipping. The industry felt the pinch of high labor costs, wasted shipping space, and a lack of commonality across shipping companies and transportation modes.

The adoption of a standard size and shape revolutionized the shipping industry. Agreement on a container specification decreased shipping times, reduced costs, and reinvented the way companies did business.

How will Docker impact us today?
Docker creates an isolated Linux process using software fences. For now, Docker is 100% command line (CLI) (update – not anymore; there are a few GUIs, including Docker's own Enterprise Edition). Certainly, launching Linux containers from the command line is nothing innovative. Nevertheless, the containers themselves are only a small part of the story.

The other part of the puzzle is images. An image is an artifact, essentially a snapshot of the contents a container is meant to run. As a developer, when you make a change to your code, a new version of the image (actually a new layer) is automatically created and assigned a hash ID. Versioning between development, test, and production is quick, seamless, and predictable. Docker solves many longstanding problems in managing software systems:

  • Management of applications: two applications that rely on different versions of the same dependency (like Java) can easily coexist on the same operating system
  • Version control: images are built from a text file (a Dockerfile), so every previous image, and therefore every container deployment, is retrievable and re-buildable
  • Distributed management: a GitHub-like registry for organizing images and deploying application containers (Docker Enterprise Edition, comprising Docker Universal Control Plane and Docker Trusted Registry)
  • Low overhead: unlike virtual machine hypervisors, Docker is lightweight and very fast; containers are small and start almost instantly
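
To make the version-control point concrete, here is a minimal sketch of a Dockerfile. The base image, file names, and application entry point are assumptions for illustration, not from the original article:

```dockerfile
# Hypothetical example: package a small Python app with its dependencies.
FROM python:3.11-slim

WORKDIR /app

# Copy the dependency manifest first so this layer stays cached
# until requirements.txt itself changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code; a code change only rebuilds from this layer on.
COPY . .

CMD ["python", "app.py"]
```

Building this with `docker build -t myapp .` produces an image whose layers are each assigned a content-addressed hash ID; because the Dockerfile is plain text committed alongside the code, any past image, and therefore any past deployment, can be rebuilt.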

One of the first things I noticed was how quickly you can make a change to an application and (re)deploy. For companies that need to scale mobile or streaming applications, Docker will do for their business what the standardized container did for the shipping industry. Seeing this first-hand at Nebulaworks, we coined the phrase "Application Logistics," as we're moving apps quickly in and out of production as containers, across on-premises and cloud infrastructure.