Hello, and welcome to the first in a multi-part series on Docker. I’m interested in this topic for a number of reasons but, if I had to pick the prime one, it would be this – in information systems, the immediate financial interests of IT management are often at odds with developers’ desire to pick up new tools and technologies. With Docker, everyone wins.

Throughout this series, I will be exploring Docker – the importance of containerization, how to work with it technically, and what it means for .NET developers and those looking to take advantage of dev(ops) in Microsoft Azure. Docker is platform agnostic and can be used across different languages and operating systems, but my day-to-day modus operandi is mostly in the Microsoft space, so I will be zeroing in on it from that perspective.

Why Docker?

Before diving into the use of the tool itself, I’d like to stress why one would be interested in Docker – or containerization – at all. I enjoy the opportunity of working daily with IT professionals, developers and technical sales professionals, and while the nuances of the benefit vary slightly from role to role, the general play is essentially the same: at little-to-zero licensing cost, our customers can implement containerization for applications, maximize their compute resources and improve deployments, thereby saving on personnel and operational costs. Looking only at compute resources, review the following statistics*:

  • 25% savings in compute power
  • 7% savings in system memory (RAM)
  • 30-35% savings in storage

*As presented at DockerCon17. Test completed by Docker, HPE and Bret Fisher using modern HP ProLiant servers and VMware ESXi 6 virtual machines.

On-premises or in the cloud, organizations can minimize their compute and operational costs. Containerization offers some other inherent benefits when compared to setting up additional physical servers or virtual machines:

  • Leverage existing skill sets of your developers as apps live “side-by-side”
  • Fewer operating systems and/or hypervisors to manage
  • Application version control with rolling updates
  • Application isolation and built-in security by default – encryption at rest and TLS
  • Speed to application delivery

I mentioned previously that this is a free (or low-cost) solution. While it isn’t the purpose of this article to cover the details of licensing, it is worth noting that – at the time of this writing – one can grab the Community Edition at no cost. Additionally, Windows Server 2016 includes the Enterprise Edition with the operating system. I will cover Azure licensing in a future segment.

For developers, Docker promotes disciplined architecture practices and brings efficiencies throughout the software development lifecycle. I will discuss migrating existing applications to containers later as well, but the short story is that (1) greenfield applications are really easy to start in Docker, and (2) existing applications can be migrated too, although the process I typically take with a customer is the same as for Azure PaaS-based migrations – refactor to take advantage of the platform benefits as time permits.
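To give a sense of how little it takes to start a greenfield application in a container, here is a minimal Dockerfile sketch for an ASP.NET Core application. The base image tag, folder name and assembly name are illustrative assumptions, not taken from any specific project:

```dockerfile
# Start from a Microsoft-provided ASP.NET Core runtime image (tag is illustrative)
FROM microsoft/aspnetcore:2.0

# Set the working directory inside the container
WORKDIR /app

# Copy the locally published application output into the image
# (assumes `dotnet publish -o ./publish` was run beforehand)
COPY ./publish .

# Launch the application when the container starts (assembly name is hypothetical)
ENTRYPOINT ["dotnet", "MyApp.dll"]
```

Building and running it is then a matter of `docker build -t myapp .` followed by `docker run -d -p 8080:80 myapp` – no server provisioning required.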

What is containerization?

“Why Docker” explained that this is a technology that can benefit everyone in the IT space. But what exactly is containerization?


In the enterprise, one might need to deploy multiple, disparate applications on a single computer, and those applications may need to be isolated. This could be true for a variety of reasons: two different operating systems are required, or the applications conflict with one another or have conflicting dependencies. This happens in production scenarios, but workloads in the software development life cycle may dictate it throughout the entire development chain as well. Without containerization, the answer would be to create new virtual machines – allocate CPU, RAM and so on, load the OS and prerequisite software, and deploy the application (or clone an existing VM). Virtualization typically comes with some time penalties up front and requires resources for running the hypervisor and virtual machine(s).

With containerization, the Docker Engine and application image(s) are loaded on the Docker Host (any computer that hosts Docker containers). Those images contain the application and any direct dependencies required for the application to run. Unlike virtualization, though, these applications share the host’s kernel and thus do not carry the resource requirements of virtual machines. In Docker, images can be copied from computer to computer through a variety of methods – a registry, scripting commands, or Docker Compose – and applications can be updated through zero-downtime rolling updates. Because images are comparatively small, starting a container from one is extremely quick (typically sub-second) when compared to booting a virtual machine.
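The workflow described above can be sketched with a few Docker CLI commands. The image name here (the official `nginx` image on Docker Hub) and the port mapping are illustrative choices, not from the article:

```shell
# Pull an image from a registry (Docker Hub by default)
docker pull nginx:latest

# Start a container from the image, mapping host port 8080 to port 80 inside
docker run -d --name web -p 8080:80 nginx:latest

# List running containers
docker ps

# One way to move an image between hosts without a registry: export it to a tarball
docker save nginx:latest -o nginx.tar
```

The `docker save` tarball can be copied to another Docker Host and restored with `docker load -i nginx.tar`, which is handy in environments without registry access.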

Although virtualization and containerization are different, they are not mutually exclusive. A fully supported scenario combining both technologies starts with creating virtual machines using Hyper-V, loading the Docker software and then deploying images to the virtual server(s). In the Docker space, a multi-machine, multi-container application cluster is referred to as a “Swarm”. (Quick side-bar: it only bothers me slightly that they didn’t choose a more appropriate name for these clusters, such as “school” or “gam”, given that Docker chose a whale for its logo.)
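Standing up such a cluster takes only a couple of commands. As a sketch – the manager IP address, service name and replica count below are illustrative assumptions:

```shell
# On the first node (VM or physical), initialize the swarm; this node becomes a manager
# (10.0.0.4 is a hypothetical address reachable by the other nodes)
docker swarm init --advertise-addr 10.0.0.4

# On each additional node, join the swarm using the token printed by `swarm init`:
#   docker swarm join --token <token> 10.0.0.4:2377

# Back on the manager, deploy a service replicated across the cluster
docker service create --name web --replicas 3 -p 80:80 nginx:latest
```

Rolling updates then become a one-liner (`docker service update --image nginx:1.13 web`), with the swarm replacing replicas incrementally rather than taking the application down.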

Docker in Azure

For Azure customers, deployment and management are extremely easy. Docker for Azure provisions a TLS-secured Docker cluster and takes advantage of a number of Azure capabilities – including orchestration and diagnostics, autoscaling and load balancing – all behind the familiar Azure management user interface. As consumers of containers in Azure, users can quickly grab images from Azure Container Registry or Docker Hub. Whether running Linux or Windows, Docker in Azure supports countless deployment, management and code options to meet IT professionals and developers wherever they may be. In a future segment, I will dive deeper into the specifics of Docker running in Azure.
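As a taste of the registry workflow, pulling a private image from Azure Container Registry looks like the following. The registry and image names are hypothetical placeholders:

```shell
# Authenticate Docker against a private Azure Container Registry via the Azure CLI
# ("myregistry" is a hypothetical registry name)
az acr login --name myregistry

# Pull a private image from that registry and run it locally
docker pull myregistry.azurecr.io/myapp:1.0
docker run -d -p 8080:80 myregistry.azurecr.io/myapp:1.0
```

The same `docker pull`/`docker run` commands work unchanged whether the image lives in Docker Hub or a private registry – only the image reference differs.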


In summary, whether you are a modern application developer writing microservices or an IT infrastructure manager – Docker is worth a serious look. In future segments, I will be focusing on (1) Getting Started with Docker, (2) Docker in Azure and (3) Docker for the .NET developer. I cannot make any promises that I won’t find other detours to take along the way.

At Planet Technologies, we include envisioning and implementation consulting around Docker as a part of our “Developing Cloud-Based Applications” engagement or via development support services.

– Mark

(cross-posted at https://syntacticsugarweb.wordpress.com)