When I first started learning about Docker a few months back, I had just transferred from my company's AWS team, and to be honest I was a little skeptical about the technology. Thankfully, my manager was an expert in the field, and I attended one of his training sessions to get an overview of Docker. Soon we started containerizing our AWS environment with Amazon ECS. I finally surrendered to containers and found that they solve a lot of real-world problems and offer a lot of advantages.
I think this is a good place to start. Over this series of blog posts I'll be covering ways to use Docker, why it matters, and real-life use cases of solving problems while moving to production.
Docker is one of the most talked-about technologies in today's IT world. Getting down to the nuts and bolts, Docker offers a number of advantages: it lets you run multiple containers on a single host, and containers are more lightweight than virtual machines. In today's post, we will discuss why Docker is a game changer and why you should adopt it. First we will look at some of the advantages of Docker, and then we will discuss some day-to-day use cases.
Availability in Multi-Cloud Platforms
One of the reasons behind Docker's huge user base is portability. All major cloud computing providers, such as Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure, have embraced Docker and added their own support for it. You can run a Docker container inside an Amazon EC2 instance, a Google Compute Engine instance, a Rackspace server, or VirtualBox, wherever the host OS supports Docker. With this availability, you can select a platform based on your scenario.
Easy to maintain versions
Docker allows you to make changes in a container and commit those changes to your Docker image, versioning them just like you would in a Git repository. Now, let's say you upgrade a component and it breaks your whole environment. In this case, it is very easy to roll back to a previous version of your Docker image.
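As a sketch of that workflow (the container name `myapp` and the repository `myrepo/myapp` are hypothetical, and the commands assume a running Docker daemon), each known-good state gets its own image tag, and rolling back is just running the older tag:

```shell
# Capture the current state of the running container as a new image version
docker commit myapp myrepo/myapp:v2

# If v2 turns out to be broken, roll back by starting the previous tag
docker stop myapp && docker rm myapp
docker run -d --name myapp myrepo/myapp:v1
```

Because every commit produces a distinct, immutable tag, no upgrade ever destroys the version you were running before.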
Docker images support incremental updates, which saves time as well as disk space when downloading or updating images.
Docker allows you to allocate resources such as a specific amount of CPU, memory, and disk to each container. With this, you get the same benefit offered by virtual machines, where you carve a computer into smaller chunks of resources and create multiple containers that can act as microservices. It is not mandatory to allocate a fixed amount of resources to each container; you can also create multiple containers that share the host machine's resources.
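A minimal sketch of both modes, using the real `--cpus` and `--memory` flags of `docker run` (the image name `myapp` is hypothetical, and a Docker daemon is assumed):

```shell
# Cap this container at one CPU core and 512 MB of RAM
docker run -d --name capped --cpus="1.0" --memory="512m" myapp

# Without limits, the container shares host resources freely
docker run -d --name shared myapp
```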
Docker ensures the isolation of resources and applications. A report published by Gartner stated “Docker containers are as good as VM hypervisors when it comes to isolation of resources”.
Let's say you are running a web application with Apache Tomcat on a host server. Later, you want to run another application that requires a different version of Apache Tomcat. Without containers, you would have to either port the new application to the existing Tomcat version or migrate your existing application to the new one.
With Docker you don't have to do any of this work. Docker containers are isolated from each other and can each have their own set of resources and dependencies, so both applications can run their tasks independently on the same host.
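For instance, two Tomcat major versions can run side by side, each mapped to its own host port (these use the official `tomcat` images from Docker Hub; the container names are hypothetical, and a Docker daemon is assumed):

```shell
# Each container carries its own Tomcat version; only the host port differs
docker run -d --name app-on-tomcat9  -p 8080:8080 tomcat:9.0
docker run -d --name app-on-tomcat10 -p 8081:8080 tomcat:10.1
```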
Code needs to travel through various environments to reach production, and each of these environments may have minor differences. Thanks to the immutable nature of a Docker image, it is easy to deploy the application across several environments. If you want to update an application, you can make the changes in a running container, commit them, and deploy the same image to all environments within a couple of minutes.
Docker gives you the ability to bundle and ship your application with all the required packages, binaries and dependencies from one environment to another.
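As a sketch of that bundling step (the base image, jar name, and repository here are hypothetical; a Docker daemon and a registry login are assumed), a short Dockerfile declares everything the application needs, and the resulting image is what travels between environments:

```shell
cat > Dockerfile <<'EOF'
# Base image providing the required runtime
FROM openjdk:17-slim
# The application binary with its packaged dependencies (hypothetical artifact)
COPY app.jar /opt/app/app.jar
CMD ["java", "-jar", "/opt/app/app.jar"]
EOF

docker build -t myrepo/myapp:1.0 .   # the same image runs in dev, test, and prod
docker push myrepo/myapp:1.0         # ship it to a registry
```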
A Docker image references a list of read-only layers. These layers are stacked on top of each other to form the base of the container's root file system. Stacking these layers is the job of the storage driver: when you create a container from an image, Docker creates a writable layer on top of the image's layer stack and hands it to you for further changes. You can make changes only in this writable layer. When you commit the container, you get a new layer stack (a new image) that includes your container's writable layer, while the other layers still point to the base image and remain unchanged.
Now, if you delete a running container, only its writable layer is deleted, not the whole base image. This lets engineers upload and download images very quickly: while uploading an image, Docker checks for each layer in the repository, and if a layer already exists, it creates a reference to that existing layer instead of uploading a new copy. It only uploads the layers that are not yet in the repository. The same process happens when you download an image to your machine.
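You can see this layering with the real `docker history` and `docker pull` commands (the `tomcat` tags are examples of two images that share base layers; a Docker daemon is assumed):

```shell
# List the read-only layers that make up an image
docker history tomcat:9.0

# Pulling a second image that shares base layers reuses them;
# shared layers are reported as "Already exists" instead of re-downloading
docker pull tomcat:9.0
docker pull tomcat:10.1
```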
At this point, you know some of the advantages of Docker. Now, let's discuss some day-to-day use cases.
We all need a productive development environment for our regular deployment and testing, and we want confidence that an application that works today will still work in the test environment tomorrow. Configure your complex environment once in a Docker container and reuse the same container again and again.
We always want the development environment to be as close as possible to production, which would mean running every service in its own VM, just as in production. However, we don't want to depend on an Internet connection or take on the overhead of working remotely every time a compilation is needed. This is where Docker comes in handy. A development machine usually has limited memory, and with VMs you would have to add more memory to run multiple services; Docker, by contrast, lets you run a dozen services on the same machine.
Now, what about the testing environment? You would always rather test on a fresh environment than on a messy one already used for development. Normally you have to wait for resources from your office sysadmin, then set up the environment, and only then start testing. One thing you should remember: "The more time it takes to run the tests, the less testing will be done." With Docker you can set up an environment within a couple of minutes, and thanks to Docker images you can use the same environment (i.e., the container) again and again.
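A sketch of that throwaway-environment pattern, using the real `--rm` flag of `docker run` (the image name and test script are hypothetical; a Docker daemon is assumed):

```shell
# --rm deletes the container on exit, so every run starts from
# the same clean image state instead of a used environment
docker run --rm -it myrepo/test-env:latest ./run-tests.sh
```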
Docker has a community that is growing at a very fast pace, with remarkable involvement from people adding new features and eliminating bugs. What are you waiting for? Start Dockerizing your applications!
Don't know where to start? We are here to help. Check out our course on Docker Essentials, which will give you a better understanding of the core concepts of containers, containerization of applications, hands-on knowledge of moving applications to Docker containers, and architecting highly available and scalable applications deployed with Docker.
Please comment and share if you like the article. Feel free to ask your questions and provide your valuable suggestions below.