Allen T.V.
Posted on June 8, 2019
Local development involves setting up and experimenting with a lot of tools before the app is even ready to be worked on. Finding the right tools takes time and also runs the risk of polluting the global space of your machine, which in turn can break other software when a shared library is updated underneath it. A similar situation, known as "DLL hell", is quite well known in the Microsoft world. Containerization is an excellent way to get around this problem with very little downside.
Docker is a well-known player in the container domain and dominates mindshare whenever someone thinks about containers. The rise of DevOps culture has played a massive role, with tools like Kubernetes gaining widespread adoption. There is also strong demand for engineers who understand how Docker works, so investing time to learn Docker is well worth it.
To improve the local development experience, install Docker from https://www.docker.com/products/docker-desktop for your platform. The next step is to create a Dockerfile that defines the custom environment and tools required by your application. This step will take some time, since building an optimised image requires trial and error. Once the image has been built, you can spawn Docker containers, which are the runtime for your application. A container can mount resources from your machine, very much like mounting an external resource inside a virtual machine (VM). Fine-grained options, such as limiting the CPUs and memory available to a container, give experienced engineers the extra control they look for. These options also matter when scaling of the app is managed automatically on a cluster by something like Kubernetes. Clearly defined hardware limits go a long way towards managing utilization, and better utilization translates directly to lower operating costs.
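As a rough sketch, assuming a simple Node.js application (the base image, file names, ports and resource limits below are placeholders, not prescriptions), a Dockerfile might look like this:

# Dockerfile: hypothetical Node.js app; swap the base image and commands for your stack
FROM node:12-alpine

# Copy the dependency manifests first so the install step can be cached as a layer
WORKDIR /app
COPY package*.json ./
RUN npm install

# Copy the rest of the source and define the start command
COPY . .
CMD ["npm", "start"]

Building the image and running a container with a local directory mounted and CPU/memory caps applied could then look like this (the limits shown are example values):

# Build the image, then run it with the current directory mounted into the container
docker build -t myapp .
docker run -it --rm -v "$(pwd)":/app -p 3000:3000 --cpus="1.5" --memory="512m" myapp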
The key advantage that Docker containers provide is isolation of resources and the ability to recover easily from a crash. The OS manages each container as a process, so when a container crashes, only that process is killed, with no impact on the rest of the system. This is very handy when migrating legacy software systems to newer versions and business stakeholders expect both systems to run in parallel until the legacy version is retired. Having isolation between the two versions, and the ability to monitor how they behave side by side, contributes a lot to the success of a large software migration project.
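As an illustration of that parallel-run setup (the image tags, names and ports here are hypothetical), the legacy and new versions could run side by side on different host ports, with a restart policy so a crashed container comes back on its own:

# Run the legacy and new versions in parallel, each isolated in its own container
docker run -d --name myapp-legacy --restart=on-failure -p 8080:3000 myapp:1.0
docker run -d --name myapp-next --restart=on-failure -p 8081:3000 myapp:2.0

# If one version crashes, only its process dies; inspect it without touching the other
docker logs myapp-legacy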
If you think this is too much work, it is not: the whole Docker setup takes less than five minutes. You can find pre-built images on Docker Hub, a public registry where anyone can host the images they create. Managing multiple Dockerfiles and stringing several containers together can become hard to maintain; in such cases, you can use Docker Compose to manage a multi-tier stack. The community around Docker is very supportive, and with local meetups and events happening all around the world, it is a good idea to get started today!
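For example, a minimal docker-compose.yml for a hypothetical two-tier stack (an app container built from the Dockerfile above plus a Postgres database; the image, ports and credentials are placeholders) might look like this:

# docker-compose.yml: hypothetical app + database stack
version: "3"
services:
  web:
    build: .   # built from the Dockerfile in the current directory
    ports:
      - "3000:3000"
    depends_on:
      - db
  db:
    image: postgres:11
    environment:
      POSTGRES_PASSWORD: example   # placeholder credential, not for real use

Running docker-compose up then starts the whole stack with a single command.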