Integrating Docker with your Personal Projects


Daniel McMahon

Posted on December 4, 2018


Why?

Docker outlines the use cases for adopting its container-based infrastructure over on its main website. I can offer a brief snapshot of my own opinions on why it's a useful tool, how it can be used to optimize your project workflow, and how it can make your projects stand out when you've just left college.

The Benefits

I'm going to avoid the ship-shipping-ships container metaphors here; if you've ever come across Docker or Kubernetes, you'll know the landscape is littered with them. I'll try to keep things a little simpler in outlining some of the benefits of this tooling:

  • Dependency Validation: Docker allows you to create images based on a build of your project. This means all your dependencies are self-contained: you can build the project locally as a Docker image, run it as a container, and verify that it works and has all the correct dependencies. This prevents weird oversights caused by globally installed packages on your local machine that aren't defined in the project's package.json, sbt, or dep file. If the Docker image runs on your Windows PC, it should run fine on your Linux EC2 instance or any other device you plan to run the image on.

  • Version Control: don't just rely on GitHub! As mentioned above, Docker images are essentially snapshots of your project's builds at various points in time. Imagine a scenario where you need to demo your service locally: you pull the latest version from master, build it, and it crashes... You don't have time to revert to an older commit, build the service, and launch it; instead you can simply pull the image for an older commit and run it. Since it's all self-contained, this is much faster than building and running from source. It's also a handy way to quickly compare old and new features, and it can be an interesting way of seeing how a project looked 5+ years ago compared to how it is now.

  • CI/CD: If you're using continuous integration/deployment solutions and you accidentally deploy a broken build in a pipeline, it can be far quicker to roll back to an earlier Docker image than to rebuild the project from scratch. A popular approach is to wrap up your projects as Docker images, upload them to a cloud registry (such as the public Docker Hub, or a more private solution hosted on a service like AWS Elastic Container Registry), and then use those images in the prod environment. This is particularly powerful when paired with a service like Kubernetes that may be running replicas of a service across multiple pods or EC2 instances: it's simply more efficient to use a Docker image than to pull the repo three times and install and build all the dependencies each time.

  • Imitating Prod Dependencies Locally & Quickly: quite often in a professional environment you may be working on a front-end codebase that requires access to a Postgres DB for storing some site documentation, while also needing a store like Redis for managing user sessions. Setting up your development environment can be quite time-consuming because of these dependencies, but Docker has a useful tool, docker-compose, that lets you spin up a Postgres and Redis setup with a single command, 'docker-compose up -d', from a YAML file containing fewer than 10 lines (see the sketch just after this list).
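As a rough illustration, a compose file along these lines (a minimal sketch in the same v1 compose format used later in this article; service names and image tags are illustrative) is enough to bring up both dependencies:

# Postgres for the site's documents
postgres:
  image: postgres:9.6-alpine
  ports:
    - '5432:5432'
# Redis for user session storage
redis:
  image: redis:4-alpine
  ports:
    - '6379:6379'

Running 'docker-compose up -d' in the same directory then starts both containers in the background.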

I'll quickly acknowledge at this point that Docker is not the be-all and end-all solution. Inductor has an interesting article, 'Do you really need Docker or Kubernetes in your system?', that I highly recommend checking out.

As a college graduate or new developer, being able to demonstrate interest in, and knowledge of, how you would make your project scalable and deployable in a production environment can give you the edge over other candidates in interviews. Having a basic Dockerfile setup in your personal projects gives you another unique talking point and skill to reference.

Integrating Docker with your beginner projects

I took on the challenge of integrating Docker into some previous projects I've worked on, to provide end-to-end insight into the practical use cases of the tooling and how easy it can be to add this extra functionality. I've previously written articles on the development of these basic projects, so beginners can gain a better insight into how a project develops new features over time by reviewing those past articles.

There is a general pattern adopted across all three of these projects, which you will notice (and which emphasises how easy it can be to generate these files):

  • Use a base image, e.g. node/openjdk/golang
  • Set the Maintainer
  • Set the working directory and add the relevant files
  • Set any ENV vars, e.g. ports/versions -> this allows easy updates later
  • Install your dependencies using 'RUN'
  • Expose your required port(s)
  • Run your main command, e.g. npm start

Now let's have a look at the generated files.

JavaScript: React Portfolio - Code - Article

This project evolved from a simple create-react-app.
The generated Dockerfile looks as follows:

Dockerfile

FROM node:8

MAINTAINER Daniel McMahon <daniel40392@gmail.com>

WORKDIR /opt/react-portfolio

ADD . /opt/react-portfolio

ENV PORT 3000

RUN npm install

EXPOSE 3000

CMD npm start
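To sanity-check the image locally, a build-and-run along these lines should do the trick (a minimal sketch; the react-portfolio tag is just an illustrative name):

# build the project as a Docker image and tag it locally
docker build -t react-portfolio .

# run it as a container, mapping the exposed port to localhost:3000
docker run -ti -p 3000:3000 react-portfolio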

There is an NPM package that might help you auto-generate Dockerfiles in future. Check out the dockerfile-generator package available over at NPM.

Scala: The Inspiration API - Code - Article

This project evolved as a basic Scala/Play application built with SBT. This service has a dependency on a Postgres DB that can be set up locally using docker-compose.

Dockerfile

FROM openjdk:8

MAINTAINER Daniel McMahon <daniel40392@gmail.com>

WORKDIR /opt/inspiration-api

ADD . /opt/inspiration-api

ENV SBT_VERSION 0.13.15

# Install sbt
RUN \
  curl -L -o sbt-$SBT_VERSION.deb http://dl.bintray.com/sbt/debian/sbt-$SBT_VERSION.deb && \
  dpkg -i sbt-$SBT_VERSION.deb && \
  rm sbt-$SBT_VERSION.deb && \
  apt-get update && \
  apt-get install -y sbt && \
  sbt sbtVersion

EXPOSE 9000

CMD sbt run

docker-compose.yaml

inspiration:
  container_name: inspiration
  image: postgres:9.6-alpine
  ports:
    - '5432:5432'
  environment:
    POSTGRES_DB: 'inspiration_db'
    POSTGRES_USER: 'user'

In order for the Docker container running the service to communicate with a local DB setup, you will need to run it with the following port setup:

# setup db dependencies
docker-compose up -d
psql -h localhost -U user inspiration_db -f dbsetup.sql

# build and tag image locally
docker build -t inspiration_api:v1 .

# port forwarding Docker to localhost:9000
# note: this seems to be broken atm - I'll update when fixed!
docker run -ti -p 9000:9000 -p 5432:5432 <docker-image-id>

# publish docker image to docker hub
docker push <docker-repo>
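As an aside, until that port issue is resolved, one workaround worth trying (a sketch only; host networking behaves this way on Linux) is the same host-network flag used for the Go project below, which lets the container reach the local Postgres directly:

# run on the host network so the container can reach localhost:5432 (Linux only)
docker run -ti --network="host" <docker-image-id>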

Interested in automating this process? There are a few Scala libraries that will auto-generate and push Docker images for you. Check out marcuslonnberg's sbt-docker plugin.

Golang: Lucas - Code - [article incoming...]

This project is a basic web crawler built using Go and the Colly library. This service has a dependency on a Postgres DB that can be set up locally using docker-compose.

Dockerfile

FROM golang:1.11

MAINTAINER Daniel McMahon <daniel40392@gmail.com>

WORKDIR /opt/lucas

ADD . /opt/lucas

ENV PORT 8000

# installing our golang dependencies
RUN go get -u github.com/gocolly/colly && \
  go get -u github.com/fatih/color && \
  go get -u github.com/lib/pq

EXPOSE 8000

CMD go run lucas.go


docker-compose.yaml

lucas:
  container_name: lucas
  image: postgres:9.6-alpine
  ports:
    - '5432:5432'
  environment:
    POSTGRES_DB: 'lucas_db'
    POSTGRES_USER: 'user'

In order for the Docker container running the service to communicate with a local DB setup, you will need to run it with a network flag in host mode:

# setup db dependencies
docker-compose up -d
psql -h localhost -U user lucas_db -f dbsetup.sql

# build docker image
docker build .

# run docker container and port-forward port 8000
# the host network flag will allow your container to speak to your local DB
docker run -ti -p 8000:8000 --network="host" <docker-image-id>

# publish docker image to docker hub
docker push <docker-repo>

Closing Thoughts

As you can see from the above, it's relatively straightforward to get your applications up and running on Docker. It can be a little tricky when you're first getting used to the concepts behind Docker, but once you get the hang of it, the advantages it offers are fantastic.

There is a super cool tool I stumbled upon this week that will let you analyse the quality of your Docker images and give you a percentage rating for how well optimized they are, i.e. whether you have too much bloat from installed packages etc. Go check it out - the library is called 'dive'.
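Basic usage is as simple as pointing dive at an image tag or ID, for example (assuming the image tagged earlier in this article):

# inspect each layer of the image and get an efficiency score
dive inspiration_api:v1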

The natural next step from 'Dockerising' your applications is to look at deploying them through some sort of CI/CD solution. I hope to take a look at setting up your own Kubernetes cluster using a library called kops and deploying the above services through it.

If you have any feedback, thoughts, or suggestions, feel free to comment below.

See you next time!
