Ever wondered how you can make anything run on autopilot?
That question is where this journey began!
As of today, I have completed multiple projects. For the sake of focusing on learning Docker, I used a simple Node.js project that I developed in Term 3 at Make School. The project is an API that serves as a database: it stores HTML components built with Bootstrap and returns them as JSON, either individually by name or as a list. Such components are used very often when developing an HTML website. Think of navigation bars, image carousels, cards, and footers; have you ever used them to build your website?
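To make the idea concrete, here is a minimal sketch of how a component-lookup API like this might store and serve components. The component names, object shape, and function names are my own illustrative assumptions, not the actual Bootstrap API code:

```javascript
// Hypothetical in-memory store of Bootstrap components (shape is assumed).
const components = {
  navbar: { name: "navbar", html: '<nav class="navbar navbar-light">...</nav>' },
  footer: { name: "footer", html: '<footer class="footer">...</footer>' },
};

// Return a single component as a JSON-serializable object by name,
// or null if the name is unknown.
function getComponent(name) {
  return components[name] || null;
}

// Return the full list of stored components.
function listComponents() {
  return Object.values(components);
}
```

In an Express route, `getComponent` would back an endpoint like `GET /components/:name` and `listComponents` a `GET /components` endpoint, each responding with `res.json(...)`.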
The purpose of building Bootstrap API is to serve as a library for my side project, web.io: a website builder that uses voice recognition and Bootstrap components to build elegant HTML templates. There is no need to write code; you can click or drag components onto the canvas, or use voice commands to achieve the same goal.
- Voice Recognition Bootstrap Builder
Together, Bootstrap API and web.io allowed me to reduce code complexity and concentrate my attention on specific modules of the code. Once a successful connection between the two was established, my overall idea was achieved, and my project was standing as a bike model.
Problem & Solution
To move my project from a bike model to a car model, I had to solve the connection problem between web.io and Bootstrap API. Too many calls were being made from the builder canvas to fetch elements from the API, and Bootstrap API was struggling to respond to the staggering number of requests coming from web.io. Luckily, I took BEW 2.2: Docker, Deployments, & DevOps at Make School.
"Docker enables developers to easily pack, ship, and run any application as a lightweight, portable, self-sufficient container, which can run virtually anywhere. As Bottomley told me, 'Containers gives you instant application portability.'" (Steven J. Vaughan-Nichols for Linux and Open Source)
Docker enables me to build a container image and use that same image across every step of the deployment process. In other words, using Docker means my images run the same no matter which server, or whose laptop, they are running on. For developers, this means less time spent setting up environments and debugging environment-specific issues, and a more portable, easy-to-set-up codebase. It also makes production infrastructure more reliable and easier to maintain.
- Docker Containers
From a security point of view, Docker ensures that applications running in containers are completely segregated and isolated from each other, granting me complete control over traffic flow and management. "No Docker container can look into processes running inside another container. From an architectural point of view, each container gets its own set of resources ranging from processing to network stacks." (Ekaterina Novoseltseva)
Docker builds images automatically by reading the instructions from a Dockerfile, a text file that contains, in order, all the commands needed to build a given image. FROM sets the base image that every later layer builds on; in my case, FROM node:13.3 sets the base image to Node version 13.3. WORKDIR sets the working directory inside the image; I used /app. COPY copies files from the host into the image, so I first used it to copy my package.json into /app. RUN executes a command during the build, and I used it to run npm install. Then I copied the whole project into /app and used EXPOSE to declare the port that the main project listens on. Finally, CMD specifies the command to run when the container starts; in this case, node index.js runs the project.
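The steps above can be sketched as a Dockerfile. The base image and file names come from the description; the port number is an assumption for illustration:

```dockerfile
# Base image: Node.js version 13.3
FROM node:13.3

# Work inside /app in the image
WORKDIR /app

# Copy the dependency manifest first so npm install can be cached
COPY package.json /app

# Install dependencies during the build
RUN npm install

# Copy the rest of the project into /app
COPY . /app

# Declare the port the app listens on (3000 is an assumed value)
EXPOSE 3000

# Command to run when the container starts
CMD ["node", "index.js"]
```

Copying package.json and running npm install before copying the rest of the project is a common Docker pattern: it lets the dependency layer be cached so code-only changes don't trigger a full reinstall.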
"Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a YAML file to configure your application's services. Then, with a single command, you create and start all the services from your configuration." (Docker docs)
In the docker-compose file, I declare the Compose file format version and two services. The web service specifies the image name (docker-bootstrap-api), a build instruction that builds the project as a whole, the command that runs the project, the port mapping, a volume that keeps the data available, and a depends_on entry stating what the service needs before it can run.
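A docker-compose.yml along those lines might look like this. The service names, ports, and the choice of MongoDB for the second service are assumptions based on the description, not the project's actual file:

```yaml
version: "3"
services:
  web:
    image: docker-bootstrap-api   # image name from the description
    build: .                      # build the project as a whole
    command: node index.js        # command that runs the project
    ports:
      - "3000:3000"               # host:container port mapping (assumed port)
    volumes:
      - .:/app                    # keep the project data available in the container
    depends_on:
      - db                        # web needs the database service running first
  db:
    image: mongo                  # assumed database service
    ports:
      - "27017:27017"
```

With this file in place, a single `docker-compose up` builds and starts both services together.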
The biggest challenge I faced using Docker was debugging. When running the project locally without Docker, it performed well; inside Docker, however, it would not display in the browser. Debugging the error message was not easy, since there was no clear indication of where the error was coming from. Through trial and error, I figured it out: an assert statement in my project's test files was producing the error, and the port mapping in docker-compose did not match the port Mongoose was connecting to. Removing the assert statements and mapping the ports correctly solved the issue.
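One way to avoid that kind of mismatch is to build the database connection string from environment variables, so the same code works both locally (hostname localhost) and inside Compose, where the service name acts as the hostname. This is a hedged sketch; the "db" service name, port, and database name are illustrative assumptions:

```javascript
// Build the MongoDB connection string from the environment so the same
// code runs locally and inside docker-compose without edits.
function mongoUri() {
  const host = process.env.MONGO_HOST || "localhost"; // "db" when inside Compose
  const port = process.env.MONGO_PORT || "27017";
  const dbName = process.env.MONGO_DB || "bootstrap-api"; // assumed database name
  return `mongodb://${host}:${port}/${dbName}`;
}

// Usage with Mongoose (not executed here):
// mongoose.connect(mongoUri());
```

In the Compose file, the web service would then set `MONGO_HOST=db` in its environment, keeping the connection string and the port mapping in one place.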