How to Use Docker for Node.js Development?


Docker arrived on the tech and development scene in 2013. Since then, it has become one of the hottest trends in software development.

The popularity of Docker is clear from the fact that tech corporations large and small have adopted it. Many developers rely on it every day, yet plenty have never used Docker or explored its intricacies.

This article is for them.

If you are about to use Docker for Node.js development for the very first time, this article will guide you through the whole process.

What is Docker?

Let's first get to the basics and understand the fundamental fabric of this topic: Docker.


Docker is an open platform for building and developing distributed applications. With Docker, you can also ship and run those applications.

Docker gives developers and programmers the fundamental tools they need to take advantage of the distributed, networked nature of modern applications.

Put in simpler terms, Docker is not just an idea but an abstraction: an abstract layer over low-level operating system tools.

It is these tools that let developers run a containerized app development process on a couple (or more) of virtualized Linux instances.

Why Should You Use Docker?

There are multiple reasons why you should choose Docker over everything else.

Of those, here are the top four reasons:

  • Its consistency
  • Its speed
  • Its isolation
  • Its layered file system

Advantages Of Docker

Consistency refers to the identical environment and ecosystem your application is given during development and production. That is the reason why every Node JS Development Company loves this feature.

Speed refers to how quickly you can run a completely new process on the server. Docker gives you the benefit of preconfigured images, which remove the challenge and complexity of setting the whole process up from scratch every time.

Isolation refers to the default behaviour of every Docker container. Each container runs isolated from the others, with its own network, file system, and processes.

The layered system refers to the way new layers are added to your files. Starting from the base image, every change you make during development becomes a new layer on top of the overall file system.

Because these layers are cached, repeated builds can reuse them, which cuts down on redundant work during development.

The Ultimate Advantage

Whenever you opt for Docker, you choose an API and a container management system that benefit developers as well as system administrators. You are always at an advantage when choosing Docker:

  • An agile application development process
  • Extreme portability across a multitude of devices
  • Compatibility across devices and networks
  • Reusable components
  • The ability to control layers and versions
  • Easy and remote sharing
  • Minimal overhead
  • Easy maintenance

Simplifying Docker-based Node.js Development

When it comes to active Node.js application development, Docker can be incorporated to simplify your entire workflow and deployment process.

As we have stated above, the benefits you gain when you opt for Docker for Node.js development are abundant.


The development ecosystem is exceptionally consistent, which gives you the advantage of choosing your language and versions without any conflicts on your system.

The ecosystem is also pretty isolated, in the sense that it becomes easy for you, as a developer, to troubleshoot major and minor issues under all kinds of circumstances.

The portability of the environment here also allows developers and programmers to package and share the codebase with anyone on the planet.

The benefits of using Docker for Node.js in web app development are therefore substantial, which is why it is so important to learn how to set up and use the environment for Docker-based Node.js development.

The Tutorial

What follows is divided into two major parts: development and testing.

Let's first cover the development workflow and the Docker Compose setup.

To get started, you will need a few essential items:

  • Docker Compose
  • Docker Community Edition, and
  • Todo App

Here, to fulfill the requirements of the tutorial, the Todo app acts as a stand-in and can be replaced with your own application.

Once you have all of the items mentioned above in place, you can move on to the next step.

Creating a Dockerfile

The Dockerfile is the foundation of any Dockerized application. It contains all the instructions needed to build the application image, starting, in our case, with installing Node.js.

Create a new Dockerfile in the root directory of the application.
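You can create it from the terminal, just like the docker-compose.yml file created later in this guide (the file is conventionally named Dockerfile, with no extension):

/> touch Dockerfile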

Once done, open this newly created Dockerfile in your editor of choice.

The first instruction tells Docker to start from one of the pre-built Node.js images. You have a variety of options here; choose the node:7.7.2-alpine image.
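At this stage the Dockerfile holds nothing more than that base-image declaration, the same first line that appears in the full file further below:

FROM node:7.7.2-alpine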

So far, with only this instruction in place, you will not be able to do much with your Dockerfile.

However, it already lets you run a build, even though not much happens. Running docker images afterwards will show the newly created image alongside all the images you already have.
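For example, a build followed by a listing might look something like this (the build is untagged here, so Docker will list the new image by its ID):

/> docker build .
/> docker images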

This Dockerfile will, of course, require more instructions to build the application. Currently, the only thing the image contains is an installed copy of Node.js.

However, to run the application code smoothly inside the container, you will need to add a few more instructions and build the image again. If you would rather not handle this yourself, you can always hire a NodeJS Developer from a reputed firm.

The extra instructions to add are:

FROM node:7.7.2-alpine
WORKDIR /usr/app
COPY package.json .
RUN npm install --quiet
COPY . .

In other words,

You first set the working directory to /usr/app and then copy the package.json file into it.

Next, npm install creates node_modules inside the image, and then all the files from the project root are copied into /usr/app.

Once all of this is done, you can run a Docker build to see the results.
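For instance, you might tag the image so it is easier to find in the listing; the todoapp name below is just an illustrative choice:

/> docker build -t todoapp .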

The image described by the Dockerfile has now been created successfully.

Things look smooth so far, but there are still some loose ends to connect.

Add Docker Compose

For a smooth development process, and to connect all those loose ends, we also need to incorporate a docker-compose file.

In the development environment, we need the Node app to run alongside a database, and building on the official images is a great option.

An application service will be created and connected to the database, so that we can run the specific setup commands needed to create the todo list's schema.

Here, create a docker-compose.yml file first.

/> touch docker-compose.yml

The docker-compose file defines how the Docker containers should run, based on the configuration it contains.

A significant thing to understand here is that Docker Compose bridges build time and runtime.

Until now, images have been built with docker build, which is the build time: the point at which the image is produced. Runtime, by contrast, is when the containers are actually created and used.

Docker Compose triggers the build, instructing the images on how to be built, while also populating runtime data such as environment variables.

In your editor, open the docker-compose file and paste the following:

version: '2'
services:
  web:
    build: .
    command: npm run dev
    volumes:
      - .:/usr/app/
      - /usr/app/node_modules
    ports:
      - "3000:3000"
    depends_on:
      - postgres
    environment:
      DATABASE_URL: postgres://todoapp@postgres/todos
  postgres:
    image: postgres:9.6.2-alpine
    environment:
      POSTGRES_USER: todoapp
      POSTGRES_DB: todos

This may take a moment to unpack, but it is easier than it looks.
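Before unpacking it section by section, you can optionally ask Compose to parse the file, which is a quick way to catch indentation mistakes:

/> docker-compose config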

Web Service

The first directive for the web service is to build the image based on the Dockerfile. This way, the image can be recreated whenever you need it.

By default, the image is named after the project you are working on. Once that is done, you provide a specific set of instructions for how the container should operate.

Those instructions are:

  • command: npm run dev, the command used to run the application when the container starts.
  • volumes: the mounts that bridge the gap between the container and the host:
    • .:/usr/app/ mounts the project root onto the container's working directory.
    • /usr/app/node_modules keeps node_modules inside the container, using the directory created at build time instead of the one on the host machine.

Postgres Service

Docker Store houses pre-built images for PostgreSQL, just like the Node.js image used earlier.

Instead of using the build directive, you can specify an image name and Docker will fetch that image for you, ready to use.

The image's built-in environment variables let you customize it, which is how the database user and database name are set above.
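As an optional check once the stack is running (see the next section), you can open a psql shell against this service, using the user and database defined above:

/> docker-compose exec postgres psql -U todoapp todos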

Running the Application

Once the services are defined, the application can be built and started with docker-compose up. You will see the images being built, and the containers will eventually start.

Once the initial build completes, you will see the names of the containers being created.

At this point your application is running, and you will see the log output in the console.

You can also run the services in the background with docker-compose up -d.

If you prefer to run the process in the background but still want to view the logs, you can use docker-compose logs.

Once all of this is done, open a new command prompt and run docker-compose ps to view the running containers.
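Putting those commands together, a typical background workflow looks something like this:

/> docker-compose up -d
/> docker-compose logs
/> docker-compose ps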

It will list details such as the name of each service, the command used to start it, its current state, and its ports.

A package.json script can be used to build the code automatically and to migrate the schema to PostgreSQL efficiently.
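For instance, such a script could be executed as a one-off command against the web service; the migrate script name below is hypothetical and assumes your package.json defines it:

/> docker-compose run --rm web npm run migrate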

The data, as well as the schema, will be preserved and persist until the containers are removed.

However, it is a great idea to check whether your application builds and sets itself up cleanly from scratch.

Running docker-compose down will clear everything out, so you can start building the app again from a clean slate.

That gives you a fresh look at exactly what happens when everything starts from scratch.
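Tearing down and starting fresh is then just a couple of commands:

/> docker-compose down
/> docker-compose up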


Application Testing

The application testing process involves running specific integration tests with the help of Jest.

There are a few ways to move forward with testing your application, including creating a Dockerfile.test or a docker-compose.test.yml file that caters specifically to the test environment.

In the current setup, you can reuse the existing containers and run them under a project name.

That name may or may not be the directory name. If you run commands under the same project, the existing containers are simply restarted.

However, this might not be the best way forward. To keep things clean, you can run the tests under a different project name.

That way, the testing process is broken off into its own fragment, isolated with its own environment.

One more thing to understand here: because containers are short-lived rather than permanent, running your testing process in a separate container ensures that your application behaves the way it is supposed to in a clean environment.

To do this, run the following command in your terminal:

/> docker-compose -p tests run -p 3000 --rm web npm run watch-tests

Once you do this, you will see Jest run through the integration tests and then wait for changes.

The docker-compose command here accepts multiple options, followed by the command to execute.

Here, -p tests runs the services under the project name tests, while run executes a one-time command against a service.

Now, because the docker-compose.yml file already maps a specific host port for the web service, the run command is given its own port option.

Following the example presented here, -p 3000 publishes the container's port 3000 on a completely random host port, which prevents any port collisions.

Moreover, the --rm option removes the containers once they stop. Finally, npm run watch-tests is the command executed inside the web service.
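If you prefer a single pass rather than watch mode, the same pattern works for a one-off run; this assumes your package.json also defines a plain test script:

/> docker-compose -p tests run --rm web npm test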

Conclusion

This tutorial has given you an in-depth look at how Docker containers can be used for developing Node.js applications.

By doing this, you have made your project portable and modular: sensitive information is extracted into configuration, and the state of your application is decoupled from the code itself.

The docker-compose.yml file can also be reviewed and revised in the same way, so you can adapt the development process to your own preferences.

You now have a strong head start with Docker Compose for Node-based app development.



