Tutorial

Why Containerize Your Gatsby Application?

Published by

Luis Osta


Ease Of Development & Deployment

Anyone who has worked with developers on different operating systems knows the strange bugs and issues that come up due to OS-specific quirks.

Furthermore, managing the dependencies and configurations needed for production deployment can be extremely time consuming.

Docker fixes these issues by:

  • Making it easy to run your application locally, with all of its dependencies, using Docker Compose
  • Letting you declaratively define the system dependencies and infrastructure configuration for each part of your application
  • Allowing you to isolate portions of your application into containers and optimize each container according to its needs (e.g. configuring NGINX for the Gatsby container but not for others)
  • Integrating easily with cloud providers and CI/CD pipelines

There's also the benefit of the ecosystem of available Docker images on Docker Hub, which let you start from good baselines.

You can learn more about the history of containers here:

Why use Docker?

Ease Of Integration With Complex Web Systems

Due to the aforementioned configurability of containers, and the composition that Docker Compose enables, it's easier to isolate the different services of your application and integrate them together.

Docker Allows You To Bake In All Your Configurations

For instance, you could have an NGINX router that passes requests either to your Gatsby client or to your API depending on the request type, then serve your Gatsby client with another NGINX server configured to optimize serving static files (such as by leveraging Brotli compression).

This application utilizes the following containers:

  1. NGINX router container
  2. API container
  3. Gatsby client container

These can be connected together via Docker Compose. Instead of having to create the NGINX setup via the terminal on your production/development/testing server, you can bake it into the containers.
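As a rough illustration, the router container's NGINX config could look something like the sketch below. This is only a sketch: the upstream names client and api, and the API's port, are assumptions that would have to match your own Docker Compose service definitions.

```nginx
# Hypothetical router config. The hostnames "client" and "api"
# are assumed to be Docker Compose service names on the same network.
server {
    listen 80;

    # Send API traffic to the API container (port is an assumption)
    location /api {
        proxy_pass http://api:5000;
    }

    # Everything else goes to the Gatsby client container
    location / {
        proxy_pass http://client:80;
    }
}
```

Because Docker Compose puts services on a shared network, the service names resolve as hostnames, which is what makes this kind of routing config portable across environments.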

You can learn more about Docker Compose here.

Docker Makes It Easy To Scale

Containerizing all the pieces of your application allows you to easily test and deploy them as part of a CI/CD pipeline, especially considering the support Docker has across all major cloud providers.

And if your application has intense requirements and needs to be highly scalable, Docker lets you leverage Kubernetes, which allows even more powerful composition and management of services.


Building Out The Foundations

Prerequisites

  1. Node (Should include NPM)
  2. Docker
  3. Gatsby CLI

If you already have these installed you can skip to the Gatsby Overview section.

Installation

The following section walks through the installation steps for everything we'll need, covering the options available on all of the major operating systems.

Windows

You can download Node and NPM either by stepping through the installer wizard on Windows or by installing a package that lets you manage the specific versions of Node and NPM you're using.

There are two major packages that allow you to achieve this on Windows:


Then, you will have to install Docker on the computer you'll be working from. On Windows, you can install the Docker Desktop application from the Docker Docs.

Note On Docker For Windows

The installation of Docker for Windows requires a Pro or Enterprise edition of Windows, which many developers do not have or want.

In this case, there are three options available for setting up Docker on your computer:

  1. Virtual Box - You can set up VirtualBox on your computer, which lets you run a virtual machine with a version of Linux on your Windows machine. This allows you to run the Linux version of Docker.
    1. Alternatively, you can try to get Docker set up in WSL. Some people have been able to get it to work with the Windows Subsystem for Linux; you can check out this tutorial to try: https://nickjanetakis.com/blog/setting-up-docker-for-windows-and-wsl-to-work-flawlessly
  2. Dual Booting - Alternatively, if you plan to use Docker more extensively and for most of your projects, it may be more beneficial to set up dual booting on your computer, allowing you to have both Windows and your preferred flavor of Linux to use.
    1. Try this tutorial to get set up with dual booting: https://itsfoss.com/install-ubuntu-1404-dual-boot-mode-windows-8-81-uefi/
  3. WSL 2 - The final option is to set up WSL 2 on your computer, which is an involved process but slightly easier than dual booting. You can follow the guide released by Microsoft to set up WSL 2 on your computer.

Mac and Linux

On Mac, you can also step through an installer wizard similarly to Windows. But you can also install a Node version manager.

On UNIX-based systems (such as macOS and Linux), you additionally have access to package managers that aren't available on Windows.


Then you will have to install Docker, which depends heavily on the specific OS and distribution that you have.

  1. Macs

    If your working computer is a Mac, I would advise installing the Docker Desktop application; it will be the easiest way to get set up and working.

  2. Linux

    If your working computer runs Linux, you can follow the installation-from-source steps for your specific Linux distribution.

    Unlike on Mac and Windows, you won't have a Docker Desktop application; you will only have the Docker engine (which offers the same functionality, simply without the GUI).


Gatsby Overview

Once you have Node and NPM installed on your computer, you can install the Gatsby CLI by running the command:

npm install -g gatsby-cli

Although we will run the actual Gatsby server on a Docker container, we will use the CLI to generate a basic site from the available Gatsby Starters.

Cloning The Gatsby Starter

We will be working off the default Gatsby Starter, which you can preview here.

First, we'll create a folder where we'll store the entire application and then generate the Gatsby Starter within that folder.

Open your terminal, move to your preferred working directory and type in the following commands.

Base Folder & Clone The Gatsby Starter

mkdir docker-gatsby
cd docker-gatsby
gatsby new client https://github.com/gatsbyjs/gatsby-starter-default

You can test whether the installation went properly by running the following commands; you should see the default starter at localhost:8000.

cd client
npm install
npm start

Development Container For GatsbyJS

For our purposes, we want our development container to support all of the features that Gatsby provides when running locally, while keeping the container as small as possible.

So we'll make sure to follow these guidelines:

  • Use the minimal base Docker image necessary
  • Download all the system requirements necessary to run Gatsby
  • Allow for live updating of the code

Before we start, you'll have to make a small change to the develop script in your package.json.

Change the script to this:

"develop": "gatsby develop -H 0.0.0.0"

The -H flag sets the host that the Gatsby server will listen on. The default host is localhost (127.0.0.1), but since our Gatsby server will be inside a Docker container, it won't be accessible to the outside world.

Instead, the host 0.0.0.0 will allow us to access the Gatsby server running inside the container. Similarly, it'll also allow other Docker containers to interact with the Gatsby server. You can read more about the difference between localhost and 0.0.0.0 here.
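For reference, in the default starter the start script simply delegates to develop, so a container that runs npm run start will pick up the new flag. Below is a sketch of what the relevant scripts section might look like after the change; check your own package.json, as starter versions differ.

```json
{
  "scripts": {
    "develop": "gatsby develop -H 0.0.0.0",
    "start": "npm run develop",
    "build": "gatsby build",
    "serve": "gatsby serve"
  }
}
```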

Here's the development Dockerfile that will serve our needs; we'll walk through each part and what it achieves.

Dockerfile.dev

# The minimal base image we need for Node.js
FROM node:alpine
WORKDIR /app

# Copy the package.json file, update any deps and install them
COPY package.json .
RUN npm update
RUN npm install

# Copy the whole source folder (the path is relative to the Dockerfile)
COPY . .

CMD [ "npm", "run", "start" ]

You can read more about the differences between using the RUN command and the CMD command here.

We'll pair the above Dockerfile with Docker Compose for easy management of all of the containers we'll use for our application.

docker-compose.yml

version: "3"
services:
  client:
    build:
      context: ./client
      dockerfile: Dockerfile.dev
    volumes:
      - ./client/src:/app/src # Links the source files to the running container
    ports:
      - "3000:8000"

Build

The build configuration allows you to directly specify how you want Docker Compose to build the container. In the above example, we give it a context and the dockerfile.

The context specifies the directory Docker will use as the build context for that service. The dockerfile option allows us to specify the filename (including extension) where we defined our Docker image.

Volumes

The volumes section links the src folder in the Docker container with your local version. This allows you to make live changes without having to restart your container.

Important Note On Volumes

There are a lot of different ways to set up a volume for development, each with different trade-offs.

The one above links only the source files, so when you want to install a new package you will have to rebuild the container.

To do this you can simply run:

docker-compose up --build

This is because the package.json and every other file aren't linked, so any updates will not be reflected in the running container.

If you instead link the whole client directory to the container, the package.json will be updated.

But in order to use a newly installed package, you will have to run npm install inside the running container.

You can do this by running

docker exec -it <INSERT CONTAINER ID> sh

And then executing npm install within that shell.

Why Only Link the Source

After trying every approach I could find, I ultimately decided, for development, to link only the source code for a few reasons.

  1. The reason we have the volumes is so we can update the running container, which the vast majority of the time only involves the src/ folder.
  2. When updating other Gatsby-specific files, you're still going to have to restart the Gatsby server, which means you'll have to stop the Docker container anyway.
  3. Linking the whole client directory causes the .cache and public folders to be mirrored locally, which can also cause strange issues when stopping a container, leaving dozens of cache files to be deleted individually.
  4. By only linking the source folder, you can have the development dependencies downloaded locally (to be used by VS Code or another text editor) without interfering with your running container, or vice-versa.

There are valid reasons why you'd want to configure your volumes differently.

But I've found that the above configuration balances developer experience and best practices quite well when developing Gatsby applications.
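For comparison, the whole-directory approach usually looks something like the sketch below, where an anonymous volume keeps the container's installed node_modules from being hidden by your local folder. The entries here are only illustrative; the paths are relative to wherever your Compose file lives.

```yaml
# Illustrative only: linking the entire client directory instead of just src
services:
  client:
    volumes:
      - ./client:/app     # Link the whole client directory
      - /app/node_modules # Anonymous volume so local files don't hide installed modules
```

The anonymous volume is the key trick in this pattern: without it, the bind mount would shadow the node_modules installed during the image build.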

If you think you've found a way to configure your Docker Compose in a smoother and more intuitive way, let me know, either below in the comments or by reaching out to me personally.

Ports

The ports section maps ports inside the Docker container to externally accessible ones. So in our case, we will access our Gatsby application through localhost:3000.

Running Development

Once you have all of your files set up, you can start your containers with the following command:

docker-compose up

Or, if you need to rebuild your Docker image (the Dockerfile.dev):

docker-compose up --build

The first time you build the Docker image and run docker-compose up, it will take a long time, but every run afterward will be orders of magnitude faster thanks to Docker's layer caching.
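One easy way to keep those builds fast and the image small is a .dockerignore file next to the Dockerfile, so that COPY . . doesn't pull in local build artifacts. A minimal example is below; the exact entries are an assumption and depend on your project.

```
node_modules
.cache
public
.git
```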

Production Container For GatsbyJS

A good heuristic when developing Docker containers is to assume that your host machine isn't able to install the application's dependencies or build the application itself.

This will help you develop containers that can be easily re-used and deployed to production.

Changes From Development To Production

In order to move our website into production, we will have to serve the output of the Gatsby build via NGINX. For this, we'll create two new files to be used in production:

  • Dockerfile
  • docker-compose.prod.yml

Our Docker Compose file will remain mostly unchanged; most of the changes will be to the Dockerfile.

Dockerfile

FROM node:alpine as builder

WORKDIR /app

COPY package.json .
RUN npm install
COPY . .
RUN ["npm", "run", "build"]

FROM nginx
EXPOSE 80
COPY --from=builder /app/public /usr/share/nginx/html

Builder Phase

In a Dockerfile, you can specify different stages with different base images. These stages are introduced with the FROM keyword.

This allows us to easily build our website and then copy the output files over to the NGINX server.

Serve Files Via NGINX

Gatsby writes its build output to the public folder in the working directory. Hence, we can just copy it over to the default directory that NGINX checks to serve static files.

You can review more about the default directories (yes there are multiple) here.

We utilize NGINX since it's an incredibly fast web server that's great for request routing and serving static files. It's also incredibly configurable, allowing you to use compression algorithms to further optimize performance.
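As an example of that configurability, stock NGINX ships with gzip support that you can enable with a few directives (Brotli requires an extra module, so it's not shown here). This is a minimal sketch; the file name and placement are assumptions.

```nginx
# Hypothetical snippet (e.g. dropped into /etc/nginx/conf.d/) enabling gzip
gzip on;
gzip_types text/css application/javascript application/json image/svg+xml;
gzip_min_length 1024; # Skip compressing tiny responses
```

To use it, you would add a COPY line for the config file to the NGINX stage of the Dockerfile above.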

docker-compose.prod.yml

version: "3"
services:
  client:
    build:
      context: ./client
      dockerfile: Dockerfile
    ports:
      - "80:80"

Docker Compose In Production

There are very few changes to our Compose file, only the dockerfile and the port mapping.

Though for more complex applications, your production configuration may require more complex environment variable and logging changes.

There are more configurations that you can keep in mind when using Docker Compose in production, which can be found in the official docs.

Why Change The Port Mapping

The port mapping change is required since port 80 is the default HTTP port, which allows the website to be accessed without having to specify a port.

Running Production

Once you're done working on your Gatsby website and finished with development, you can run the production version of your website via:

docker-compose -f docker-compose.prod.yml up

And make sure to add the --build flag if you have updated your Gatsby website:

docker-compose -f docker-compose.prod.yml up --build

Conclusion

Congratulations! Now you should have a smooth development environment for your Gatsby applications and a production configuration ready for deployment.

Folder Structure

client/
	...
	Dockerfile.dev
	Dockerfile
docker-compose.yml
docker-compose.prod.yml

The above is the general structure that you should have in the end.

Further Steps

If you need more performance or have a more complex application, you should look into other services and configurations that you can include to serve your needs.

Here are some ways that you could expand on this setup:

  • Create an NGINX reverse proxy (separate from the one serving the Gatsby site) to send requests to either the Gatsby client or an API
  • Optimize the Gatsby NGINX configuration to use more robust compression algorithms (such as Gzip or Brotli) and other performance optimizations
  • Create a file watcher that will automatically update a running container's npm modules (for development)
