Developing inside a Docker container

A few months ago I got a new computer and I have been very intentional about deciding what I install on it. From past experience I know that computers used as development environments tend to get messy in no time: you install all kinds of libraries, frameworks, and dependencies, you name it, and, to make matters worse, you will probably run into version conflicts for most of those things. A development environment is hardly ever a clean environment, and I don’t know about you, but there are very few things I find more frustrating than wasting time troubleshooting development environment setup. Let me write the code already!

With that in mind, I decided early on that I would avoid installing node.js on this computer, for example. In my experience, Node is notorious for causing lots of headaches with version conflicts. The Node Version Manager (nvm) can only do so much to alleviate the problem, and I find it clunky. So, no, thanks.

“Well then, smarty-pants, how do you do full stack web development these days without nvm?”, you ask me. Excellent question! The answer: Docker.

I’ve written about Docker in the past and I just plain love it. It took me some time to understand what it does and which problems it solves, but once I did, it became my go-to solution to keep things under control: you can isolate a development environment with all the dependencies and the runtime that your project needs. If your friend wants to run your project, they get the container and voilà, the project runs on their computer without them needing to install all the dependencies locally. Beautiful! 😍

So, a few weeks ago I started a new course to learn Gatsby and this was the perfect scenario to test my Docker development environment.

Docker image for a dev environment

The first thing I did was to create a base image with node.js and a few utilities installed. Here’s the Dockerfile for the image I used:

# build with: docker build -f Dockerfile -t image_name .
# run with: docker run -it --name container_name image_name /bin/bash
# run exposing ports and sharing volume:
# docker run -it --name container_name -p 8000:8000 --mount type=bind,src=/your/path/to/local/source/code,dst=/src image_name /bin/bash
FROM debian
ENV DEBIAN_FRONTEND noninteractive
RUN apt-get update
RUN apt-get install -y vim procps curl
# Install Node.js PPA
RUN apt-get install -y software-properties-common
RUN curl -sL https://deb.nodesource.com/setup_14.x | bash -
# Install Node.js
RUN apt-get install -y nodejs

Note about this setup: I use debian as a base image, but if you care about image size, consider using alpine instead.

In the file above, I have also highlighted in the comments how to 1. build the image and 2. run the image (with two options). These are the two steps you need to take to start using this image as a container for your development environment.

Choosing how to run the image

If all you care about is having a “starting point”, or a clean slate if you will, run as the first option suggests. That will put you at a prompt inside the container, in the root folder, where you can then run other install commands.

If you are using this image as a development environment (like I am), you will want to run as the second option (the longer docker run command). This command does two extra things that will be super helpful: 1. it exposes container ports so you can access the project from your browser (more about this later) and 2. it maps the code you are writing in the editor on your computer to a folder inside the container, so the container can “see” the changes to your code. Yes, pretty much essential.

For this example, I have a repository that I cloned from GitHub; it’s a Gatsby application. So I will run the second docker run command, making sure I use the correct path to my cloned repo.
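Putting the comments from the Dockerfile together, the whole workflow looks something like this. The image and container names and the repo path are just placeholders for this sketch; the `docker start -ai` step is a handy extra so you reuse the same container later instead of creating a new one each time:

```shell
# 1. Build the image from the Dockerfile above (name is a placeholder)
docker build -f Dockerfile -t gatsby-dev .

# 2. Run it, exposing Gatsby's default port and bind-mounting the cloned repo
#    (replace the src path with the path to your own clone)
docker run -it --name my_project -p 8000:8000 \
  --mount type=bind,src="$PWD"/my-gatsby-repo,dst=/src \
  gatsby-dev /bin/bash

# Later, after exiting, reattach to the same container instead of running a new one
docker start -ai my_project
```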

Inside the container

Once I have the command prompt inside the container, I can navigate to the place in the repository that holds the package.json file and then run npm install. This will install all the project’s dependencies inside the container.

Next, I can start the development server by running gatsby develop.
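The in-container steps condense to the following (assuming the repo’s package.json sits at the root of the mounted /src folder, and using npx in case gatsby-cli isn’t installed globally):

```shell
cd /src              # the folder bind-mounted by the docker run command
npm install          # install the project's dependencies inside the container
npx gatsby develop   # start the Gatsby development server
```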

I get the message that I can now view my project in the browser:

  Local:            http://localhost:8000/

Not so fast, my friend!

However, when I go to localhost:8000 I get an ERR_CONNECTION_RESET. I tried 127.0.0.1 instead but I still got nothing. If I list my running containers (with docker ps), I see that it’s running on 0.0.0.0 and I thought that 0.0.0.0 was another way to say “127.0.0.1” or “localhost”… Why is it not working? 🤔

└❯ docker ps
CONTAINER ID        IMAGE               COMMAND             CREATED             STATUS              PORTS                    NAMES
8a12a061be10        gatsby              "/bin/bash"         10 minutes ago     Up 2 minutes       0.0.0.0:8000->8000/tcp   my_project

Well, it turns out that when running applications inside a container, localhost is the container itself, not your workstation anymore. So you need to tell the application which host it should serve from. However, containers have dynamic IP addresses, so you don’t know beforehand which IP address the container will get.
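You can see the difference between loopback-only and all-interfaces binding even outside of Docker. This sketch (assuming python3 and curl are available) starts a throwaway HTTP server bound to 0.0.0.0, which makes it reachable on every local address, including 127.0.0.1:

```shell
# Start a throwaway HTTP server bound to all interfaces (0.0.0.0)
python3 -m http.server 8000 --bind 0.0.0.0 &
SERVER_PID=$!
sleep 1
# Because the server listens on 0.0.0.0, a request over loopback succeeds
STATUS=$(curl -s -o /dev/null -w "%{http_code}" http://127.0.0.1:8000/)
echo "$STATUS"   # prints 200
kill "$SERVER_PID"
```

A server inside a container that binds only to 127.0.0.1 is reachable only from inside that container, which is exactly why the published port appeared dead from the host.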

What do I do now?

The fix for this problem is to give the application a “placeholder” IP address: 0.0.0.0, which means “all IPv4 addresses on the local machine”. In this case:

gatsby develop -H 0.0.0.0

Now, the message is different:

  Local:            http://localhost:8000/
  On Your Network:  http://172.17.0.2:8000/

And both these addresses now serve my project! 😊

So this is it. I can now change my code and see the changes on the browser just fine.

Another option

If you use VSCode as your editor, it now has an extension called “Remote - Containers” that will open your repository inside a Docker container for you (no need to build the image yourself) and lets you manage the container from VSCode’s own UI. Note that you still need Docker installed locally for this extension to work.
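As a sketch of how the extension is configured, a minimal .devcontainer/devcontainer.json in the repository could look like the following (the name and image values here are just examples; the field names come from the extension’s configuration reference):

```json
// .devcontainer/devcontainer.json — minimal sketch, values are placeholders
{
  "name": "gatsby-dev",
  "image": "debian",
  "forwardPorts": [8000]
}
```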

One thing to note is that it’s possible to manage port exposure through VSCode, and using this project as a test, I did not need to specify any host for my development server command. The extension provides a way to expose the ports (select the one the project is running on, right-click and “forward the port”):

Port forwarding in Remote-Containers in VSCode

The project is now accessible on 127.0.0.1:8000 in the browser.

For more information on using the VSCode Remote - Containers extension, I recommend this excellent article, which goes into a lot more detail than I did!

I hope this post helps you keep your development environment organized.


If you found this helpful, let me know on Twitter!

The post Developing inside a Docker container was originally published at flaviabastos.ca