Hacker's Handbook


Dev Containers: Consistency in Development

Finding Freedom in Confinement

Posted: 2023-07-24

Dev Containers: Turtles all the way down

Dev Containers might just be the key to streamlining your development workflow. These tools provide a consistent, isolated, and reproducible development environment. No more "but it works on my machine" excuses. With dev containers, if it works on one machine, it works on all.

In this article, we'll explore what dev containers are, their benefits, and how to set one up. We'll delve into the nuts and bolts of creating and configuring your own dev container, and by the end, you'll be equipped with the knowledge to bring your development environment into the 21st century.

[Image: A Lego turtle]

If you've ever heard the phrase "It's turtles all the way down", you'll find it surprisingly relevant here, especially if you're working on Windows in WSL2 or on a Mac, where dev environments are usually Linux-based, much like the live environment. But more on that later.

Back in the early 2000s, I was part of a team working on Simics, a cycle-accurate full system simulator. Simics was a game-changer in the world of software development. It introduced the ability to run programs backwards, a feature that was nothing short of revolutionary. This meant that developers could step back through their code execution, making it easier to identify and fix bugs.

But that's not all. Simics also made it possible to simulate hardware that didn't even exist yet. This was a significant breakthrough, as it allowed software development to proceed in parallel with hardware development.

For instance, Microsoft was able to develop a 64-bit version of Windows before the 64-bit hardware was even available. This was a significant advantage, as it allowed Microsoft to hit the ground running when the hardware was finally released.

IBM also leveraged Simics to develop a 64-bit version of their operating system OS/2. This parallel development of hardware and software was a significant shift in the industry, and it was all made possible by Simics.

Around the same time, VMware and VirtualBox were also making waves in the world of virtualization. VirtualBox allowed developers to run multiple operating systems on a single machine, which was a significant step forward in terms of flexibility and efficiency.

VMware's technology was revolutionary at the time. It allowed businesses to partition a physical server into multiple virtual machines, each capable of running its own operating system and applications. This meant that businesses could get more value from their physical servers, reducing costs and improving efficiency.

These tools, Simics, VMware, and VirtualBox, were the precursors to the easy virtualization we enjoy today with Docker. They laid the groundwork for what we now know as dev containers, and their influence can still be seen in the way we develop software today.

Before we get into the how of running dev containers, let's talk about the why. Why should you consider using dev containers? What makes them worth the effort of setting up? Well, let's find out.

Here are the main benefits:

Consistency: The "it works on my machine" syndrome is a developer's nightmare. Dev containers squash this issue. Everyone works with the same environment, eliminating the gap between local and production setups. Moreover, if done right, this consistency extends to testing and live environments, ensuring that your application behaves as expected across all stages. At Happi Hacking we are experts in doing this right, and we would be happy to help you set things up the right way.

Quick Setup: Setting up a new project or jumping between projects can be time-consuming. With dev containers, you can get up and running in no time. No need to install and configure each dependency manually, the container has it all ready for you.

Simplified Onboarding: New team member? No problem. With dev containers, they can hit the ground running. Pull the container, start coding. It's that simple.

Collaboration: Sharing your work environment is as simple as sharing your container configuration. This facilitates collaboration, whether you're pair programming or debugging.

Dependency Management: Dev containers keep your project and its dependencies in their own neat little box. This means no more conflicts between projects that require different versions of the same dependency.

What is a Dev Container?

A Dev Container, or Development Container, is a virtual environment tailored for software development. It's a self-contained unit that houses your project and all its dependencies. Think of it as a mini-computer, living inside your actual computer, that you can set up to match your project's needs exactly.

Dev Containers work by leveraging containerization technology. If you're familiar with Docker, you're halfway there. Docker allows us to package an application along with its environment into a container. A Dev Container takes this a step further and packages not just the application, but the entire development environment.

This means your Dev Container includes the specific version of the programming language you're using, any libraries or frameworks your project depends on, and even the development tools and extensions you need. Everything is pre-configured and ready to go.

The beauty of this is that a Dev Container is portable. You can share it with your team, ensuring everyone is working in the same environment. You can run it on different machines, knowing it will behave the same way. You can even version control it, so you can go back to a previous setup if needed.

Creating and Optimizing Your Dev Containers: A Practical Guide

Setting up a basic dev container is a straightforward process. You start by installing Docker on your machine.

Installing Docker on Windows (WSL2):

  1. Install Windows Subsystem for Linux (WSL) and upgrade to WSL2. You can follow Microsoft's official guide to do this.
  2. Download Docker Desktop for Windows from Docker's official website and install it. During installation, ensure that you select the option to use WSL2 instead of Hyper-V.
  3. After installation, Docker Desktop will automatically use WSL2.

Installing Docker on Mac:

  1. Download Docker Desktop for Mac from Docker's official website.
  2. Open the Docker.dmg file you downloaded and drag the Docker app to your Applications folder.
  3. Open Docker Desktop from your Applications folder. You'll see a whale icon in your top status bar indicating that Docker is running.

Installing Docker on Linux:

  1. Update your existing list of packages:
sudo apt-get update
  2. Install a few prerequisite packages which let apt use packages over HTTPS:
sudo apt-get install apt-transport-https ca-certificates curl software-properties-common
  3. Add the GPG key for the official Docker repository to your system (note that apt-key is deprecated on recent Ubuntu releases; Docker's current documentation instead stores the key under /etc/apt/keyrings):
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
  4. Add the Docker repository to APT sources:
sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
  5. Update the package database with the Docker packages from the newly added repo:
sudo apt-get update
  6. Make sure you are about to install from the Docker repo instead of the default Ubuntu repo:
apt-cache policy docker-ce
  7. Install Docker:
sudo apt-get install docker-ce

Remember, Docker commands usually require sudo privileges. To avoid typing sudo every time you run a Docker command, add your username to the Docker group:

sudo usermod -aG docker ${USER}

You'll need to log out and log back in for this to take effect.
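
To verify that the installation works, you can run Docker's hello-world test image:

docker run hello-world

If you see a greeting message, Docker is installed and able to pull and run images.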

Creating Your First Dockerfile

Once Docker is up and running, you can create a Dockerfile. This file is a text document that contains all the commands a user could call on the command line to assemble an image.

Here's a simple Dockerfile example:

# Use an official Python runtime as a parent image
FROM python:3.7-slim

# Set the working directory in the container to /app
WORKDIR /app

# Copy the current directory contents into the container at /app
COPY . /app

# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir -r requirements.txt

# Make port 80 available to the world outside this container
EXPOSE 80

# Run app.py when the container launches
CMD ["python", "app.py"]

In this example, we're setting up a simple Python application. We specify the parent image we're using (Python 3.7 slim), set the working directory to /app, copy our current directory into the container, install the necessary packages, expose the necessary port, and finally, specify what command to run when the container launches.
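
To try the example out, you can build and run the image. The tag my-python-app below is just an illustrative name; pick whatever fits your project:

# Build the image from the Dockerfile in the current directory
docker build -t my-python-app .

# Run it, mapping port 80 in the container to port 8080 on the host
docker run -p 8080:80 my-python-app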

Dockerfiles are incredibly flexible. You can create a Dockerfile for any application, regardless of the technology stack. This flexibility is one of the main benefits of using Dockerfiles to create your dev containers.

When it comes to best practices for working with dev containers, here are a few tips:

  1. Keep your Dockerfiles lean and efficient. Avoid installing unnecessary packages and clean up after yourself to keep the image size down.

  2. Use .dockerignore files. These work like .gitignore files. They prevent unwanted files from being added to your Docker images. (Tips 1 and 2 are sketched in the example after this list.)

  3. Build your applications to be environment-agnostic as much as possible. This means minimizing the number of environment-specific configurations you need.

  4. Use environment variables for configuration. This allows you to keep sensitive information out of your Dockerfiles.

  5. Regularly update your images to get the latest security patches. You can automate this process with CI/CD pipelines.
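
To make tips 1 and 2 concrete, here is a minimal sketch. The package names and ignored paths are placeholders, not recommendations for any particular project:

# In the Dockerfile: install and clean up in the same layer,
# so the apt package lists never end up in the final image
RUN apt-get update && \
    apt-get install -y --no-install-recommends build-essential && \
    rm -rf /var/lib/apt/lists/*

# In .dockerignore: keep local clutter out of the build context
.git
node_modules
*.log
.env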

Remember, the goal of using dev containers is to make your development workflow more consistent, isolated, and reproducible. Keep this in mind as you build and work with your dev containers.

For larger projects, where you might want to combine several development environments, such as C, C++, Erlang, Elixir, JavaScript, and React, in a single container, the setup can become somewhat involved. This is where Happi Hacking can be of service. We help you set up efficient, custom-made dev environments.

As we'll soon see, this concept can be expanded to bring entire execution environments to the developer's machine using tools like Docker Compose or Minikube.

Pre-built Dev Containers: A Quick Start

Pre-built dev containers are a great way to get started quickly. These are ready-made containers available on Docker Hub or other repositories that come with all the necessary tools and configurations already set up. You can simply pull these containers and start using them for development without having to worry about the setup process.

For instance, if you're developing a Node.js application, you can pull a Node.js dev container that comes with Node.js, npm, and other necessary tools already installed. This can save you a lot of time and effort.
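
As a rough sketch, pulling and stepping into such a container could look like this (the node:20 tag is just an example; use whatever version your project needs):

# Pull the image and start a shell with your project mounted at /usr/src/app
docker pull node:20
docker run -it --rm -v "$PWD":/usr/src/app -w /usr/src/app node:20 bash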

Moreover, at Happi Hacking, we offer a custom Docker repository tailored to your organization's needs. We can provide pre-built dev containers equipped with the tools and configurations that your team uses regularly. This can further streamline your development workflow and ensure consistency across your team.

Visual Studio Code Dev Containers: Seamless Setup and Integration

Visual Studio Code (VS Code) has a feature called Dev Containers that takes the convenience of pre-built dev containers to the next level. This feature allows you to define your development environment as code using a combination of a Dockerfile and a .devcontainer.json configuration file.

Once you've defined your dev container, VS Code can automatically build and run the container, and then open your project inside the container environment. This means you can start coding immediately, with all your tools and dependencies already set up and ready to go.

The real beauty of VS Code Dev Containers is the integration with the VS Code editor. You can use all VS Code features and extensions inside the dev container, just as if you were working locally. This includes IntelliSense code completion, debugging, version control, and more.

Moreover, VS Code Dev Containers support both single-container and multi-container configurations with Docker Compose. However, it's important to note that you can only connect to one container per VS Code window. This means you can set up complex development environments involving multiple services, all running in separate but interconnected containers, but you would need to open a separate VS Code window for each container you want to connect to.

Customizing Your Environment with .devcontainer.json

A devcontainer.json file is a configuration file that defines the development environment for Visual Studio Code when using Dev Containers. It typically lives in a folder named .devcontainer at the root of your project (or directly at the root as .devcontainer.json).

Here's a basic example of a devcontainer.json file:

{
    "name": "A Happy Project",
    "image": "erlang-26.0.2.0",
    "forwardPorts": [8080]
}

In this example:

  • "name": This is the name of your development environment. It's displayed in the lower left corner of VS Code when the dev container is active.

  • "image": This is the Docker image that the dev container will use. In this case, it's using a pre-built image for Erlang.

  • "forwardPorts": This is an array of ports that will be forwarded from the dev container to the host machine. In this case, port 8080 is being forwarded, which is common for web development servers.

These are just some basic options. The devcontainer.json file can include many other options for more advanced scenarios, such as mounting volumes, setting environment variables, and even using Docker Compose to define multi-container environments.
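
As a sketch of some of these options (the image tag, mount path, post-create command, and extension ID below are illustrative assumptions, not requirements):

{
    "name": "A Happy Project",
    "image": "erlang:26",
    "forwardPorts": [8080],
    // Environment variables visible inside the container
    "containerEnv": {
        "APP_ENV": "dev"
    },
    // Bind-mount a host directory into the container
    "mounts": [
        "source=${localWorkspaceFolder}/data,target=/data,type=bind"
    ],
    // Runs once after the container is created
    "postCreateCommand": "rebar3 get-deps",
    // VS Code extensions to install inside the container
    "customizations": {
        "vscode": {
            "extensions": ["pgourlain.erlang"]
        }
    }
}

For Docker Compose based setups, the image property is replaced by dockerComposeFile, service, and workspaceFolder properties that point VS Code at one service in your compose file.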

Docker Compose: Orchestrating Multi-Container Environments

Docker Compose is a tool that simplifies the process of managing multi-container Docker applications. It allows you to define and run applications consisting of multiple Docker containers using a single, easy-to-read YAML file. This file, typically named docker-compose.yml, describes the services that make up your application so they can be run together in a single environment.

Let's consider a simple example. Suppose you're developing a web application that uses a database. Instead of running the web server and database server separately, you can use Docker Compose to run both services together.

Here's a basic docker-compose.yml file for such a scenario:

version: '3'
services:
  web:
    build: .
    ports:
      - "5000:5000"
  db:
    image: postgres
    environment:
      POSTGRES_PASSWORD: example
    volumes:
      - ./data:/var/lib/postgresql/data

In this file, we define two services: web and db. The web service is built using the Dockerfile in the current directory and is mapped to port 5000. The db service uses the postgres image, sets the POSTGRES_PASSWORD environment variable that the image requires at startup, and mounts the ./data directory to a specific path within the container.

To start the application, you would simply run docker-compose up (or docker compose up on newer Docker installations) from the directory containing the docker-compose.yml file. Docker Compose takes care of building the images where needed, starting the services, and putting them on a shared network where each service can reach the others by name; the web service, for example, can reach the database at the hostname db.
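
A few compose commands cover most of the daily workflow:

# Build (if needed) and start all services in the foreground
docker-compose up --build

# Follow the logs of a single service
docker-compose logs -f web

# Stop and remove the containers and the default network
docker-compose down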

Docker Compose is a powerful tool for managing complex applications with multiple services. It's especially useful in development environments, where you often need to run multiple services together. By using Docker Compose, you can ensure that your development environment closely matches your production environment, reducing the chances of encountering unexpected issues when you deploy your application.

Minikube: A Miniature Kubernetes for Your Local Machine

Minikube is a tool that lets you run Kubernetes, a powerful platform for managing containerized applications, on your local machine. It's designed to be easy to use and is perfect for when you're developing applications that will eventually be deployed on a full-fledged Kubernetes cluster.

Let's revisit our web application and database example, but this time, we'll use Minikube and Kubernetes. The equivalent to a docker-compose.yml file in Kubernetes is a set of configuration files that define Kubernetes resources such as Pods, Services, and Deployments.

Here's a basic Kubernetes Deployment configuration for our web server:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 1
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
      - name: web
        image: my-web-app:latest
        # Use the locally built image instead of pulling from a registry
        imagePullPolicy: IfNotPresent
        ports:
        - containerPort: 5000

And here's a Service that makes the web server accessible on port 5000:

apiVersion: v1
kind: Service
metadata:
  name: web
spec:
  type: LoadBalancer
  ports:
  - port: 5000
    targetPort: 5000
  selector:
    app: web

For the database, we could use a similar Deployment and Service configuration, but with the postgres image and the appropriate ports and volumes.
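
As a rough sketch (the password is a placeholder, and the emptyDir volume below is ephemeral; a real setup would use a PersistentVolumeClaim):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: db
spec:
  replicas: 1
  selector:
    matchLabels:
      app: db
  template:
    metadata:
      labels:
        app: db
    spec:
      containers:
      - name: db
        image: postgres
        env:
        - name: POSTGRES_PASSWORD
          value: example
        ports:
        - containerPort: 5432
        volumeMounts:
        - name: data
          mountPath: /var/lib/postgresql/data
      volumes:
      - name: data
        emptyDir: {}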

To apply these configurations and start the application, you would use kubectl, the Kubernetes command-line tool, like so: kubectl apply -f web-deployment.yaml, kubectl apply -f web-service.yaml, and so on.
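
Putting it together, a typical Minikube session might look like this (assuming the image was built locally as my-web-app:latest):

# Start a local single-node cluster
minikube start

# Make the locally built image available inside the cluster
minikube image load my-web-app:latest

# Apply the configurations
kubectl apply -f web-deployment.yaml
kubectl apply -f web-service.yaml

# Open a tunnel to the LoadBalancer service and print its URL
minikube service web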

Minikube brings the power of Kubernetes to your local machine, making it easier to develop complex, multi-container applications. It's a bit more involved than Docker Compose, but it offers greater flexibility and is a great way to get hands-on experience with Kubernetes.

Wrapping Up: The Power of Dev Containers

We've covered a lot of ground in this article, but we've only just scratched the surface of what dev containers can do. From providing consistent, reproducible development environments, to simplifying the onboarding process and facilitating collaboration, dev containers offer a host of benefits that can streamline your development workflow.

To recap, here are some of the key advantages of using dev containers:

  1. Consistency: Dev containers ensure that every developer is working in the same environment, eliminating the "but it works on my machine" problem.
  2. Simplicity: With dev containers, setting up a new development environment is as easy as running a few commands. No need to install and configure a bunch of software manually.
  3. Isolation: Dev containers keep your projects isolated from one another, preventing conflicts between different projects' dependencies.
  4. Versatility: Whether you're using Dockerfiles, Docker Compose, or pre-built images from Docker Hub, dev containers offer a variety of ways to set up your environment.
  5. Integration: Tools like Visual Studio Code's Dev Container feature make it easy to work with dev containers, integrating seamlessly with your existing workflow.

If you haven't already, I encourage you to give dev containers a try in your own projects. You might be surprised at how much they can improve your development experience.

And remember, if you need help setting up your dev container environments, Happi Hacking is here to assist. We offer dev container setup as a service, taking the hassle out of getting your environments up and running. Don't hesitate to reach out if you're interested.

Sometimes confinement can be freedom. Embrace the power of dev containers and see what they can do for you. Happy coding!

PS.

Here are some resources for further reading on dev containers:

  1. Developing inside a Container - Visual Studio Code
  2. Dev Containers Tips and Tricks - Visual Studio Code
  3. A curated list of awesome tools and resources about dev containers - GitHub
  4. devcontainers - GitHub
  5. Change Resource Constraints For Dev Containers | DevSpace | Documentation
  6. Beginner's Series to: Dev Containers | Microsoft Learn
  7. Available Dev Container Features - Development containers
  8. Reference Implementation - Development containers
  9. Containers - Resources and Tools - IBM Developer
  10. Runtime options with Memory, CPUs, and GPUs | Docker Documentation
- Happi

