You've heard about containers and how useful they are for development environments, but haven't taken the plunge yet.
Don't worry, we've got you covered. Containers can revolutionize the way you code by giving you isolated and disposable environments that are ready to code in, straight out of the box. In this ultimate guide, we'll show you everything you need to get started with dev containers.
By the end, you'll be spinning up customized containers for your projects and wondering how you ever lived without them.
Let's dive in and see how dev containers can streamline your workflow and make you a happier, more productive developer.
In short: dev containers are isolated, lightweight environments that give you a pre-configured development setup right inside your editor or IDE. They cut out manual setup, guarantee a clean environment every time you start working, and make your builds reproducible, while still letting you choose or customize the base image you build on.
What Are Dev Containers?
Dev containers are isolated, lightweight environments that allow developers to work inside a containerized version of a build environment. Basically, dev containers give you a pre-configured development environment right inside your editor or IDE.
As a developer, dev containers can save you a ton of time setting up projects and ensure you have a clean environment every time you start working.
Some of the main benefits of using dev containers include:
Pre-configured build environments. Dev containers come with a base image that has all the software, tools, and dependencies pre-installed so you can get started coding right away.
Isolated environments. Each dev container has its own isolated filesystem, networking, memory, and CPU - so there are no conflicts with other projects or software on your local machine.
Reproducible builds. Dev containers provide the exact same environment every time they're launched, so you get the same build results each time. No more "it works on my machine!" issues.
Less setup time. Starting a new project with dev containers means you can skip the lengthy setup and configuration process. Just open your project in the container and everything is ready to go.
Flexibility. You have options to choose a base image with the software and tools you want or build your own custom base image. So dev containers can flexibly meet your specific needs.
Dev containers revolutionize the developer experience by providing pre-configured, isolated environments that can supercharge your productivity. If you haven't tried them yet, you owe it to yourself as a developer to give dev containers a shot.
They may just change the way you work! The future is containerized!
Getting Started With Dev Containers
Getting started with Dev Containers is pretty straightforward. To use them, you'll need:
Docker Desktop installed. Dev Containers use Docker to build and run containers, so you'll need Docker Desktop for your OS installed and running.
A devcontainer.json file. This file defines your container environment. It specifies things like:
The Docker image you want to use (like node:12-alpine)
Folders to mount into the container
Post-create scripts to run
VS Code has snippets to help you generate a devcontainer.json file for popular tech stacks.
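Tying those pieces together, a minimal devcontainer.json might look like the following sketch. The image name, mount paths, and install command are illustrative; adjust them for your project:

```json
{
  "name": "my-project",
  "image": "mcr.microsoft.com/devcontainers/javascript-node:18",
  "mounts": [
    "source=${localWorkspaceFolder}/data,target=/data,type=bind"
  ],
  "postCreateCommand": "npm install"
}
```

Here `${localWorkspaceFolder}` is a built-in variable that resolves to the folder you opened, and postCreateCommand runs once after the container is created.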
Build and run the container. Once you have the devcontainer.json file in your project, you can:
Press F1 and select "Remote-Containers: Rebuild and Reopen in Container" (called "Dev Containers: Rebuild and Reopen in Container" in newer versions of the extension) to build the container image and reopen the folder in the container.
VS Code will build the container, install its dependencies, and start the container - all in the background.
Your VS Code window will reload and you'll be working directly in the container with everything set up and ready to go!
Anytime you need to build a fresh instance of the container, just run the "Remote-Containers: Rebuild and Reopen in Container" command again.
Dev Containers make it incredibly easy to get a development environment up and running for any tech stack. No more fighting with local dependencies or "works on my machine" issues. Give Dev Containers a try and turbocharge your development workflow!
Choosing a Base Image
Choosing a base image for your dev container is an important first step. This base image contains the basic Linux operating system and initial tools/settings that your container will build upon.
Language-Specific Images. Official language images have the runtime and base packages pre-installed so you can get started coding right away. Some popular options include:
python: For Python development. Comes with Python, Pip, and other basics.
golang: For Go development. Comes with the Go compiler, build tools, and common libraries.
General-Purpose Images. For a more flexible dev container base, you can choose a general-purpose image like:
Ubuntu: A hugely popular, well-documented Linux distribution. Easy to install any languages/tools on top.
Debian: Another popular, open-source Linux OS. Stable and reliable.
Alpine: A tiny Linux distribution perfect for containers. Only a 5MB image but you can still install whatever you need.
These general images give you more control to fully customize your container's contents. However, that also means more work upfront to get your programming environment set up. It comes down to whether you prefer convenience or flexibility.
In the end, choose an image that has:
The minimum tools/packages you need to get started
A small footprint for fast builds and load times
Active maintenance to keep the image secure and up-to-date
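A base image chosen along those lines might be set up like this; the Debian tag and package list are just illustrative:

```dockerfile
# Small, actively maintained base image
FROM debian:bookworm-slim

# Install only the minimum tools needed to start working,
# and clean up the package lists to keep the image small
RUN apt-get update \
    && apt-get install -y --no-install-recommends git curl ca-certificates \
    && rm -rf /var/lib/apt/lists/*
```

From here you would layer on your language runtime and project tooling.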
Your dev container's base image establishes a solid foundation. From there, you can install specific tools, sync in source code, and fully configure your development environment. The possibilities are endless!
Adding Tools and Runtimes
Adding tools and runtimes to your dev container gives you a lot of flexibility. You have a few options for how to do this:
Install tools/runtimes when you build the image: This is good if you know exactly what you need for your project ahead of time. You can install them in the Dockerfile with a RUN instruction, e.g. RUN apt-get update && apt-get install -y <package>.
Install tools/runtimes when you start the container: This is useful if you want to install things at runtime or if your tooling needs change from project to project. You can install tools/runtimes using the devcontainer.json postCreateCommand.
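As a sketch of that option, a postCreateCommand that installs tooling when the container is created could look like this; the image and commands are placeholders for whatever your project needs:

```json
{
  "image": "mcr.microsoft.com/devcontainers/base:ubuntu",
  "postCreateCommand": "sudo apt-get update && sudo apt-get install -y jq && npm install"
}
```

Because postCreateCommand runs at container creation rather than image build, you can change it per project without rebuilding the image.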
Use devcontainer.json build.args to pass in tools/runtimes as build arguments: This lets you reuse a single Dockerfile for different setups. devcontainer.json passes the build args when the image is built, and the Dockerfile uses them to conditionally install tools/runtimes. See Docker's ARG documentation for the syntax and usage of build arguments.
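A Dockerfile using a build arg to conditionally install a runtime might be sketched like this; INSTALL_NODE is a made-up argument name:

```dockerfile
FROM debian:bookworm-slim

# Build argument with a default; override it from devcontainer.json
ARG INSTALL_NODE=false

# Conditionally install Node.js only when the build arg is set
RUN if [ "$INSTALL_NODE" = "true" ]; then \
        apt-get update && apt-get install -y nodejs npm; \
    fi
```

In devcontainer.json you would pass the argument via `"build": { "dockerfile": "Dockerfile", "args": { "INSTALL_NODE": "true" } }`.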
Create multiple devcontainer.json configurations for different stacks: This lets you pick a "stack" when starting the container and get a predetermined set of tools. Each configuration can point at its own Dockerfile that installs a particular set of tools. The devcontainer.json reference guide explains the available options and settings you can use to customize your development containers.
Install a common set of tools in the Dockerfile, then install project-specific tools via devcontainer.json: This gives you a base set of tools on start, and you can install additional tooling as needed for your particular project in the devcontainer.json postCreateCommand.
Use Docker Compose to start additional "sidecar" containers with other tools/runtimes: This lets you start up other containers with tooling/runtimes that your main dev container can access. You define the sidecar services in a docker-compose.yml file referenced by devcontainer.json. The Docker Compose documentation explains how to define and manage multi-container applications using Compose YAML files.
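A sidecar setup along those lines might be sketched as follows; the service names and images are illustrative:

```yaml
services:
  dev:
    image: mcr.microsoft.com/devcontainers/base:ubuntu
    volumes:
      - .:/workspace
    command: sleep infinity   # keep the dev container running so the editor can attach
  db:
    image: postgres:15        # sidecar database, reachable from the dev container as "db"
    environment:
      POSTGRES_PASSWORD: devpassword
```

Your devcontainer.json would then reference this file with the dockerComposeFile and service properties, attaching your editor to the dev service while db runs alongside it.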
Using a combination of these techniques, you can craft a dev container with a robust set of tools for any development needs.
The options are plentiful - you just have to choose what works best for your particular project!
Mounting Source Code
To get started developing in a container, you'll first need to mount your source code as a volume. This allows you to edit your code in your IDE of choice on your host machine, while building and running it in the container. There are a couple ways to mount your source code:
Bind Mounts. The easiest approach is to use bind mounts. This mounts a directory on your host machine into the container; anything you change on the host will be reflected in the container and vice versa. To set up a bind mount, you specify the -v flag when running your container, like this:
docker run -v /path/on/host:/path/in/container ...
So if your code was in /home/user/code on your host machine, and you wanted to mount it to /app in your container, you'd run:
docker run -v /home/user/code:/app ...
Now your container will have your source code in the /app directory, and any changes you make on either the host or in the container will be mirrored in the other.
Bind mounts are simple but have some downsides, like file-permission mismatches between the host and the container. An alternative is to use named volumes with a volume driver. Volume drivers handle more advanced storage options for Docker volumes:
local: The default driver. Stores volume data in Docker's own storage area on the host.
CIFS/SMB: Lets you mount Windows file shares, via the local driver's mount options.
NFS: For *nix file shares, also available through the local driver's mount options.
Third-party plugins: Add support for cloud and cluster storage backends.
To use a volume driver, you create a volume with that driver and its options, then mount the volume by name. For example, to mount an NFS share as your code directory:
docker volume create --driver local --opt type=nfs --opt o=addr=nfs.example.com,rw --opt device=:/exported/code nfs-code
docker run -v nfs-code:/app ...
Volume drivers open up more advanced options for managing your Docker volumes and avoid some of the issues with bind mounts. I'd recommend starting with bind mounts for simplicity, but exploring named volumes and volume drivers as your needs evolve.
Setting Up Container Entrypoints
When setting up Dev Containers, you need to define entrypoints which specify what commands should be run when the container starts up.
Entrypoints allow you to bootstrap your development environment by installing dependencies, setting up your codebase, starting servers, and anything else you need to do to get your project running.
There are a few options for defining entrypoints:
- **Dockerfile ENTRYPOINT** - You can define an ENTRYPOINT in your Dockerfile that will run when the container starts. This is good for simple entrypoints, but isn't flexible if you need to override it. See the [ENTRYPOINT documentation](https://docs.docker.com/engine/reference/builder/#entrypoint) to learn more.
- **docker-compose.yml entrypoint** - You can define an entrypoint in your docker-compose.yml file. This gives you more flexibility, as you can override it when launching the container. Refer to the Compose file documentation to read further.
- **Shell script** - You can write a shell script, mark it as executable, and use it as your entrypoint. This is a great option if you have a complex entrypoint with conditional logic. You can pass arguments to the shell script when launching the container to control its behavior.
- **Rebuild container** - You can build a new image with a different ENTRYPOINT to redefine your entrypoint. This isn't ideal, as it requires rebuilding your image. It's better to use one of the other options for a flexible entrypoint.
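As a sketch of the docker-compose.yml option, an entrypoint override might look like this; the service name, image, and script path are all illustrative:

```yaml
services:
  dev:
    image: my-dev-image                           # hypothetical image name
    entrypoint: ["/usr/local/bin/entrypoint.sh"]  # script baked into the image
    command: ["npm", "start"]                     # arguments passed to the entrypoint
```

Because command is separate from entrypoint, you can override just the command per launch (for example with docker compose run) without touching the entrypoint itself.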
A good entrypoint should:
Install any dependencies (npm install, bundle install, apt-get install, etc.)
Set up your codebase (migrate databases, compile assets, etc.)
Start any required servers (npm start, rails s, etc.)
Run in the foreground and tail the logs
Pass through any signals so Ctrl+C will stop the container
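Pulling those points together, here is a minimal sketch of an entrypoint. The echoed steps stand in for real install and setup commands, and it's wrapped in a function here so it can be exercised inline; in a real container, the function body would simply be the script:

```shell
#!/bin/sh
set -e

# Sketch of a dev container entrypoint; the install/setup steps are
# placeholders for whatever your project actually needs.
entrypoint() {
    echo "[entrypoint] installing dependencies"   # e.g. npm install
    echo "[entrypoint] preparing codebase"        # e.g. database migrations
    # exec replaces the shell with the container's main command, so the app
    # runs in the foreground and receives signals (Ctrl+C) directly:
    exec "$@"
}

# Simulate launching the container with a main command:
OUTPUT=$(entrypoint echo "server running")
echo "$OUTPUT"
```

The exec at the end is the important part: it makes your server the foreground process so logs stream to the terminal and Ctrl+C stops the container cleanly.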
Defining a solid container entrypoint is key to having a smooth dev container experience. Put in the effort to get it right, and your dev container will start up with everything you need to get coding!
You'll be able to dive right into your work without having to deal with installation or configuration.
Dev Container FAQs: Common Questions Answered
Dev Containers have tons of useful applications for developers. Here are a few of the major use cases, followed by answers to some common questions:
Local Development Environments: Quickly spin up a ready-to-code local environment for your project. No more installing dependencies and setting up your workspace manually.
Onboarding New Team Members: Get new developers on your project up and running in no time. Simply have them open the Dev Container and they'll have a fully configured environment.
Isolated Environments: Run your development environment in an isolated container separate from the rest of your local setup. Great for testing dependencies and tooling upgrades risk-free.
Reproducible Setups: Commit your Dev Container configuration with your code so anyone else can spin up an identical environment. No more "it works on my machine!" issues.
Tooling Experiments: Try out new tools, languages, and workflows in a contained environment without impacting your local setup. If it doesn't work out, simply remove the container.
CI/CD Environments: Use the same Dev Container configuration to build and test your code in your continuous integration and deployment pipelines.
Do I need Docker installed?
Yes, Dev Containers are built on Docker containerization technology. You'll need Docker installed and running: Docker Desktop on Windows and macOS, or Docker Desktop or Docker Engine on Linux.
Do Dev Containers replace my local development environment?
No, Dev Containers run within your local environment and development tools. They simply contain the runtimes and dependencies for your project.
Can I commit and push from within a Dev Container?
Yes, Dev Containers mount your local source code into the container. You can freely commit, push, pull, and work with Git as needed.
Do I need an internet connection to use Dev Containers?
Dev Containers do require an internet connection the first time they're built to pull dependencies. After the initial build, no internet connection is needed to use a Dev Container. However, if your project has an npm install or similar, an internet connection would be needed for those package installations.
Are Dev Containers platform agnostic?
Yes, Dev Containers can be used on Windows, macOS, and Linux since they utilize Docker containerization. The experience may slightly differ between operating systems but the end result is the same.
Can I use any code editor/IDE with Dev Containers?
Dev Containers are best supported in VS Code through the Dev Containers (formerly Remote - Containers) extension. Other tools, such as Visual Studio, JetBrains IDEs, and the devcontainer CLI, also support the Dev Container specification, though feature support varies.
Dev Containers Best Practices
Keep your dev container lightweight. The more you pack into your dev container, the larger its image size grows. This can impact build and startup times. Only include tools, packages and software specifically needed for your project. Don't install extras "just in case" you might need them someday. You can always add more later if needed.
Use multi-stage builds. Multi-stage Docker builds allow you to separate your dev environment from artifacts you want to ship. You can have one stage for your dev environment, and another to build your application. The second stage can copy only the built artifact from the first stage, keeping your final image lean.
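A multi-stage Dockerfile along those lines might look like this sketch; the Node app and paths are just an example:

```dockerfile
# Stage 1: full dev/build environment
FROM node:18 AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Stage 2: lean runtime image containing only the built artifact
FROM node:18-slim
WORKDIR /app
COPY --from=build /app/dist ./dist
CMD ["node", "dist/index.js"]
```

Your dev container can target the first stage (for example via build.target in devcontainer.json) while CI builds the final lean image.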
Cache dependencies. Caching dependencies and packages between builds can significantly speed up dev container build times. Structure your Dockerfile so Docker's layer cache can do the work: copy your dependency manifest (package.json, requirements.txt, etc.) and run the install step before copying the rest of your source, so the install layer is reused when only code changes. With BuildKit you can also persist a package cache across builds using a cache mount, e.g. RUN --mount=type=cache,target=/root/.npm npm install.
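As a concrete sketch (assuming a Node project and BuildKit), the relevant Dockerfile section might look like this:

```dockerfile
FROM node:18-slim
WORKDIR /app

# Copy only the dependency manifests first; this layer, and the install
# step below, stay cached as long as the manifests don't change
COPY package.json package-lock.json ./

# BuildKit cache mount: npm's download cache persists across builds
RUN --mount=type=cache,target=/root/.npm npm ci

# Copy the rest of the source last so code changes don't bust the install layer
COPY . .
```

The ordering matters more than the cache mount: editing application code no longer invalidates the dependency-install layer.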
Keep containers ephemeral. Treat your dev containers as ephemeral. Don't store any important files or data on the container itself. Commit all your source code and work to your host machine or a Docker volume. This allows you to freely rebuild your container image without worrying about losing work.
Use Docker Compose. Docker Compose allows you to define multiple services (like a database container, a cache container, and your dev container) in a single YAML file. With Docker Compose, you can spin up your entire dev environment with a single command. No need to start containers individually and wire them up.
Refer to the official containers.dev guide on using Docker Compose.
Bind to host networking. For the easiest development experience, you can bind your dev container to your host machine's network (docker run --network host, fully supported on Linux). This gives the container direct access to services on your host like databases or API servers, and its processes appear on the network just like ones running directly on your host.
Keep images up to date. As with any software, the components in your dev container (the base OS, Node, etc.) release updates to patch security issues and bugs. It's good practice to periodically rebuild your dev container image to pull in the latest updates. This helps keep your environment secure and working as expected.
Dev Containers aim to simplify development environments and boost productivity. Give them a try and see how they can improve your workflow!
So there you have it, everything you need to know about Dev Containers to make your life as a developer so much easier.
With the power of Docker and Visual Studio Code, you now have a portable, isolated development environment that can replicate your production environment locally. No more "it works on my machine!" excuses.
You'll be coding with confidence and shipping better software in no time. What are you waiting for? Spin up a Dev Container, install your dependencies, and get to work. Your future self will thank you. Happy coding!