How to Set Up Docker for Local Development: A Complete Guide


In the modern software development landscape, Docker has become a cornerstone for creating consistent, portable, and reproducible development environments. Whether you’re a solo developer, part of a growing team, or contributing to open-source projects, setting up Docker for local development can streamline your workflow, eliminate the “works on my machine” problem, and enhance productivity.

This guide walks you through everything you need to set up Docker for local development, from installation to managing containers and volumes, with examples and best practices.

What Is Docker, and Why Use It?

Docker is an open-source platform that enables you to automate the deployment of applications inside lightweight, portable containers. These containers package your application with all its dependencies, ensuring it runs the same in any environment—whether it’s development, staging, or production.

Benefits of Docker in Local Development:

  • Environment consistency across machines and teams
  • Easy onboarding of developers
  • Isolation of dependencies
  • Simpler DevOps and CI/CD integration
  • Scalable architecture through tools like Docker Compose and Kubernetes

Prerequisites

Before you begin, make sure you have the following:

  • A computer running Windows, macOS, or Linux
  • Admin rights to install software
  • Familiarity with the command line
  • Basic knowledge of development environments and programming languages

Step 1: Installing Docker

On Windows & macOS:

  1. Download Docker Desktop from the official website: https://www.docker.com/products/docker-desktop
  2. Follow the installation instructions.
  3. After installation, start Docker Desktop and ensure it’s running.

Tip: On Windows, make sure WSL 2 is installed and configured for better performance.
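To confirm WSL 2 is active, you can run the following in PowerShell (a quick sketch; it assumes WSL is already installed):

```shell
# Show installed distros and which WSL version each one uses
wsl -l -v

# Make WSL 2 the default for any distros you install later
wsl --set-default-version 2
```

Docker Desktop's settings also let you choose which WSL distros get Docker integration.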

On Linux:

For Ubuntu/Debian:

sudo apt update
sudo apt install \
  ca-certificates \
  curl \
  gnupg \
  lsb-release

# Add Docker’s official GPG key
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg

# Set up the stable repository
echo \
"deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu \
$(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

# Install Docker
sudo apt update
sudo apt install docker-ce docker-ce-cli containerd.io

# Post-install steps
sudo usermod -aG docker $USER

Log out and log back in (or restart your system) so the group change takes effect.
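Once you're back in, a quick way to confirm the group change took effect is to query the Docker daemon without sudo:

```shell
# Should print the server version with no "permission denied" error
docker info --format '{{.ServerVersion}}'

# If it fails, start a shell with the refreshed group membership:
newgrp docker
```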

Step 2: Verifying the Installation

Run this command in your terminal:

docker --version

You should see output like:

Docker version 25.0.3, build deadbeef

To test Docker functionality:

docker run hello-world

This pulls a test image and runs it in a container. If it works, your Docker is ready.

Step 3: Setting Up Your First Project

Let’s walk through setting up a basic web application (e.g., Node.js) using Docker.

1. Create Your Project Structure

mkdir docker-node-app
cd docker-node-app

Inside this directory:

touch app.js Dockerfile .dockerignore package.json

2. Add Sample App Code

app.js

const http = require('http');

const server = http.createServer((req, res) => {
  res.end('Hello from Docker!');
});

server.listen(3000, () => {
  console.log('Server running on http://localhost:3000');
});

package.json

{
  "name": "docker-node-app",
  "version": "1.0.0",
  "main": "app.js",
  "scripts": {
    "start": "node app.js"
  }
}

Dockerfile

# Use Node.js official image
FROM node:20

# Set working directory
WORKDIR /app

# Copy files
COPY package*.json ./
RUN npm install
COPY . .

# Expose port and start app
EXPOSE 3000
CMD ["npm", "start"]

.dockerignore

node_modules
npm-debug.log

Step 4: Build and Run Your Docker Container

Build the Image

docker build -t docker-node-app .

Run the Container

docker run -p 3000:3000 docker-node-app

Open your browser and go to http://localhost:3000. You should see the message:
“Hello from Docker!”
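For day-to-day work you may prefer to run the container in the background. A short sketch, using a hypothetical container name of node-app:

```shell
# Run detached, with a name so later commands don't need the container ID
docker run -d --name node-app -p 3000:3000 docker-node-app

# Tail the app's output, then stop and remove the container
docker logs -f node-app
docker stop node-app
docker rm node-app
```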

Step 5: Use Docker Compose for Multi-Service Projects

For larger projects (e.g., API + database), you’ll need multiple containers. That’s where Docker Compose comes in.

docker-compose.yml

version: '3'
services:
  web:
    build: .
    ports:
      - "3000:3000"
    volumes:
      - .:/app
      - /app/node_modules
    command: npm start
  db:
    image: mongo
    ports:
      - "27017:27017"

Now run:

docker-compose up

This will spin up both the web app and MongoDB containers.
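A few docker-compose flags worth knowing for local work:

```shell
docker-compose up -d --build   # rebuild images and start in the background
docker-compose logs -f web     # follow logs for just the web service
docker-compose down            # stop and remove the containers and network
```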

Step 6: Persisting Data with Volumes

If your app stores data (e.g., a database), you’ll want to persist it even if the container is deleted.

In your docker-compose.yml, define volumes:

  db:
    image: mongo
    ports:
      - "27017:27017"
    volumes:
      - mongo-data:/data/db

volumes:
  mongo-data:

Now MongoDB data will persist locally in a named volume.
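You can verify that the volume survives a teardown. Note that Compose prefixes volume names with the project (directory) name, so the actual name may look like docker-node-app_mongo-data:

```shell
docker-compose down        # remove the containers and default network...
docker volume ls           # ...the mongo-data volume is still listed

# Inspect where the data actually lives on the host
docker volume inspect docker-node-app_mongo-data
```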

Step 7: Cleaning Up

  • Stop containers:
    docker stop <container_id> or docker-compose down
  • Remove unused containers, images, and volumes:
    docker system prune
    docker volume prune

Best Practices for Local Docker Development

  1. Use .dockerignore to reduce image size.
  2. Keep containers lightweight — install only necessary packages.
  3. Use volumes for local development to reflect code changes instantly.
  4. Use environment variables for configuration (via .env files).
  5. Don’t run containers as root inside the image unless necessary.
  6. Tag your images for versioning (myapp:1.0, myapp:dev).
  7. Automate with Makefiles or shell scripts for consistency.
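As a sketch of point 4, you might keep local settings in a .env file (the variable names here are hypothetical) and hand it to the container at run time:

```shell
# Create a local .env file (keep it out of version control)
cat > .env <<'EOF'
PORT=3000
MONGO_URL=mongodb://db:27017/app
EOF

# Pass every variable in the file to the container
docker run --env-file .env -p 3000:3000 docker-node-app
```

Docker Compose also reads a .env file in the project directory automatically for variable substitution in docker-compose.yml.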

Real-World Tools to Supercharge Docker Development

  • Docker Compose – Manage multi-container setups
  • VS Code Dev Containers extension – Develop inside containers
  • Portainer – Visual Docker container management
  • ngrok – Expose your Docker apps to the internet
  • watchtower – Automatically update running containers

Common Issues and Fixes

  • “Permission denied” on Linux: add your user to the docker group (sudo usermod -aG docker $USER)
  • Container not reflecting code changes: use bind-mount volumes in Docker Compose
  • Port already in use: change the host-side port in docker-compose.yml
  • File not found in image: check the COPY paths in your Dockerfile
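When a port is already taken, it can help to find out what is holding it before changing your mapping (commands vary by OS; this assumes Linux or macOS):

```shell
# Linux: list the process listening on port 3000
ss -ltnp | grep :3000

# macOS (and most Linux distros with lsof installed)
lsof -i :3000
```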

Conclusion

Setting up Docker for local development might seem like a lot at first, but once you do it, your workflow becomes more predictable, portable, and production-ready. Whether you’re working on solo projects or part of a larger team, Docker helps you build once and run anywhere.
