
Developer's Guide to Reclaiming Disk Space — Cleaning node_modules, Docker, WSL2, and Caches


Introduction

“Running out of space again…”

Every developer knows this frustration. Before you know it, your SSD has only a few GB left, builds start failing, and you can’t clone new projects.

The main culprits are:

  • node_modules: Hundreds of MB to several GB per project
  • Docker images & cache: Can easily balloon to tens of GB
  • WSL2 virtual disks: Auto-expand but never shrink — the hidden space hog
  • Various caches: npm, pip, Homebrew, IDE…

This article covers practical techniques that can help you reclaim tens of gigabytes in your development environment.


Assess the Situation: What’s Eating Your Space?

First, identify what’s consuming your disk space.

Mac / Linux

# Check disk usage in home directory
du -sh ~/* 2>/dev/null | sort -hr | head -20

# Include hidden folders (the pattern skips . and ..)
du -sh ~/.[!.]* 2>/dev/null | sort -hr | head -10

Windows (PowerShell)

# Check user folder sizes
Get-ChildItem $env:USERPROFILE -Directory |
  ForEach-Object {
    $size = (Get-ChildItem $_.FullName -Recurse -Force -ErrorAction SilentlyContinue |
      Measure-Object -Property Length -Sum).Sum / 1GB
    [PSCustomObject]@{Name=$_.Name; SizeGB=[math]::Round($size,2)}
  } | Sort-Object SizeGB -Descending

GUI Tools

If you prefer visual tools to analyze disk usage:

| OS      | Tool                         | Features                |
|---------|------------------------------|-------------------------|
| Windows | WinDirStat, TreeSize         | Treemap visualization   |
| Mac     | GrandPerspective, DaisyDisk  | Beautiful UI, intuitive |
| Linux   | Baobab (GNOME Disk Analyzer) | Often pre-installed     |

Reducing node_modules Size

The Scale of the Problem

A typical frontend project’s node_modules can easily reach 300MB to 1GB. With 10 projects, that’s several GB to 10GB+ just from dependencies.

Solution 1: Delete node_modules from Inactive Projects

Remove node_modules from projects you’re not actively working on. Just run npm install when you need them again.

# Find all node_modules in a directory
find ~/projects -name "node_modules" -type d -prune

# Show sizes too
find ~/projects -name "node_modules" -type d -prune -exec du -sh {} \;

# Delete all (dangerous! verify first)
find ~/projects -name "node_modules" -type d -prune -exec rm -rf {} \;
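To see the scale of the problem at a glance, the per-directory sizes can be summed into a single figure. A sketch assuming your projects live under ~/projects (adjust the path to your own layout):

```shell
# Sum every node_modules under ~/projects into one number.
# ~/projects is an assumed layout -- adjust the path to your own.
total=$(find ~/projects -name node_modules -type d -prune -exec du -sk {} + 2>/dev/null \
  | awk '{ kb += $1 } END { printf "%.1f", kb / 1024 / 1024 }')
echo "node_modules total: ${total} GB"
```

If the number is in the multi-GB range, the deletions above (or the pnpm migration below) are worth the effort.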

Solution 2: Switch to pnpm

pnpm keeps every package in a single content-addressable store and hard-links the files into each project (symlinks wire up the node_modules layout). Even if multiple projects use the same package, only one copy exists on disk.

# Install pnpm
npm install -g pnpm

# Migrate existing project to pnpm
cd your-project
rm -rf node_modules package-lock.json
pnpm install

Impact: In monorepo or multi-project environments, 50%+ disk space reduction has been reported.
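The mechanism behind that saving is worth seeing once: hard-linked files share the same disk blocks, so extra "copies" cost almost nothing. A minimal sketch with plain `ln` (no pnpm required):

```shell
# Three directory entries, one 1 MB payload on disk -- the hard-link
# deduplication idea behind pnpm's store, demonstrated with plain ln.
demo=/tmp/store-demo
rm -rf "$demo" && mkdir -p "$demo"
head -c 1048576 /dev/urandom > "$demo/store-file"  # the single stored copy
ln "$demo/store-file" "$demo/project-a"            # hard link, not a copy
ln "$demo/store-file" "$demo/project-b"
used_kb=$(du -sk "$demo" | cut -f1)
echo "3 files visible, ${used_kb} KB actually used"  # ~1 MB, not ~3 MB
```

When you delete projects later, `pnpm store prune` removes packages from the store that nothing references anymore.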

Solution 3: Clear npm Cache

npm caches downloaded packages, which can grow to 500MB to 2GB.

# Check cache size (the default location is ~/.npm on Mac/Linux)
du -sh ~/.npm
npm cache verify

# Clear cache
npm cache clean --force

Reducing Docker Disk Usage

Docker is one of the most storage-hungry tools. Left unchecked, it can consume tens to over 100GB.

Check Current Usage

# Check Docker disk usage
docker system df

# Detailed view
docker system df -v

Example output:

TYPE            TOTAL     ACTIVE    SIZE      RECLAIMABLE
Images          45        5         12.5GB    8.2GB (65%)
Containers      8         2         1.2GB     850MB (70%)
Local Volumes   12        3         4.5GB     3.1GB (68%)
Build Cache     -         -         8.3GB     8.3GB

Solution 1: Bulk Delete Unused Resources

# Safe bulk delete (stopped containers, unused networks, dangling images, build cache)
docker system prune

# More aggressive (also removes unused images)
docker system prune -a

# Delete everything including volumes (data loss warning!)
docker system prune -a --volumes

Recommendation: Run docker system prune about once a week in development environments.

Solution 2: Clear Build Cache

# Delete build cache only
docker builder prune

# Delete only old cache (older than 24 hours)
docker builder prune --filter "until=24h"

Compacting WSL2 Virtual Disk (Windows)

On Windows with WSL2 (including Docker Desktop), the virtual disk (ext4.vhdx) expands automatically but never shrinks when you delete files. This is a major cause of disk space consumption.

Why Doesn’t It Shrink?

WSL2 manages the Linux filesystem in a Virtual Hard Disk (VHD) format. The VHD automatically expands as files grow, but when you delete files, the VHD size remains unchanged. A bloated ext4.vhdx will stay that size until you manually compact it.

Locate Your VHDX File

WSL2 virtual disks are stored in different locations depending on the installation method and version. Check the following patterns:

Pattern 1: Microsoft Store Version (Traditional)

C:\Users\<username>\AppData\Local\Packages\CanonicalGroupLimited.Ubuntu<version>_<ID>\LocalState\ext4.vhdx

Pattern 2: New WSL2 Store Format (GUID Folder)

Recent WSL2 versions may store files in GUID folders:

C:\Users\<username>\AppData\Local\wsl\{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}\ext4.vhdx

Pattern 3: Docker Desktop

C:\Users\<username>\AppData\Local\Docker\wsl\data\ext4.vhdx
C:\Users\<username>\AppData\Local\Docker\wsl\distro\ext4.vhdx

Pattern 4: Newer Docker Desktop

C:\Users\<username>\AppData\Local\DockerDesktop\vm-data\ext4.vhdx

Finding VHDX Files Reliably

If you’re unsure of the location, search the entire Windows system with PowerShell:

# Search AppData\Local (fast)
Get-ChildItem -Path "C:\Users\$env:USERNAME\AppData\Local" -Recurse -Filter "*.vhdx" -ErrorAction SilentlyContinue

# Search entire C drive if not found (may take several minutes)
Get-ChildItem -Path "C:\" -Recurse -Filter "*.vhdx" -ErrorAction SilentlyContinue

Or retrieve the WSL distribution path from the registry:

# Get path by specifying distribution name
(Get-ChildItem -Path HKCU:\Software\Microsoft\Windows\CurrentVersion\Lxss |
  Where-Object { $_.GetValue("DistributionName") -eq 'Ubuntu' }).GetValue("BasePath") + "\ext4.vhdx"

Accessing GUID Folders

In PowerShell, folder names containing {} are treated specially, so you must enclose them in quotes:

# Wrong (unquoted braces are parsed as a script block and cause an error)
cd C:\Users\taiko\AppData\Local\wsl\{8ff0ed39-1e39-48df-99c6-3f9267ac2022}

# Correct (enclose in quotes)
cd "C:\Users\taiko\AppData\Local\wsl\{8ff0ed39-1e39-48df-99c6-3f9267ac2022}"

# List contents
ls

Check VHDX File Size

# Display size in GB
(Get-Item "C:\Users\<username>\AppData\Local\wsl\{GUID}\ext4.vhdx").Length / 1GB

If the file is far larger than the data actually stored inside the distribution, it is significantly bloated; in the author's case it had grown to 75GB.

Step 1: Pre-compression Cleanup (Important)

Delete unnecessary data in both Windows and WSL before compacting to maximize space recovery. In my experience, thorough preparation significantly improves compression results.

Windows temporary files cleanup:

# Delete Windows temporary files (run as Administrator)
Remove-Item -Recurse -Force C:\Users\$env:USERNAME\AppData\Local\Temp\*

WSL cleanup:

# Remove unused Docker resources
docker container prune -f
docker image prune -a -f
docker volume prune -f
docker builder prune -a -f
docker system prune -a --volumes -f

# Clear APT cache (Ubuntu/Debian)
sudo apt clean
sudo apt autoclean
sudo apt autoremove -y

# Clear system logs (older than 7 days)
sudo journalctl --vacuum-time=7d

# Remove temporary files
sudo rm -rf /tmp/*
sudo rm -rf /var/tmp/*

Running fstrim inside WSL2 releases unused blocks, significantly improving compression effectiveness.

# Release unused blocks (recommended before compression)
sudo fstrim /

One user reported that after running fstrim before compression, their virtual disk shrank from 85GB to 60GB. This simple step can make a huge difference in compression results.

Step 2: Stop All Services

Before compacting, you must completely stop WSL and Docker.

# Exit Docker Desktop (from system tray)

# Shutdown WSL
wsl --shutdown

# Verify complete shutdown (should show nothing)
wsl --list --running

Step 3-A: Compact with Optimize-VHD (Windows Pro / Enterprise)

For Windows Pro or higher, the Hyper-V Optimize-VHD command is the easiest method.

Prerequisites: Enable Hyper-V

  1. Open “Settings” → “Apps” → “Optional features” → “More Windows features”
  2. Check both of these:
    • Hyper-V Management Tools
    • Hyper-V Platform
  3. Restart your PC

Run the compression:

# Run PowerShell as Administrator

# Compact Docker Desktop VHD
Optimize-VHD -Path "C:\Users\<username>\AppData\Local\Docker\wsl\data\ext4.vhdx" -Mode Full

# Compact WSL Ubuntu VHD
Optimize-VHD -Path "C:\Users\<username>\AppData\Local\Packages\CanonicalGroupLimited.Ubuntu22.04LTS_xxxxx\LocalState\ext4.vhdx" -Mode Full

Step 3-B: Compact with diskpart (Windows Home)

Windows Home doesn’t have Hyper-V, so use the diskpart command instead.

# Run Command Prompt or PowerShell as Administrator
diskpart

Enter the following commands in sequence:

select vdisk file="C:\Users\<username>\AppData\Local\Docker\wsl\data\ext4.vhdx"
attach vdisk readonly
compact vdisk
detach vdisk
exit

The compact vdisk command may take 20-30 minutes or more. There’s no progress indicator, so wait patiently for completion.

Step 3-C: Automated Tool “WSL2 Compacter”

If you use multiple WSL distributions, WSL2 Compacter is helpful. It automatically detects and compacts all vhdx files on your system.

# Download and run the script (as Administrator)
.\compact-wsl2-disk.ps1

Real-World Compression Results

Here are actual compression results. The first row is my personal experience:

| Scenario                             | Before | After | Savings |
|--------------------------------------|--------|-------|---------|
| Author’s environment (WSL Ubuntu)    | 75GB   | 52GB  | 23GB    |
| Docker + accumulated build cache     | 100GB  | 22GB  | 78GB    |
| Dev environment after 1 year         | 70GB   | 25GB  | 45GB    |
| After docker system prune + compact  | 50GB   | 15GB  | 35GB    |
| After running fstrim + compact       | 85GB   | 60GB  | 25GB    |

Compacting both WSL and Docker Desktop can recover 30GB to 100GB+ in total.

Common Issues and Solutions

Cannot Access GUID Folders in PowerShell

Paths containing {} are treated specially in PowerShell. Always wrap them in quotes:

# WRONG: unquoted braces are parsed as a script block and cause an error
cd C:\Users\taiko\AppData\Local\wsl\{8ff0ed39-1e39-48df-99c6-3f9267ac2022}

# CORRECT: Wrap the entire path including braces in quotes
cd "C:\Users\taiko\AppData\Local\wsl\{8ff0ed39-1e39-48df-99c6-3f9267ac2022}"

Optimize-VHD Command Not Found

Windows Home edition doesn’t have Hyper-V, so the Optimize-VHD command is unavailable. Use diskpart instead (see Step 3-B).

Compression Has Little Effect

Before compacting, verify:

  1. Have you run docker system prune -a --volumes -f?
  2. Have you run sudo fstrim /?
  3. Is WSL completely shut down? (Check with wsl --list --running)

Important Notes

  • Always backup first: Copy your ext4.vhdx file in case of corruption
  • Stop all WSL-related apps: Including Docker Desktop, VS Code Remote WSL, etc.
  • Don’t interrupt the process: Canceling compact vdisk mid-operation can cause corruption
  • Run regularly: Monthly execution is recommended

Clearing Other Caches

Homebrew (Mac)

# Preview what will be removed (dry run)
brew cleanup -n

# Remove old versions and stale downloads
brew cleanup

# Also scrub the download cache, including current versions
brew cleanup -s

pip (Python)

# Check cache location
pip cache dir

# Clear cache
pip cache purge

IDE & Editors

| Tool      | Cache Location                                   |
|-----------|--------------------------------------------------|
| VS Code   | ~/.vscode/extensions (remove unused extensions)  |
| JetBrains | ~/Library/Caches/JetBrains (Mac)                 |
| Xcode     | ~/Library/Developer/Xcode/DerivedData            |

Git

# Garbage collection for repository
git gc --aggressive --prune=now

# Remove stale remote branch references
git remote prune origin
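To see whether gc has anything to collect, the standard `git count-objects -v` command reports loose objects before and after. A sketch in a throwaway repo (the /tmp path and demo identity are placeholders):

```shell
# Before/after view of git gc in a throwaway repo under /tmp.
repo=/tmp/gc-demo
rm -rf "$repo" && git init -q "$repo" && cd "$repo"
echo 'hello' > file.txt && git add file.txt
git -c user.name=demo -c user.email=demo@example.com commit -qm 'init'
git count-objects -v    # "count" = loose objects before gc
git gc --quiet --prune=now
git count-objects -v    # loose objects are now packed; "count" drops to 0
```

On a real repository, compare the `size` and `size-pack` lines before and after to see how much was reclaimed.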

Automate Your Cleanup

Manual cleanup is tedious. Set up automated periodic cleanup.

Shell Script Example

#!/bin/bash
# ~/scripts/cleanup-dev.sh

echo "🧹 Starting development environment cleanup..."

# npm cache
echo "📦 Clearing npm cache..."
npm cache clean --force 2>/dev/null

# Docker (only if running)
if command -v docker &> /dev/null && docker info &> /dev/null; then
    echo "🐳 Removing unused Docker resources..."
    docker system prune -f
    docker builder prune -f
fi

# Homebrew (Mac only)
if command -v brew &> /dev/null; then
    echo "🍺 Clearing Homebrew cache..."
    brew cleanup -s
fi

echo "✅ Cleanup complete!"

Schedule with cron (Linux/Mac)

# Make sure the log directory exists first: mkdir -p ~/logs
# Run every Sunday at 3 AM (add via crontab -e)
0 3 * * 0 ~/scripts/cleanup-dev.sh >> ~/logs/cleanup.log 2>&1

Summary

Effective disk space reclamation focuses on these four areas:

| Target            | Potential Savings | Frequency                |
|-------------------|-------------------|--------------------------|
| WSL2 Virtual Disk | 30GB - 100GB+     | Monthly                  |
| Docker            | 10GB - 50GB+      | Weekly                   |
| node_modules      | 5GB - 20GB        | When organizing projects |
| Various Caches    | 1GB - 5GB         | Monthly                  |

For Windows users, WSL2 virtual disk compression often provides the biggest savings. Both Docker and WSL2 tend to grow without limit, so regular maintenance should become a habit.

Having disk space headroom means faster builds, smoother project setup, and a better overall development experience. Start practicing these techniques today.
