Blog · April 22, 2026

How to Remove Ollama Models: Complete Step-by-Step Guide

Prerequisites

Before you start, make sure you have:

  • Ollama installed and running — Open your terminal and run ollama --version to confirm.
  • Terminal access — Use Terminal (macOS/Linux) or PowerShell/Command Prompt (Windows). Run as Administrator on Windows if you see permission errors.
  • Model names ready — You’ll need the exact name and tag (e.g., llama3:latest).
  • Disk space awareness — Models can be several GB; removing them frees storage immediately after the command completes.

Ollama stores models by default in:

  • macOS: ~/.ollama/models
  • Linux: ~/.ollama/models (or /usr/share/ollama/.ollama/models for system installs)
  • Windows: C:\Users\YourUsername\.ollama\models
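To see how much space your models are using before you remove anything, you can check the size of that directory. A minimal macOS/Linux sketch, assuming the default location (or a custom path set via OLLAMA_MODELS):

```shell
#!/bin/sh
# Resolve the models directory: OLLAMA_MODELS overrides the default.
MODELS_DIR="${OLLAMA_MODELS:-$HOME/.ollama/models}"

if [ -d "$MODELS_DIR" ]; then
  # Total size of all stored models (manifests + blobs).
  du -sh "$MODELS_DIR"
else
  echo "No models directory found at $MODELS_DIR"
fi
```

On Windows, right-click the .ollama\models folder in File Explorer and check Properties instead.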

Step 1: List All Installed Models

Always start here so you know exactly what you’re removing.

In your terminal, run:

ollama list

Expected output (example):

NAME                ID              SIZE      MODIFIED
llama3:latest       8f6e2a123abc    4.7 GB    3 days ago
gemma2:9b           2d9f3b456def    5.2 GB    1 week ago

Copy the exact name from the NAME column (including the tag after the colon).

Step 2: Remove a Single Model

Use the ollama rm command followed by the model name.

ollama rm llama3:latest

Expected output:

deleted 'llama3:latest'

The command deletes the model's manifest and any blob files no longer referenced by other models (Ollama shares layers between models, so shared blobs are kept). No confirmation prompt is shown.

Step 3: Remove Multiple Models at Once

Remove several models in one line:

ollama rm llama3:latest gemma2:9b mistral:latest

Expected output:

deleted 'llama3:latest'
deleted 'gemma2:9b'
deleted 'mistral:latest'

Step 4: Remove All Models (Bulk Delete)

On macOS/Linux, use this one-liner to delete every installed model at once:

ollama list | awk 'NR>1 {print $1}' | xargs -I {} ollama rm {}

Windows PowerShell equivalent:

ollama list | Select-Object -Skip 1 | ForEach-Object { ($_ -split '\s+')[0] } | ForEach-Object { ollama rm $_ }

After running, confirm with ollama list — it should return only the header row.
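The awk stage in the one-liner simply skips the header row and keeps the first column. If you want to preview exactly what would be passed to ollama rm before deleting anything, here is a dry-run sketch that uses sample ollama list output as a stand-in:

```shell
#!/bin/sh
# Dry run: print the model names the bulk-delete pipeline would feed
# to `ollama rm`, without deleting anything. sample_list stands in
# for a real `ollama list` call.
sample_list() {
  printf 'NAME                ID              SIZE      MODIFIED\n'
  printf 'llama3:latest       8f6e2a123abc    4.7 GB    3 days ago\n'
  printf 'gemma2:9b           2d9f3b456def    5.2 GB    1 week ago\n'
}

# Same extraction as the real one-liner: skip the header, keep column 1.
sample_list | awk 'NR>1 {print $1}'
```

Swap sample_list for ollama list and append | xargs -I {} ollama rm {} once the names look right.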

Step 5: Restart Ollama

Restart the Ollama service to ensure all blobs are fully cleaned up and disk space is freed.

  • macOS: ollama serve (or quit and reopen the Ollama app)
  • Linux: sudo systemctl restart ollama
  • Windows: Restart the Ollama process from Task Manager or simply close and reopen the app.

Run ollama list again to verify the models are gone.

Common Issues & Troubleshooting

  • “Model not found” error: Double-check the exact name and tag from ollama list.
  • Permission denied: On Linux/macOS, prefix the command with sudo (sudo ollama rm ...); on Windows, run PowerShell or Command Prompt as Administrator.
  • Disk space not freed: Restart Ollama (Step 5). If still stuck, check the blobs folder manually but never delete files while Ollama is running.
  • Model still appears after rm: Stop any running instance with ollama stop <model> first, then retry ollama rm.
  • Custom storage location: If you set OLLAMA_MODELS environment variable, the models live elsewhere — check that path instead.
  • API method (advanced): Use curl for scripts:
    curl -X DELETE http://localhost:11434/api/delete -d '{"model": "llama3:latest"}'
    

Next Steps

  • Download fresh models: Run ollama pull llama3.1:8b to get the latest version.
  • Set custom storage: Export OLLAMA_MODELS=/path/to/new/folder before running Ollama to move everything to a larger drive.
  • Automate cleanup: Add the bulk delete command to a weekly cron job or PowerShell script.
  • Monitor usage: Run ollama list regularly and combine with du -sh ~/.ollama/models (macOS/Linux) or File Explorer on Windows to track storage.
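The cleanup automation suggested above can be as simple as wrapping the bulk-delete one-liner in a script and scheduling it with cron. A sketch, where the script name, location, and schedule are all examples rather than fixed conventions:

```shell
#!/bin/sh
# ollama-cleanup.sh (example name): remove every installed model.
# WARNING: deletes without confirmation.
cleanup_models() {
  # Skip quietly if ollama isn't on PATH (e.g. cron's minimal environment).
  command -v ollama >/dev/null 2>&1 || return 0
  ollama list | awk 'NR>1 {print $1}' | xargs -I {} ollama rm {}
}

cleanup_models
```

Then add a crontab entry with crontab -e, for example 0 3 * * 0 /path/to/ollama-cleanup.sh to run it every Sunday at 03:00.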

You now have full control over your local Ollama models. Free up space instantly and keep only what you need.
