Run Claude Code locally - Mervin Praison channel
https://www.youtube.com/watch?v=kRS7DSDzo-c

Here is a Markdown summary and step-by-step guide based on the video transcript.
How to Run Claude Code Locally for Free (Using Ollama)
Claude Code is a powerful agentic coding tool that can build applications, fix errors, and refactor code. However, the official API (e.g., Opus 4.6) can be expensive ($25 per million output tokens). This guide explains how to run Claude Code locally using Ollama to keep your data private and avoid API costs.
Prerequisites
- A computer capable of running local LLMs.
- Terminal access.
Step-by-Step Installation
1. Set up Ollama
- Download and install Ollama from ollama.com.
- Open your terminal and download a model. The video recommends gpt-oss:20b, but you can use others like qwen3-coder or mistral:
ollama pull gpt-oss:20b
- (Optional) Check your downloaded models:
ollama list
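Ollama also serves a local HTTP API (on port 11434 by default), so you can verify the server is up and enumerate models programmatically. The sketch below is an assumption-laden illustration, not part of the video: it queries Ollama's `GET /api/tags` endpoint, which returns a JSON body of the form `{"models": [{"name": ...}, ...]}`.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local address

def parse_model_names(tags_json: str) -> list[str]:
    """Extract model names from an Ollama /api/tags response body."""
    data = json.loads(tags_json)
    return [m["name"] for m in data.get("models", [])]

def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Rough equivalent of `ollama list`: ask the local server for its models."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return parse_model_names(resp.read().decode())
```

If `list_local_models()` raises a connection error, the Ollama server is not running on the default port.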
2. Install Claude Code
Install the Claude Code tool using the official curl command:
curl -fsSL https://claude.ai/install.sh | bash
3. Configure and Launch
Instead of just launching, use the config flag to select your local model:
- Run the launch command with configuration:
ollama launch claude --config
- A list of your local Ollama models will appear. Select the model you wish to use (e.g., gpt-oss:20b).
- Confirm launch when prompted.
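Conceptually, the configuration step records which local model Claude Code should talk to and points it at the Ollama server via the settings file at ~/.claude/settings.json. The sketch below is an assumption about what such a step produces (the env-var names match the troubleshooting section of this guide; `ANTHROPIC_MODEL` is an extra assumption on my part for pinning the model):

```python
import json
from pathlib import Path

def build_claude_env(model: str, base_url: str = "http://localhost:11434") -> dict:
    """Settings dict pointing Claude Code at a local Ollama server.

    The token/key values are placeholders: Ollama ignores them, but
    Claude Code expects them to be present.
    """
    return {
        "env": {
            "ANTHROPIC_AUTH_TOKEN": "ollama",
            "ANTHROPIC_API_KEY": "ollama",
            "ANTHROPIC_BASE_URL": base_url,
            "ANTHROPIC_MODEL": model,  # assumption: model pinned via env var
        }
    }

def write_settings(model: str,
                   path: Path = Path.home() / ".claude" / "settings.json") -> None:
    """Write the config to Claude Code's settings file."""
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(build_claude_env(model), indent=2))
```

For example, `write_settings("gpt-oss:20b")` would produce a settings file like the one shown in the Troubleshooting section.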
Usage Example
Once Claude Code is running in your terminal, you can issue natural language commands.
- Prompt: “Create a one-page website with basic information about AI Agency.”
- Process: Claude Code will generate the file structure (e.g., index.html) and write the code.
- Interaction: You will be asked to confirm edits. Press Enter (Yes) or Shift+Tab (allow all edits in session) to proceed.
Note: Local models are best suited for basic to intermediate tasks compared to the paid Claude Opus models.
Recommended Models
The video recommends the following models for the best experience with Claude Code:
- qwen3-coder
- glm-4.7
- gpt-oss:20b
- gpt-oss:120b (requires high VRAM)
Troubleshooting & Settings
If you encounter errors, verify your configuration settings.
- Check Status: Inside the Claude Code interface, type:
/status
- Ensure the Anthropic base URL is:
http://localhost:11434
- Ensure the model matches your downloaded Ollama model.
- Manual Configuration: If needed, you can manually edit the settings file located at ~/.claude/settings.json. Ensure the JSON includes:

```json
{
  "env": {
    "ANTHROPIC_AUTH_TOKEN": "ollama",
    "ANTHROPIC_API_KEY": "ollama",
    "ANTHROPIC_BASE_URL": "http://localhost:11434"
  }
}
```
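As a troubleshooting aid, the checks above can be automated. This is a hypothetical helper (not from the video) that validates a settings file's env block against the values this guide expects:

```python
import json
from pathlib import Path

# Keys this guide says must be present in the "env" block
REQUIRED_KEYS = {"ANTHROPIC_AUTH_TOKEN", "ANTHROPIC_API_KEY", "ANTHROPIC_BASE_URL"}

def check_env(env: dict) -> list[str]:
    """Return a list of problems found in a settings 'env' block."""
    problems = [f"missing {k}" for k in sorted(REQUIRED_KEYS - env.keys())]
    url = env.get("ANTHROPIC_BASE_URL", "")
    if url and not url.startswith("http://localhost:11434"):
        problems.append(f"base URL {url!r} does not point at the local Ollama server")
    return problems

def check_settings_file(path: Path = Path.home() / ".claude" / "settings.json") -> list[str]:
    """Validate the Claude Code settings file; empty list means it looks OK."""
    if not path.exists():
        return [f"{path} not found"]
    return check_env(json.loads(path.read_text()).get("env", {}))
```

An empty return value from `check_settings_file()` means the file matches the configuration described above; otherwise each string describes one mismatch to fix.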