Ollama Launch Command
The primary command to launch a model and enter an interactive session is `ollama run <model_name>`. To start the backend server/daemon, use `ollama serve`.
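In practice the two commands look like this (the model name below is illustrative; any model you have pulled works):

```shell
# Start the Ollama backend server/daemon
# (on most installs this is already running as a service)
ollama serve

# In another terminal: download the model if needed,
# then drop into an interactive chat session with it
ollama run llama3
```

`ollama run` pulls the model automatically on first use, so a separate `ollama pull` is optional.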
Key Updates & Capabilities
- Anthropic API Compatibility: Ollama now supports the Anthropic API specification, allowing local models to interface with tools designed for Anthropic's Claude models.
- Local Claude Code Execution: Enables running claude-code locally by routing Anthropic-formatted requests to a local ollama instance.
- Model Implementation: Demonstrates successful compatibility using GLM-4.7-Flash (a 30B-parameter Mixture-of-Experts model with 3B active parameters).
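The compatibility layer means a client can send a standard Anthropic Messages-style request body to a local Ollama endpoint instead of Anthropic's servers. A minimal sketch of that request shape follows; the endpoint URL is an assumption (Ollama listens on port 11434 by default, and the Anthropic Messages API posts to `/v1/messages`), and the model tag is illustrative:

```python
import json

# Assumed local endpoint: Ollama's default port plus the
# Anthropic Messages API path. Verify against your Ollama version.
LOCAL_ANTHROPIC_URL = "http://localhost:11434/v1/messages"


def build_messages_request(model: str, prompt: str, max_tokens: int = 512) -> dict:
    """Build a request body in the Anthropic Messages API shape."""
    return {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [
            {"role": "user", "content": prompt},
        ],
    }


# Illustrative model tag; use whatever tag your local Ollama exposes.
body = build_messages_request("glm-4.7-flash", "Say hello in one word.")
print(json.dumps(body, indent=2))
```

Tools like claude-code that honor an Anthropic base-URL override can then be pointed at the local server instead of the hosted API, which is what enables the local Claude Code workflow described above.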
Related Concepts
- claude-code
- Anthropic API
- GLM-4.7-Flash
- LLM API Compatibility
Sources
- 2026 04 14 Ollama Claude GLM Channel Sam Witteveen
Source Notes
- 2026-04-23: Running Claude Code Locally with Ollama and GLM-4.7-Flash, https://www.youtube.com/watch?v=NA5U06WuO34