ollama added anthropic api compatibility last month. point claude code at localhost instead of anthropic’s servers, run a local model, done. my setup:
- ollama with qwen3-coder-32b
- set ANTHROPIC_BASE_URL to localhost:11434
- that’s it

what surprised me:
- the agentic stuff (planning, file navigation, multi-step edits) still works — that’s claude code’s logic, not the model
- qwen3-coder handles most tasks fine, struggles on complex refactors
- no rate limits, no waiting, runs offline

what doesn’t work as well:
- not opus-level on hard problems
- smaller context window means more back and forth on big codebases
- occasional weird formatting in tool calls

anyone else running this? curious what local models people are pairing with it.
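for anyone who wants to try it, a minimal sketch of the setup described above — note the exact model tag and whether Claude Code needs a placeholder API key are assumptions, not verified; check the ollama library and Claude Code docs for your versions:

```shell
# pull a local coding model into ollama (exact tag may differ from "qwen3-coder")
ollama pull qwen3-coder

# point claude code at the local ollama server instead of anthropic's API
# (ollama's anthropic-compatible endpoint listens on port 11434 by default)
export ANTHROPIC_BASE_URL="http://localhost:11434"

# then launch claude code as usual
claude
```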
Originally posted by u/nihal_was_here on r/ClaudeCode
