Ollama v0.23.0 Adds Claude Desktop Support via Ollama Launch
Ollama
Ollama v0.23.0 (May 3) adds Claude Desktop support via Ollama Launch, enabling Claude Cowork and Claude Code to route requests through local models. Server-driven model recommendations let featured models surface without an app update. The release also fixes a Windows IPv4 loopback timeout that affected gateway integrations and improves Metal initialization on macOS.
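The note does not document how Ollama Launch performs the routing. As a rough illustration only, any desktop client pointed at a local Ollama instance would ultimately issue requests against Ollama's standard local HTTP API (default port 11434); the sketch below builds such a request payload. The model name "llama3" is an assumption, and this is not the actual Ollama Launch mechanism.

```python
import json

# Default local Ollama chat endpoint; Ollama Launch's real routing path
# is not described in the release note, so this URL is illustrative.
OLLAMA_URL = "http://127.0.0.1:11434/api/chat"

def build_chat_payload(model: str, prompt: str) -> str:
    """Serialize a minimal chat request for a locally served model."""
    return json.dumps({
        "model": model,  # assumed model name, e.g. one pulled via `ollama pull`
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # request a single complete response
    })

payload = build_chat_payload("llama3", "Summarize this diff.")
print(payload)
```

Because the server runs on loopback, no API key or network egress is involved, which is the cost and offline benefit described below.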
Why it matters
Connecting Claude Desktop to local Ollama models removes API costs from local development workflows, broadening Ollama's appeal as infrastructure for coding-agent toolchains that must run fully offline or on-premises.
Importance: 2/5
An incremental release, but notable for the Claude Desktop local routing capability.