Ollama v0.24.0: Codex App Integration and MLX Sampler Improvements
Ollama
Ollama v0.24.0 introduces built-in Codex App integration with browser and review mode capabilities, and refines the MLX sampler for improved generation quality on Apple Silicon. Earlier v0.23.x releases added vision model support to `ollama launch opencode` and fixed Claude tool result formatting.
Why it matters
Tighter Codex integration connects the local inference stack to OpenAI's coding agent ecosystem, enabling hybrid local/remote workflows, while the MLX sampler refinements improve generation quality for Apple Silicon users.
Importance: 2/5
Minor release with meaningful Codex integration and MLX sampler improvements
Sources
official
Release v0.24.0 — ollama/ollama