
OpenClaw is only as powerful as the models you give it. There are plenty of proprietary models, but many of us prefer open ones. The latest Ollama update makes it much simpler to use open models with OpenClaw: just run the command "ollama launch openclaw" to get started. It comes pre-configured with models like Kimi-K2.5, GLM-5, and Minimax M2.5, and you also get web search when using cloud models.
Ollama 0.17 makes it much simpler to use open models with @openclaw
Try it with:
ollama launch openclaw
Tutorial post in 🧵 pic.twitter.com/LooBh9Cfd4
— ollama (@ollama) February 24, 2026
You will need Node.js and macOS, Linux, or WSL. With the "ollama launch openclaw --model kimi-k2.5:cloud" command, you can get Kimi running. If OpenClaw is not installed on your system, Ollama will detect that and install it for you.
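Putting the steps above together, the flow can be sketched as a short shell session. The commands are the ones quoted in the post and tweet; the model tag is an example, and available tags may differ on your install:

```shell
# Launch OpenClaw through Ollama; if OpenClaw is not yet
# installed, Ollama detects that and installs it first.
ollama launch openclaw

# Or pin a specific cloud model instead of the default,
# e.g. Kimi (tag as quoted above):
ollama launch openclaw --model kimi-k2.5:cloud
```

Both invocations assume Ollama 0.17 or newer and Node.js on your PATH.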

