Best AI News — Updated Every 3 Hours

Open source load balancer for Ollama instances

Via r/LocalLlama
Wednesday, Mar 25, 2026 · 2:46PM
Summary

We (the OpenZiti team) built an OpenAI-compatible gateway that, among other things, distributes requests across multiple Ollama instances with weighted round-robin, background health checks, and automatic failover. The use case: you have Ollama running on a few different machines. You want a single …
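To make the load-balancing idea concrete, here is a minimal sketch of how a gateway might pick a backend using the smooth weighted round-robin variant (the algorithm nginx popularized), with unhealthy instances skipped for automatic failover. All names here are hypothetical illustrations, not the OpenZiti project's actual API.

```python
from dataclasses import dataclass

@dataclass
class Backend:
    """One Ollama instance behind the gateway (hypothetical model)."""
    url: str
    weight: int          # relative share of traffic
    healthy: bool = True # flipped by a background health check
    current: int = 0     # smooth-WRR running counter

class WeightedRoundRobin:
    """Smooth weighted round-robin: on each pick, every healthy
    backend's counter grows by its weight; the largest counter wins
    and is reduced by the total weight, so picks interleave instead
    of bursting all of one backend's turns together."""

    def __init__(self, backends):
        self.backends = backends

    def pick(self):
        candidates = [b for b in self.backends if b.healthy]
        if not candidates:
            return None  # nothing healthy to fail over to
        total = sum(b.weight for b in candidates)
        for b in candidates:
            b.current += b.weight
        best = max(candidates, key=lambda b: b.current)
        best.current -= total
        return best

# Example: a 2:1 traffic split across two machines.
lb = WeightedRoundRobin([
    Backend("http://gpu-box:11434", weight=2),
    Backend("http://laptop:11434", weight=1),
])
picks = [lb.pick().url for _ in range(3)]
# Over one cycle of 3 picks, gpu-box is chosen twice, laptop once.
```

If the health checker marks a backend unhealthy, `pick()` simply stops considering it, and remaining traffic redistributes by weight among the survivors.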
