proxy.golang.org : github.com/thushan/olla
High-performance, lightweight proxy and load balancer for LLM infrastructure. Intelligent routing, automatic failover, and unified model discovery across local and remote inference backends.
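Olla ships as a self-contained server rather than an importable library, so the sketch below does not use Olla's own API. It is a minimal, illustrative Go reverse proxy with ordered failover across two local inference backends (Ollama and LM Studio on their commonly used default ports 11434 and 1234, both assumptions here), showing the general routing-and-failover pattern the description refers to.

```go
// Illustrative sketch only: a tiny reverse proxy with ordered failover
// across two local inference backends. This is NOT Olla's implementation
// or API; it only demonstrates the general pattern using the Go stdlib.
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"time"
)

func mustParse(raw string) *url.URL {
	u, err := url.Parse(raw)
	if err != nil {
		panic(err)
	}
	return u
}

func main() {
	// Assumed backend endpoints; adjust to your own setup.
	backends := []*url.URL{
		mustParse("http://localhost:11434"), // Ollama (assumed default port)
		mustParse("http://localhost:1234"),  // LM Studio (assumed default port)
	}

	probe := &http.Client{Timeout: 2 * time.Second}

	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		for _, b := range backends {
			// Naive per-request health probe; a real proxy would cache
			// health state and track per-backend latency instead.
			resp, err := probe.Get(b.String())
			if err != nil {
				continue // backend unreachable, fail over to the next one
			}
			resp.Body.Close()
			httputil.NewSingleHostReverseProxy(b).ServeHTTP(w, r)
			return
		}
		http.Error(w, "no healthy backend", http.StatusBadGateway)
	})

	log.Fatal(http.ListenAndServe(":8080", handler))
}
```

A production proxy would additionally cache health results, stream responses, and route by model availability or observed latency rather than probing every backend on each request.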
Links: Registry, Source, Documentation, JSON, codemeta.json
purl: pkg:golang/github.com/thushan/olla
Keywords: ai, amd, golang, intel, llama-cpp, llamacpp, llm-inference, llm-proxy, llm-router, llm-routing, lmstudio, mlx, nvidia, ollama, proxy, self-hosted, self-hosted-ai, sglang, vllm, vllm-backend
License: Apache-2.0
Latest release: 2 months ago
First release: 8 months ago
Namespace: github.com/thushan
Stars: 105 on GitHub
Forks: 11 on GitHub
See more repository details: repos.ecosyste.ms
Funding links: https://github.com/sponsors/thushan, https://ko-fi.com/thushanfernando
Last synced: about 2 months ago