@c0mpute/worker
Native CLI worker for the c0mpute.ai distributed inference network. Runs LLM inference via Ollama and connects to the orchestrator over Socket.IO.
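As a rough illustration of the inference side, a worker like this might forward a job's prompt to a local Ollama instance over its HTTP API. This is a minimal sketch under stated assumptions: the `/api/generate` endpoint and its `{model, prompt, stream}` body are Ollama's documented API, but the `job` shape and function names are hypothetical, not the actual @c0mpute/worker protocol.

```javascript
// Build the request body for Ollama's /api/generate endpoint.
function buildGenerateRequest(model, prompt) {
  return {
    model,         // e.g. "llama3"
    prompt,
    stream: false, // ask for one JSON response instead of a token stream
  };
}

// Hypothetical job handler: send the prompt to a local Ollama
// instance (default port 11434) and return the completion text.
async function runInference(job) {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(job.model, job.prompt)),
  });
  const data = await res.json();
  return data.response; // Ollama puts the completion in `response`
}
```

In a real worker, `runInference` would be wired to a Socket.IO event handler so the orchestrator can push jobs and receive results.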
Ecosystem: npmjs.org
Latest release: 2.0.0 (4 days ago)
Versions: 21
Downloads: 1,928 last month