proxy.golang.org : github.com/softmaxer/batchollama : v0.0.5
Run Ollama inference in batches, where the prompts in each batch execute concurrently.
Registry - proxy.golang.org
Documentation - https://pkg.go.dev/github.com/softmaxer/batchollama
Download - https://proxy.golang.org/github.com/softmaxer/batchollama/@v/v0.0.5.zip
purl: pkg:golang/github.com/softmaxer/[email protected]
Dependencies
- github.com/jmorganca/ollama v0.1.28
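
The one-line description above summarizes the pattern: split a list of prompts into fixed-size batches and fan out each batch's prompts concurrently, running the batches themselves one after another. Since the Readme did not load here, the sketch below only illustrates that pattern with the standard library, posting to Ollama's local HTTP generate endpoint (http://localhost:11434/api/generate). It is not batchollama's actual API; the model name, batch size, and the generate helper are assumptions for illustration.

```go
// A minimal sketch of batched, concurrent Ollama inference.
// NOT batchollama's API: the helper names and parameters below are assumptions.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"sync"
)

// generateRequest / generateResponse mirror the JSON used by Ollama's
// non-streaming /api/generate endpoint.
type generateRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Stream bool   `json:"stream"`
}

type generateResponse struct {
	Response string `json:"response"`
}

// generate sends one prompt to a local Ollama server and returns the reply.
func generate(model, prompt string) (string, error) {
	body, err := json.Marshal(generateRequest{Model: model, Prompt: prompt, Stream: false})
	if err != nil {
		return "", err
	}
	resp, err := http.Post("http://localhost:11434/api/generate", "application/json", bytes.NewReader(body))
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	raw, err := io.ReadAll(resp.Body)
	if err != nil {
		return "", err
	}
	var out generateResponse
	if err := json.Unmarshal(raw, &out); err != nil {
		return "", err
	}
	return out.Response, nil
}

func main() {
	model := "llama2" // hypothetical model name; use whichever model you have pulled
	prompts := []string{
		"Why is the sky blue?",
		"Name three Go proverbs.",
		"What is a goroutine?",
		"Summarize HTTP/2 in one line.",
	}
	batchSize := 2 // assumed batch size for illustration

	// Walk the prompts batch by batch: prompts inside a batch run concurrently,
	// and the next batch starts only after the current one finishes.
	for start := 0; start < len(prompts); start += batchSize {
		end := start + batchSize
		if end > len(prompts) {
			end = len(prompts)
		}
		var wg sync.WaitGroup
		for _, p := range prompts[start:end] {
			wg.Add(1)
			go func(prompt string) {
				defer wg.Done()
				reply, err := generate(model, prompt)
				if err != nil {
					fmt.Printf("prompt %q failed: %v\n", prompt, err)
					return
				}
				fmt.Printf("prompt %q -> %d bytes of response\n", prompt, len(reply))
			}(p)
		}
		wg.Wait()
	}
}
```

Using a sync.WaitGroup per batch is the simplest way to bound concurrency at the batch size without a worker pool; a real implementation might instead stream responses or pass a context for cancellation.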