An open API service providing package, version, and dependency metadata for many open source software ecosystems and registries.
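A service like this is typically queried per registry and package name. The sketch below builds such a lookup URL; the base URL and endpoint path are assumptions for illustration, not taken from this page.

```python
from urllib.parse import quote


def package_lookup_url(registry: str, name: str) -> str:
    """Build a metadata lookup URL for a package in a given registry.

    The base URL and path layout are assumed, not confirmed endpoints.
    """
    base = "https://packages.ecosyste.ms/api/v1"  # assumed API base
    return f"{base}/registries/{quote(registry)}/packages/{quote(name)}"


# Example: look up openllm-client on pypi.org
print(package_lookup_url("pypi.org", "openllm-client"))
```

Fetching that URL (e.g. with `urllib.request` or `requests`) would return the package's JSON metadata, assuming the endpoint exists in this form.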

Top 3.9% on pypi.org
Top 2.7% downloads on pypi.org
Top 4.8% dependent packages on pypi.org
Top 5.2% dependent repos on pypi.org
Top 3.2% docker downloads on pypi.org

pypi.org : openllm-client

OpenLLM Client: interact with an OpenLLM HTTP/gRPC server, or any BentoML server.

purl: pkg:pypi/openllm-client
Keywords: AI, Alpaca, BentoML, Falcon, Fine tuning, Generative AI, LLMOps, Large Language Model, Llama 2, MLOps, Model Deployment, Model Serving, PyTorch, Serverless, StableLM, Transformers, Vicuna, ai, bentoml, falcon, fine-tuning, llama, llama2, llm, llm-inference, llm-ops, llm-serving, llmops, mistral, ml, mlops, model-inference, mpt, open-source-llm, openllm, stablelm, vicuna
License: Apache-2.0
Latest release: 10 months ago
First release: over 1 year ago
Dependent packages: 2
Dependent repositories: 8
Downloads: 3,080 last month
Stars: 8,657 on GitHub
Forks: 541 on GitHub
Docker dependents: 1
Docker downloads: 47
See more repository details: repos.ecosyste.ms
Last synced: 2 days ago
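The purl above (`pkg:pypi/openllm-client`) follows the package URL convention: a `pkg:` scheme, a type, and a name, with an optional `@version` suffix. A minimal parser for that simple shape (it ignores namespaces and qualifiers from the full purl spec) might look like:

```python
def parse_purl(purl: str):
    """Parse a simple package URL like "pkg:pypi/openllm-client[@version]".

    Minimal sketch: handles only scheme, type, name, and optional version;
    namespaces and qualifiers from the full purl spec are not covered.
    """
    scheme, rest = purl.split(":", 1)
    if scheme != "pkg":
        raise ValueError(f"not a purl: {purl!r}")
    parts = rest.split("/")
    ptype, name = parts[0], parts[-1]
    version = None
    if "@" in name:
        name, version = name.split("@", 1)
    return ptype, name, version


# The purl shown on this page:
print(parse_purl("pkg:pypi/openllm-client"))  # ('pypi', 'openllm-client', None)
```

The same function accepts a versioned purl such as `pkg:pypi/openllm-client@1.0.0` (version string here is illustrative only).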

Maintainer: aar0npham
10 packages
138,929 downloads