pypi.org : llm_atc
Tools for fine-tuning and serving LLMs
Links: Registry, Source, Documentation, JSON
purl: pkg:pypi/llm-atc
Keywords: amd, cuda, gpt, inference, inferentia, llama, llm, llm-serving, llmops, mlops, model-serving, pytorch, rocm, tpu, trainium, transformer, xpu
License: Apache-2.0
Latest release: over 1 year ago
First release: over 1 year ago
Downloads: 238 last month
Stars: 25,904 on GitHub
Forks: 3,780 on GitHub
Total Commits: 2550
Committers: 523
Average commits per author: 4.876
Development Distribution Score (DDS): 0.828
More commit stats: commits.ecosyste.ms
See more repository details: repos.ecosyste.ms
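The two derived figures above can be reproduced from per-committer commit counts. Below is a minimal Python sketch, assuming DDS is computed as 1 minus the top committer's share of all commits (the definition used by ecosyste.ms); the function name and inline numbers are illustrative, not part of this listing.

from collections import Counter

def repo_commit_stats(commits_per_committer: Counter) -> dict:
    # Total commits and distinct committers across the repository.
    total = sum(commits_per_committer.values())
    authors = len(commits_per_committer)
    # Share of commits belonging to the single most active committer.
    top_share = max(commits_per_committer.values()) / total
    return {
        "average_commits_per_author": total / authors,  # e.g. 2550 / 523 ≈ 4.876
        "dds": 1 - top_share,  # assumed DDS definition: 1 - top committer's share
    }

With the figures above (2550 commits, 523 committers) the average works out to about 4.876; reproducing the DDS of 0.828 would additionally require the per-committer breakdown, which this listing does not include.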
Funding links: https://github.com/sponsors/vllm-project, https://opencollective.com/vllm
Last synced: about 18 hours ago