pypi.org : llama-cpp-cffi : 0.4.43
Python binding for llama.cpp using cffi
Integrity: sha256-cf8b6f5449eadc71ef02...
purl: pkg:pypi/[email protected]
Dependencies
- attrs >=25.1.0,<26.0.0
- huggingface-hub >=0.28.1,<0.29.0
- jinja2 >=3.1.5,<4.0.0
- protobuf >=5.29.3,<6.0.0
- psutil >=7.0.0,<8.0.0
- sentencepiece >=0.2.0,<0.3.0
- transformers >=4.49.0,<5.0.0

Optional dependencies (extra == "server")
- aiohttp[speedups] >=3.11.12,<4.0.0
- gunicorn ==23.0.0
- openai >=1.63.2,<2.0.0
- uvloop >=0.21.0,<0.22.0
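The optional dependencies above are pulled in only when the "server" extra is requested, e.g. `pip install "llama-cpp-cffi[server]"`. The version constraints use standard PEP 440 specifier syntax, where `>=` is an inclusive lower bound and `<` is an exclusive upper bound. As a sketch of how such a constraint behaves, the snippet below checks candidate versions against the huggingface-hub specifier from the list, using the `packaging` library (an assumption: it is widely available but not part of the Python standard library):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# Constraint string copied from the huggingface-hub entry above.
spec = SpecifierSet(">=0.28.1,<0.29.0")

# Lower bound is inclusive: 0.28.1 itself satisfies the constraint.
print(Version("0.28.1") in spec)  # True

# Upper bound is exclusive: 0.29.0 is rejected.
print(Version("0.29.0") in spec)  # False
```

This is the same matching logic pip applies when resolving the pinned ranges shown in the dependency list.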