llama-cpp-cffi 0.2.3 (pypi.org)
Python binding for llama.cpp using cffi
Integrity: sha256-767e36a50d567da4e0da...
purl: pkg:pypi/[email protected]
Dependencies
- aiohttp[speedups] >=3.10.10,<4.0.0 (optional, extra == "openai")
- openai >=1.53.0,<2.0.0 (optional, extra == "openai")
- uvloop >=0.21.0,<0.22.0 (optional, extra == "uvloop"; install examples for the extras follow this list)
- attrs >=24.2.0,<25.0.0
- cffi >=1.17.1,<2.0.0
- huggingface-hub >=0.26.2,<0.27.0
- jinja2 >=3.1.4,<4.0.0
- numba >=0.60.0,<0.61.0
- protobuf >=5.28.3,<6.0.0
- psutil >=6.1.0,<7.0.0
- sentencepiece >=0.2.0,<0.3.0
- setuptools >=75.3.0,<76.0.0
- transformers >=4.46.1,<5.0.0
- vulkan >=1.3.275.1,<2.0.0.0
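The entries marked optional are gated behind the "openai" and "uvloop" extras; a plain install pulls only the required dependencies. A minimal install sketch, assuming a standard pip setup (the version pin mirrors the purl above, and the extra names come straight from the dependency list):

# required dependencies only
pip install llama-cpp-cffi==0.2.3
# add the "openai" extra (pulls in openai and aiohttp[speedups])
pip install "llama-cpp-cffi[openai]==0.2.3"
# add both extras, including uvloop as an alternative asyncio event loop
pip install "llama-cpp-cffi[openai,uvloop]==0.2.3"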