pypi.org : llama-cpp-cffi : 0.1.3
Python binding for llama.cpp using cffi
Integrity: sha256-fdcbf46b1885e9e9a062...
purl: pkg:pypi/[email protected]
Dependencies
- aiohttp[speedups] >=3.9.5,<4.0.0 (optional, extra == "openai")
- openai >=1.35.15,<2.0.0 (optional, extra == "openai")
- uvloop >=0.19.0,<0.20.0 (optional, extra == "uvloop")
- attrs >=23.2.0,<24.0.0
- cffi >=1.16.0,<2.0.0
- huggingface-hub >=0.24.0,<0.25.0
- jinja2 >=3.1.4,<4.0.0
- numba >=0.60.0,<0.61.0
- protobuf >=5.27.2,<6.0.0
- psutil >=6.0.0,<7.0.0
- sentencepiece >=0.2.0,<0.3.0
- setuptools >=71.0.3,<72.0.0
- transformers >=4.42.4,<5.0.0
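The optional dependencies are grouped under two extras, "openai" and "uvloop"; installing with pip install "llama-cpp-cffi[openai]==0.1.3" pulls in aiohttp[speedups] and openai in addition to the required dependencies. A small sketch for listing the extras declared by an installed copy of the package:

```python
# List the optional extras declared by the installed llama-cpp-cffi
# distribution (expected for 0.1.3: "openai" and "uvloop").
from importlib.metadata import metadata

dist_meta = metadata("llama-cpp-cffi")
print(dist_meta.get_all("Provides-Extra"))
```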