npmjs.org : llama.native.js
Use `npm i --save llama.native.js` to run llama.cpp models on your local machine. The package features a socket.io server and client that can run inference against the host of the model.
purl: pkg:npm/llama.native.js
Keywords: llm, wrapper, huggingface, javascript, llama.cpp, socket.io, dalai, quantized, cpu, text, generation
License: Other
Latest release: almost 2 years ago
First release: almost 2 years ago
Downloads: 52 last month
Last synced: 10 days ago