npmjs.org : llama.native.js
Use `npm i --save llama.native.js` to run llama.cpp models on your local machine. The package features a socket.io server and client that perform inference against the host running the model.
Registry
purl: pkg:npm/llama.native.js
Keywords: llm, wrapper, huggingface, javascript, llama.cpp, socket.io, dalai, quantized, cpu, text, generation
License: Other
Latest release: almost 2 years ago
First release: almost 2 years ago
Downloads: 52 last month
Last synced: 13 days ago