npmjs.org: @duck4i/llama
Native Node.js plugin to run LLaMA inference directly on your machine with no other dependencies.
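The listing carries no usage documentation beyond this one-line description, so the sketch below is only an orientation aid: the `RunInference` export, its argument order, and the `model.gguf` path are hypothetical placeholders rather than the package's confirmed API; the package README on npm documents the actual interface.

```js
// Hypothetical usage sketch for @duck4i/llama -- export names, signatures,
// and file paths are assumptions for illustration, not the documented API.
const llama = require("@duck4i/llama");

async function main() {
    // "model.gguf" stands in for any locally downloaded GGUF model file.
    const answer = await llama.RunInference(
        "model.gguf",                    // local GGUF model
        "You are a helpful assistant.",  // system prompt
        "What is the capital of France?" // user prompt
    );
    console.log(answer);
}

main().catch(console.error);
```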
purl: pkg:npm/%40duck4i/llama
Keywords: llama, node, gguf, LLM, inference, llamacpp, llm, llm-inference, local, nodejs
License: MIT
Latest release: 3 months ago
First release: 4 months ago
Namespace: duck4i
Downloads: 48 last month
Stars: 0 on GitHub
Forks: 0 on GitHub
See more repository details: repos.ecosyste.ms
Last synced: 8 days ago