@aibrow/node-llama-cpp
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
Ecosystem
npmjs.org
Latest Release
1.7.0 (8 months ago)
Versions
12
Downloads
395 last month
Links
| Registry | npmjs.org |
| Source | Repository |
| Homepage | Homepage |
| JSON API | View JSON |
| CodeMeta | codemeta.json |
Package Details
| PURL | pkg:npm/%40aibrow/node-llama-cpp |
| License | MIT |
| Namespace | aibrow |
| First Release | 10 months ago |
| Last Synced | 16 days ago |
Keywords
llama llama-cpp llama.cpp bindings ai cmake cmake-js prebuilt-binaries llm gguf metal cuda vulkan grammar embedding rerank reranking json-grammar json-schema-grammar functions function-calling token-prediction speculative-decoding temperature minP topK topP seed json-schema raspberry-pi self-hosted local catai mistral deepseek qwen qwq typescript lora batching gpu
Repository
| Stars | 0 on GitHub |
| Forks | 0 on GitHub |