npmjs.org : @aibrow/node-llama-cpp
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model's output at the generation level.
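The generation-level JSON-schema enforcement mentioned in the description can be sketched as follows. This assumes the `@aibrow/node-llama-cpp` fork keeps the upstream node-llama-cpp v3 API (`getLlama`, `createGrammarForJsonSchema`, `LlamaChatSession`); the model path is a placeholder and the schema is an illustrative example, not part of the package metadata:

```typescript
import {getLlama, LlamaChatSession} from "@aibrow/node-llama-cpp";

// Placeholder path — point this at any local GGUF model file.
const modelPath = "./models/model.gguf";

const llama = await getLlama();
const model = await llama.loadModel({modelPath});
const context = await model.createContext();
const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});

// Compile a JSON schema into a llama.cpp grammar, so every sampled
// token is constrained to produce output matching the schema.
const grammar = await llama.createGrammarForJsonSchema({
    type: "object",
    properties: {
        name: {type: "string"},
        age: {type: "number"}
    }
});

const answer = await session.prompt(
    "Extract the person's name and age: Ada Lovelace, 36.",
    {grammar}
);

// The grammar guarantees `answer` is valid JSON for the schema,
// so parsing it into a typed object is safe.
const parsed = grammar.parse(answer);
console.log(parsed);
```

Because the constraint is applied while sampling tokens rather than by retrying or post-validating, the model cannot emit output that violates the schema in the first place.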
Links:
- Registry
- Source
- Homepage
- JSON
purl: pkg:npm/%40aibrow/node-llama-cpp
Keywords: llama, llama-cpp, llama.cpp, bindings, ai, cmake, cmake-js, prebuilt-binaries, llm, gguf, metal, cuda, vulkan, grammar, embedding, rerank, reranking, json-grammar, json-schema-grammar, functions, function-calling, token-prediction, speculative-decoding, temperature, minP, topK, topP, seed, json-schema, raspberry-pi, self-hosted, local, catai, mistral, deepseek, qwen, qwq, typescript, lora, batching, gpu
License: MIT
Latest release: 12 days ago
First release: 25 days ago
Namespace: aibrow
Downloads: 578 last month
Stars: 0 on GitHub
Forks: 0 on GitHub
See more repository details: repos.ecosyste.ms
Funding links: https://github.com/sponsors/giladgd
Last synced: 42 minutes ago