npmjs.org: @isdk/llama-node
Low-level Node.js bindings for llama.cpp. Core library for running LLMs locally with native performance and hardware-acceleration support.
purl: pkg:npm/%40isdk/llama-node
Keywords: llama, llama-cpp, llama.cpp, bindings, native, napi, addon, ai, llm, gguf, cmake, cmake-js, prebuilt-binaries, metal, cuda, vulkan, gpu, cpu, inference, tokenize, detokenize, embedding, lora, adapter, context, model, self-hosted, local, offline, typescript, low-level, core, library
License: MIT
Latest release: 2 months ago
First release: 2 months ago
Namespace: isdk
Downloads: 58 last month
Last synced: 22 days ago
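
This registry page does not document the package's actual API surface, so the following TypeScript sketch is purely illustrative of how a low-level llama.cpp binding is typically used for local GGUF inference. Every name here (`loadModel`, `createContext`, `tokenize`, `evaluate`, `detokenize`, `dispose`, and all option fields) is an assumption inferred from the package description and keywords, not the library's confirmed interface.

```typescript
// Hypothetical usage sketch for @isdk/llama-node.
// All imported names and methods below are ASSUMPTIONS, not the documented API.
import { loadModel } from "@isdk/llama-node";

async function main(): Promise<void> {
  // Load a local GGUF model file (GGUF appears among the package keywords).
  const model = await loadModel({ modelPath: "./models/example.Q4_K_M.gguf" });

  // Low-level bindings usually separate context creation, tokenization,
  // evaluation, and detokenization into distinct steps.
  const ctx = model.createContext({ contextSize: 2048 });

  const tokens = ctx.tokenize("Explain llama.cpp in one sentence.");
  const output = await ctx.evaluate(tokens, { maxTokens: 64 });
  console.log(ctx.detokenize(output));

  // Native-addon wrappers commonly require explicit resource cleanup.
  ctx.dispose();
  model.dispose();
}

main().catch(console.error);
```

The sketch only illustrates the general workflow implied by the keywords (model, context, tokenize, detokenize, inference); consult the package's own README on npm for the real function names and options.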