An open API service providing package, version and dependency metadata of many open source software ecosystems and registries.

npmjs.org: custom-koya-node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp, and force a JSON schema on the model output at the generation level.

purl: pkg:npm/custom-koya-node-llama-cpp
Keywords: llama, llama-cpp, llama.cpp, bindings, ai, cmake, cmake-js, prebuilt-binaries, llm, gguf, metal, cuda, grammar, json-grammar, json-schema-grammar, temperature, topK, topP, json-schema, raspberry-pi, self-hosted, local, catai, embedding, function-calling, gpu, nodejs, vulkan
License: MIT
Latest release: 9 months ago
First release: 9 months ago
Downloads: 46 last month
Stars: 1,447 on GitHub
Forks: 122 on GitHub
Total Commits: 174
Committers: 6
Average commits per author: 29.0
Development Distribution Score (DDS): 0.063
More commit stats: commits.ecosyste.ms
See more repository details: repos.ecosyste.ms
Funding links: https://github.com/sponsors/giladgd
Last synced: about 1 hour ago
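The Development Distribution Score above can be reproduced from the commit stats. As an assumption: DDS is commonly defined as one minus the share of commits made by the top committer, and the per-author commit counts below are hypothetical numbers chosen only to match the listed totals (174 commits, 6 committers, DDS 0.063).

```javascript
// Sketch of the DDS formula: 1 - (commits by top committer / total commits).
// A low score means development is concentrated in one author.
function developmentDistributionScore(commitsPerAuthor) {
  const total = commitsPerAuthor.reduce((a, b) => a + b, 0);
  const top = Math.max(...commitsPerAuthor);
  return 1 - top / total;
}

// Hypothetical split summing to the listed 174 commits across 6 committers:
// a DDS of 0.063 implies the top committer made roughly 163 of them.
const dds = developmentDistributionScore([163, 5, 2, 2, 1, 1]);
console.log(dds.toFixed(3)); // "0.063"
```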
