An open API service providing package, version and dependency metadata of many open source software ecosystems and registries.

npmjs.org: custom-koya-node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Force a JSON schema on the model output at the generation level.

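A minimal usage sketch of what the description above refers to, assuming this fork mirrors the upstream node-llama-cpp v3 API (getLlama, createGrammarForJsonSchema, LlamaChatSession); the import name, model path, and JSON schema below are placeholders, not confirmed against this package's actual exports.

```typescript
// Assumes the fork re-exports the upstream node-llama-cpp API.
import {getLlama, LlamaChatSession} from "custom-koya-node-llama-cpp";

// Load a local GGUF model and open a chat session (placeholder path).
const llama = await getLlama();
const model = await llama.loadModel({modelPath: "./models/model.gguf"});
const context = await model.createContext();
const session = new LlamaChatSession({contextSequence: context.getSequence()});

// Build a grammar from a JSON schema so the output is constrained
// at the generation level rather than validated after the fact.
const grammar = await llama.createGrammarForJsonSchema({
    type: "object",
    properties: {
        answer: {type: "string"},
        confidence: {type: "number"}
    }
});

const response = await session.prompt("Is water wet?", {grammar});
const parsed = grammar.parse(response); // object matching the schema
console.log(parsed);
```
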
purl: pkg:npm/custom-koya-node-llama-cpp
Keywords: llama, llama-cpp, llama.cpp, bindings, ai, cmake, cmake-js, prebuilt-binaries, llm, gguf, metal, cuda, grammar, json-grammar, json-schema-grammar, temperature, topK, topP, json-schema, raspberry-pi, self-hosted, local, catai, embedding, function-calling, gpu, nodejs, vulkan
License: MIT
Latest release: about 1 year ago
First release: about 1 year ago
Downloads: 20 last month
Stars: 1,682 on GitHub
Forks: 146 on GitHub
Total Commits: 174
Committers: 6
Average commits per author: 29.0
Development Distribution Score (DDS): 0.063
More commit stats: commits.ecosyste.ms
See more repository details: repos.ecosyste.ms
Funding links: https://github.com/sponsors/giladgd
Last synced: about 19 hours ago
