An open API service providing package, version and dependency metadata of many open source software ecosystems and registries.

npmjs.org : llama-cpp-capacitor

A native Capacitor plugin that embeds llama.cpp directly into mobile apps, enabling offline AI inference with a chat-first API design. Supports both simple text generation and advanced chat conversations with system prompts, as well as multimodal processing, text-to-speech (TTS), and LoRA adapters.
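To illustrate what a "chat-first" completion call might look like, here is a minimal TypeScript sketch. The interface and class names below are illustrative assumptions, not the package's actual API; a stand-in context object replaces the native bridge so the shape of a system-prompt-plus-user-turn request can be shown end to end.

```typescript
// Hypothetical message and result shapes for a chat-first API.
// These names are assumptions for illustration, not the plugin's real types.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface CompletionResult {
  text: string;
}

// Stand-in for the native llama.cpp context: echoes the last user
// message instead of running inference, so the example is self-contained.
class FakeLlamaContext {
  async completion(params: { messages: ChatMessage[] }): Promise<CompletionResult> {
    const lastUser = [...params.messages].reverse().find((m) => m.role === "user");
    return { text: `echo: ${lastUser?.content ?? ""}` };
  }
}

// A chat turn: a system prompt sets behavior, then a user message is sent.
async function chat(): Promise<string> {
  const ctx = new FakeLlamaContext();
  const result = await ctx.completion({
    messages: [
      { role: "system", content: "You are a concise assistant." },
      { role: "user", content: "Hello" },
    ],
  });
  return result.text;
}
```

In a real app the context would be created from a GGUF model file on the device; consult the package's README for the actual initialization and completion signatures.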

Registry - Source - Homepage - JSON
purl: pkg:npm/llama-cpp-capacitor
Keywords: capacitor, plugin, native, llama, llama.cpp, ai, machine-learning, offline-ai, text-generation, multimodal, tts, text-to-speech, lora, embeddings, reranking, chat-completion, gguf, large-language-model, llm
License: MIT
Latest release: 22 days ago
First release: about 1 month ago
Downloads: 1,213 last month
Stars: 0 on GitHub
Forks: 0 on GitHub
See more repository details: repos.ecosyste.ms
Last synced: 19 days ago
