proxy.golang.org : github.com/jakobhoeg/nextjs-ollama-llm-ui
Fully-featured, beautiful web interface for Ollama LLMs - built with NextJS. Deploy with a single click.
Registry - Source - Documentation - JSON - codemeta.json
purl: pkg:golang/github.com/jakobhoeg/nextjs-ollama-llm-ui
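The purl above identifies this repository as a module path in the Go ecosystem, which means it can be looked up through the Go module proxy protocol that proxy.golang.org implements. The sketch below is a minimal Go example querying the proxy's documented /@v/list endpoint for this module path; note that a repository like this one, which is primarily a Next.js app, may never have been fetched as a Go module, so the proxy can legitimately answer with 404 or 410.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

// Minimal sketch: ask proxy.golang.org which versions it knows for the
// module named in the purl above. The endpoint layout follows the Go
// module proxy protocol (GOPROXY/<module>/@v/list); this module path is
// already all lowercase, so no "!"-escaping of capital letters is needed.
func main() {
	const module = "github.com/jakobhoeg/nextjs-ollama-llm-ui"
	url := "https://proxy.golang.org/" + module + "/@v/list"

	resp, err := http.Get(url)
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()

	if resp.StatusCode != http.StatusOK {
		// 404 or 410 means the proxy has no cached versions for this path.
		fmt.Println("proxy returned:", resp.Status)
		return
	}

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		fmt.Println("read failed:", err)
		return
	}
	fmt.Printf("known versions:\n%s", body)
}
```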
Keywords: ai, chatbot, gemma, llm, local, localstorage, mistral, mistral-7b, nextjs, nextjs14, offline, ollama, openai, react, shadcn, tailwindcss, typescript
License: MIT
Latest release: about 1 year ago
First release: over 1 year ago
Stars: 450 on GitHub
Forks: 99 on GitHub
See more repository details: repos.ecosyste.ms
Last synced: 29 days ago