npmjs.org: llm-guard
A TypeScript library for validating and securing LLM prompts
purl: pkg:npm/llm-guard
Keywords: llm, security, validation, prompt, jailbreak, pii, toxicity, profanity, prompt-injection, relevance
License: MIT
Latest release: 14 days ago
First release: 14 days ago
Downloads: 198 last month
Stars: 0 on GitHub
Forks: 0 on GitHub
Last synced: 6 days ago
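The keyword list (prompt-injection, pii, toxicity, jailbreak) suggests a scanner-style validation pipeline: run a prompt through a set of checks and collect flags before it reaches the model. As a rough illustration only — every name below is hypothetical and not llm-guard's actual API — such a check might look like:

```typescript
// Hypothetical sketch of prompt validation as hinted by the package
// keywords. scanPrompt, ScanResult, and the pattern lists are all
// invented for illustration; they are NOT part of llm-guard's API.

type ScanResult = { valid: boolean; flags: string[] };

// Toy detectors: a PII pattern (email) and known injection phrases.
const EMAIL_RE = /[\w.+-]+@[\w-]+\.[\w.]+/;
const INJECTION_PHRASES = [
  "ignore previous instructions",
  "disregard the system prompt",
];

function scanPrompt(prompt: string): ScanResult {
  const flags: string[] = [];
  if (EMAIL_RE.test(prompt)) flags.push("pii:email");
  const lower = prompt.toLowerCase();
  for (const phrase of INJECTION_PHRASES) {
    if (lower.includes(phrase)) flags.push("prompt-injection");
  }
  return { valid: flags.length === 0, flags };
}
```

A real library in this space would typically layer more scanners (toxicity, profanity, relevance) behind the same flag-collecting interface, so callers can decide whether to block, redact, or log a flagged prompt.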