Ecosyste.ms: Packages
An open API service providing package, version and dependency metadata of many open source software ecosystems and registries.
proxy.golang.org "stopwords" keyword
github.com/rekram1-node/tokenizer v0.4.0
Natural Language Processing (NLP) Tokenization Library designed for English. Fast, Lean, Customiza...
Top 8.6% on proxy.golang.org - 4 versions - Latest release: over 1 year ago - 3 stars on GitHub
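The entry above describes an English tokenizer. As a rough illustration of what such a library does (this is a minimal stdlib sketch, not the tokenizer package's actual API), a tokenizer lower-cases text and splits it into word tokens while keeping contractions intact:

```go
package main

import (
	"fmt"
	"strings"
	"unicode"
)

// Tokenize lower-cases the input and splits it on any rune that is
// not a letter, digit, or apostrophe; keeping the apostrophe lets
// contractions like "don't" survive as single tokens.
func Tokenize(s string) []string {
	return strings.FieldsFunc(strings.ToLower(s), func(r rune) bool {
		return !unicode.IsLetter(r) && !unicode.IsDigit(r) && r != '\''
	})
}

func main() {
	fmt.Println(Tokenize("Don't panic, it's only NLP!"))
	// [don't panic it's only nlp]
}
```

A real tokenizer adds rules for abbreviations, URLs, and punctuation-sensitive tokens, but the split-and-normalize core is the same.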
github.com/bbalet/stopwords v1.0.0
Package stopwords allows you to customize the list of stopwords. Package stopwords implements the...
Top 2.3% on proxy.golang.org - 1 version - Latest release: over 6 years ago - 33 dependent packages - 42 dependent repositories - 117 stars on GitHub
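Stopword removal, the technique this package is built around, is straightforward to sketch with the standard library alone (this is an illustrative sketch, not the bbalet/stopwords API; the sample word set is a tiny assumption, while real packages ship large per-language lists):

```go
package main

import (
	"fmt"
	"strings"
)

// english is a tiny sample stop-word set for illustration only.
var english = map[string]bool{
	"the": true, "a": true, "of": true, "and": true, "is": true,
}

// RemoveStopwords drops every word whose lower-cased form
// appears in the given stop-word set.
func RemoveStopwords(words []string, stop map[string]bool) []string {
	kept := make([]string, 0, len(words))
	for _, w := range words {
		if !stop[strings.ToLower(w)] {
			kept = append(kept, w)
		}
	}
	return kept
}

func main() {
	in := strings.Fields("the removal of stop words is simple")
	fmt.Println(RemoveStopwords(in, english))
	// [removal stop words simple]
}
```

Making the stop-word set a parameter is what "customize the list of stopwords" amounts to: callers can pass their own map per language or domain.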
github.com/cvcio/go-plagiarism v0.2.2
Plagiarism detection using stopword n-grams.
Top 9.9% on proxy.golang.org - 4 versions - Latest release: about 3 years ago - 1 dependent repository - 6 stars on GitHub
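The stopword n-gram idea behind this package can be sketched as follows (a minimal sketch under assumed details, not go-plagiarism's implementation: keep only the stop words of each text in order, take n-grams over that sequence, and compare the sets — here with Jaccard similarity):

```go
package main

import (
	"fmt"
	"strings"
)

// stop is a tiny sample stop-word set for illustration.
var stop = map[string]bool{
	"the": true, "of": true, "in": true, "a": true, "and": true, "to": true,
}

// stopwordNgrams keeps only the stop words of a text, in order,
// and returns the set of n-grams over that sequence.
func stopwordNgrams(text string, n int) map[string]bool {
	var seq []string
	for _, w := range strings.Fields(strings.ToLower(text)) {
		if stop[w] {
			seq = append(seq, w)
		}
	}
	grams := map[string]bool{}
	for i := 0; i+n <= len(seq); i++ {
		grams[strings.Join(seq[i:i+n], " ")] = true
	}
	return grams
}

// Jaccard returns |A ∩ B| / |A ∪ B| for two n-gram sets.
func Jaccard(a, b map[string]bool) float64 {
	shared := 0
	for g := range a {
		if b[g] {
			shared++
		}
	}
	union := len(a) + len(b) - shared
	if union == 0 {
		return 0
	}
	return float64(shared) / float64(union)
}

func main() {
	a := stopwordNgrams("the law of the land and the will of the people", 3)
	b := stopwordNgrams("the law of the sea and the will of the court", 3)
	fmt.Printf("%.2f\n", Jaccard(a, b))
	// 1.00
}
```

Note that the two sentences differ in every content word yet score 1.00: the stopword skeleton is identical, which is exactly why this signal survives word-substitution paraphrasing.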
github.com/yihleego/trie v0.0.0-20220914121334-78377532f78e
📒 An Aho-Corasick algorithm based string-searching utility for Go. It supports tokenization, igno...
Top 7.7% on proxy.golang.org - 1 version - Latest release: over 1 year ago - 1 dependent repository - 33 stars on GitHub
github.com/stopwords-iso/stopwords-iso v0.4.0
A stopwords collection for all languages.
Top 6.2% on proxy.golang.org - 4 versions - Latest release: over 7 years ago - 373 stars on GitHub
Related Keywords
go (3), golang (3), language (1), string-searching (1), sensitive (1), keywords (1), java (1), aho-corasick (1), plagiarism-detection (1), plagiarism (1), n-grams (1), algorithm (1), simhash (1), levenshtein-distance (1), levenshtein (1), golang-package (1), golang-library (1), distance (1), tokenizer (1), tokenization (1), token (1), speed (1), nlp (1), natural-language-processing (1), minimal (1), machine-learning (1), fast (1), customization (1), contractions (1), blazingly-fast (1)