ts-tokenizer (pypi.org)
TS Tokenizer is a hybrid (lexicon-based and rule-based) tokenizer designed specifically for Turkish text.
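The hybrid approach named in the description (consult a lexicon first, fall back to rule-based splitting) can be sketched roughly as below. This is an illustration of the technique only, not ts-tokenizer's actual API; the lexicon entries, the regex, and the tokenize function are assumptions made for demonstration.

```python
# Sketch of hybrid (lexicon + rule) tokenization. Illustrative only:
# names and rules here are NOT ts-tokenizer's real API.
import re

# Tiny stand-in lexicon of entries that must survive as single tokens
# (abbreviations etc.); a real lexicon would be far larger.
LEXICON = {"Dr.", "vb.", "T.C."}

# Rule layer: peel leading/trailing punctuation off unprotected tokens.
PUNCT = re.compile(r"^(\W*)(.*?)(\W*)$")

def tokenize(text: str) -> list[str]:
    tokens = []
    for chunk in text.split():
        if chunk in LEXICON:
            # Lexicon lookup wins over the punctuation rules.
            tokens.append(chunk)
            continue
        lead, core, trail = PUNCT.match(chunk).groups()
        tokens.extend(t for t in (lead, core, trail) if t)
    return tokens

print(tokenize("Dr. Ahmet geldi, kitaplar vb. şeyler aldı."))
# ['Dr.', 'Ahmet', 'geldi', ',', 'kitaplar', 'vb.', 'şeyler', 'aldı', '.']
```

Note how the lexicon keeps "Dr." and "vb." intact while the rule layer still detaches the comma and the sentence-final period; that interplay is the point of the hybrid design.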
purl: pkg:pypi/ts-tokenizer
Keywords: turkish tokenizer, tokenizer, turkish, nlp, text-processing, language-processing, tokenization
License: MIT
Latest release: 3 months ago
First release: 11 months ago
Downloads: 450 last month
Stars: 1 on GitHub
Forks: 0 on GitHub
See more repository details: repos.ecosyste.ms
Last synced: 16 days ago