npmjs.org : tokensize
The `tokenizer` function uses the `js-tiktoken` library to encode the input string into tokens with the GPT-2 encoding scheme. It then decodes the tokens back into strings and maps each token to its position in the input string using a `mapTokensToChunks` helper.
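A minimal sketch of the behavior described above, using `js-tiktoken`'s `getEncoding("gpt2")`. The `TokenChunk` shape, the `tokenize` function name, and the offset bookkeeping are assumptions for illustration, not the package's actual implementation:

```ts
import { getEncoding } from "js-tiktoken";

// Hypothetical result shape: one entry per token with its decoded text
// and its character offsets in the original input.
interface TokenChunk {
  token: number; // token id
  text: string;  // decoded text for this token
  start: number; // offset of the chunk in the input string
  end: number;   // offset one past the last character of the chunk
}

export function tokenize(input: string): TokenChunk[] {
  const enc = getEncoding("gpt2");
  const tokens = enc.encode(input);

  // Decode token-by-token and accumulate character offsets. This assumes
  // each token decodes to a contiguous slice of the input, which holds for
  // plain ASCII text but can break when multi-byte characters are split
  // across tokens.
  let offset = 0;
  return tokens.map((token) => {
    const text = enc.decode([token]);
    const chunk = { token, text, start: offset, end: offset + text.length };
    offset += text.length;
    return chunk;
  });
}
```

For example, `tokenize("Counting tokens is easy.")` would return one entry per GPT-2 token, each carrying the decoded text plus its start and end offsets, which is enough to highlight token boundaries in the original string.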
Registry
purl: pkg:npm/tokensize
License: ISC
Latest release: over 1 year ago
First release: over 1 year ago
Downloads: 6 last month
Last synced: 28 days ago