Top 1.3% downloads on pypi.org
Top 0.7% dependent packages on pypi.org
Top 0.4% dependent repos on pypi.org
Top 0.1% forks on pypi.org
Top 1.6% docker downloads on pypi.org
pypi.org: pytorch-transformers
Repository of pre-trained NLP Transformer models: BERT & RoBERTa, GPT & GPT-2, Transformer-XL, XLNet and XLM
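For orientation, a minimal usage sketch in the style of the library's quickstart, assuming the bert-base-uncased checkpoint is available for download; this is illustrative, not the package's canonical example.

import torch
from pytorch_transformers import BertModel, BertTokenizer

# Load a pre-trained tokenizer and model by checkpoint name
# ('bert-base-uncased' is one of the published checkpoints).
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')
model.eval()

# Encode a sentence and run it through the encoder.
input_ids = torch.tensor([tokenizer.encode("Hello, world!")])
with torch.no_grad():
    outputs = model(input_ids)

last_hidden_states = outputs[0]  # shape: (batch, sequence_length, hidden_size)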
Links: Registry, Source, Documentation, JSON
purl: pkg:pypi/pytorch-transformers
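The purl above follows the Package URL (purl) spec. As a sketch, it can be parsed with the packageurl-python helper library (an assumption for illustration, not something this listing depends on), and the package name it yields maps onto PyPI's JSON metadata endpoint.

from packageurl import PackageURL  # pip install packageurl-python

# Parse the Package URL shown in this listing into its components.
purl = PackageURL.from_string("pkg:pypi/pytorch-transformers")
print(purl.type)  # "pypi"
print(purl.name)  # "pytorch-transformers"

# The same name can be used against PyPI's JSON metadata API.
metadata_url = f"https://pypi.org/pypi/{purl.name}/json"
print(metadata_url)  # https://pypi.org/pypi/pytorch-transformers/json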
Keywords: NLP, deep, learning, transformer, pytorch, BERT, GPT, GPT-2, google, openai, CMU, bert, deep-learning, flax, hacktoberfest, jax, language-model, language-models, machine-learning, model-hub, natural-language-processing, nlp, nlp-library, pretrained-models, python, pytorch-transformers, seq2seq, speech-recognition, tensorflow
License: Apache-2.0
Latest release: over 5 years ago
First release: almost 6 years ago
Dependent packages: 16
Dependent repositories: 772
Downloads: 27,221 last month
Stars: 129,185 on GitHub
Forks: 25,613 on GitHub
Docker dependents: 57
Docker downloads: 3,735
Total commits: 14,513
Committers: 2,380
Average commits per author: 6.098
Development Distribution Score (DDS): 0.93 (see the sketch below)
More commit stats: commits.ecosyste.ms
See more repository details: repos.ecosyste.ms
Last synced: about 3 hours ago
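As a back-of-the-envelope check on the commit figures above, and assuming the Development Distribution Score is one minus the share of commits made by the most active committer (the usual definition; treat it as an assumption here), the listed numbers fit together as follows.

# Figures from the listing above.
total_commits = 14_513
committers = 2_380

# Average commits per author, as reported (6.098).
avg_commits = total_commits / committers
print(round(avg_commits, 3))  # 6.098

# Assuming DDS = 1 - (top committer's commits / total commits),
# a DDS of 0.93 implies the most active committer authored about 7%
# of all commits, i.e. roughly a thousand commits.
dds = 0.93
top_committer_commits = round((1 - dds) * total_commits)
print(top_committer_commits)  # ~1016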