Top 1.5% downloads on pypi.org
Top 9.6% forks on pypi.org
pypi.org : pytorch_optimizer
optimizer & lr scheduler & objective function collections in PyTorch
Registry - Source - Documentation - JSON
purl: pkg:pypi/pytorch-optimizer
Keywords: pytorch, deep-learning, optimizer, lr scheduler, A2Grad, ASGD, AccSGD, AdaBelief, AdaBound, AdaDelta, AdaFactor, AdaGC, AdaMax, AdamG, AdaMod, AdaNorm, AdaPNM, AdaSmooth, AdEMAMix, Simplified-AdEMAMix, ADOPT, AdaHessian, Adai, Adalite, AdaLomo, AdamMini, AdamP, AdamS, Adan, AggMo, Aida, AliG, Amos, Apollo, APOLLO, AvaGrad, bSAM, CAME, DAdaptAdaGrad, DAdaptAdam, DAdaptAdan, DAdaptSGD, DAdaptLion, DeMo, DiffGrad, EXAdam, FAdam, FOCUS, Fromage, FTRL, GaLore, Grams, Gravity, GrokFast, GSAM, Kate, Lamb, LaProp, LARS, Lion, LOMO, Lookahead, MADGRAD, MARS, MSVAG, Muon, Nero, NovoGrad, OrthoGrad, PAdam, PCGrad, PID, PNM, Prodigy, PSGD, QHAdam, QHM, RAdam, Ranger, Ranger21, RotoGrad, SAM, GCSAM, LookSAM, ScheduleFreeSGD, ScheduleFreeAdamW, ScheduleFreeRAdam, SCION, SGDP, Shampoo, ScalableShampoo, SGDW, SignSGD, SM3, SOAP, SophiaH, SPAM, StableSPAM, SRMM, StableAdamW, SWATS, TAM, Tiger, TRAC, WSAM, Yogi, BCE, BCEFocal, Focal, FocalCosine, SoftF1, Dice, LDAM, Jaccard, Bi-Tempered, Tversky, FocalTversky, LovaszHinge, bitsandbytes, WSD, QGaLore, adabelief, adabound, adai, adamd, adamp, adan, ademamix, diffgrad, gradient-centralization, learning-rate-scheduling, lookahead, loss-functions, madgrad, nero, radam, ranger, sam
License: Apache-2.0
Latest release: about 1 month ago
First release: over 3 years ago
Dependent repositories: 1
Downloads: 76,117 last month
Stars: 278 on GitHub
Forks: 24 on GitHub
Total commits: 1,587
Committers: 3
Average commits per author: 529.0
Development Distribution Score (DDS): 0.004 (the fraction of commits not made by the most active committer; a score this low means development is concentrated in a single maintainer)
More commit stats: commits.ecosyste.ms
See more repository details: repos.ecosyste.ms
Last synced: about 1 month ago