pypi.org "learning-rate-scheduling" keyword
View the packages on the pypi.org package registry that are tagged with the "learning-rate-scheduling" keyword.
lrbench 0.0.0.1
A learning rate recommendation and benchmarking tool.
1 version - Latest release: over 5 years ago - 1 dependent repository - 28 downloads last month - 19 stars on GitHub - 1 maintainer
pytorch_optimizer 3.5.0
Top 9.7% on pypi.org
Optimizer, LR scheduler, and objective function collections in PyTorch.
81 versions - Latest release: about 1 month ago - 1 dependent repository - 144 thousand downloads last month - 288 stars on GitHub - 1 maintainer
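The packages under this keyword all serve the same basic workflow: attach a learning-rate schedule to an optimizer and advance it once per epoch or iteration. As a point of reference, here is a minimal sketch of that pattern using only stock torch.optim (not the API of any package listed on this page); the model, learning rate, and epoch count are placeholder values chosen for illustration.

```python
import torch

# Placeholder model and optimizer, chosen only to make the sketch self-contained.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

# Cosine decay of the learning rate over 100 epochs.
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100)

for epoch in range(100):
    # ... forward pass, loss.backward(), and optimizer.step() would go here ...
    optimizer.zero_grad()
    scheduler.step()  # update the learning rate once per epoch
    current_lr = scheduler.get_last_lr()[0]
```

Collections such as pytorch_optimizer typically expose additional optimizers and schedulers that plug into this same optimizer/scheduler interface.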
abel-pytorch 0.0.1
ABEL Scheduler
1 version - Latest release: about 4 years ago - 91 downloads last month - 3 stars on GitHub - 1 maintainer
optschedule 1.0.0
Flexible parameter scheduler that can be implemented with proprietary and open source optimizers ...
2 versions - Latest release: 7 months ago - 51 downloads last month - 0 stars on GitHub - 1 maintainer
pytorch-warmup 0.2.0
Top 3.4% on pypi.org
A PyTorch Extension for Learning Rate Warmup
5 versions - Latest release: 5 months ago - 12 dependent packages - 81 dependent repositories - 18.2 thousand downloads last month - 390 stars on GitHub - 1 maintainer
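Learning-rate warmup ramps the learning rate from a small value up to its nominal value over the first training steps before the main decay schedule takes over. The sketch below illustrates the idea with stock torch.optim.lr_scheduler.LambdaLR rather than pytorch-warmup's own API; the model, base learning rate, and warmup length are assumed placeholder values.

```python
import torch

model = torch.nn.Linear(10, 1)  # placeholder model
base_lr = 1e-3
optimizer = torch.optim.SGD(model.parameters(), lr=base_lr)

warmup_steps = 500  # illustrative warmup length

def warmup_factor(step: int) -> float:
    # Linearly ramp the LR multiplier from ~0 to 1 over warmup_steps,
    # then hold it at 1 so the base learning rate applies.
    return min(1.0, (step + 1) / warmup_steps)

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=warmup_factor)

for step in range(2000):
    # ... training step: forward pass, loss.backward(), optimizer.step() ...
    scheduler.step()  # advance the warmup schedule once per iteration
```

In practice a warmup schedule like this is usually chained with a decay schedule (e.g. cosine or step decay) that governs the learning rate after the warmup period ends.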
Related Keywords
deep-learning (3), pytorch (3), QHM (1), SPAM (1), StableSPAM (1), SRMM (1), StableAdamW (1), SWATS (1), TAM (1), Tiger (1), TRAC (1), WSAM (1), Yogi (1), BCE (1), BCEFocal (1), Focal (1), FocalCosine (1), SoftF1 (1), Dice (1), LDAM (1), Jaccard (1), Bi-Tempered (1), RAdam (1), Ranger (1), Ranger21 (1), RotoGrad (1), SAM (1), GCSAM (1), LookSAM (1), ScheduleFreeSGD (1), ScheduleFreeAdamW (1), ScheduleFreeRAdam (1), SCION (1), SGDP (1), Shampoo (1), ScalableShampoo (1), SGDW (1), SignSGD (1), SM3 (1), SOAP (1), SopihaH (1), Tversky (1), ranger (1), sam (1), learning rate (1), schedule (1), optimization (1), decay (1), learning (1), parameters (1), training (1), artificial-intelligence (1), gradient-descent (1), machine-learning (1), numerical-analysis (1), numerical-methods (1), python (1), scheduler (1), scheduling (1), adam (1), warmup (1), FocalTversky (1), LovaszHinge (1), bitsandbytes (1), WSD (1), QGaLore (1), adabelief (1), adabound (1), adai (1), adamd (1), adamp (1), adan (1), ademamix (1), diffgrad (1), gradient-centralization (1), lookahead (1), loss-functions (1), madgrad (1), nero (1), radam (1), LEARNING RATE (1), AdEMAMix (1), Simplified-AdEMAMix (1), ADOPT (1), AdaHessian (1), Adai (1), Adalite (1), AdaLomo (1), AdamMini (1), AdamP (1), AdamS (1), Adan (1), AggMo (1), Aida (1), AliG (1), Amos (1), Apollo (1), APOLLO (1), AvaGrad (1), bSAM (1)