Ecosyste.ms: Packages
An open API service providing package, version and dependency metadata of many open source software ecosystems and registries.
Packages on pypi.org with the "learning-rate-scheduling" keyword
lrbench 0.0.0.1
A learning rate recommending and benchmarking tool.
1 version - Latest release: over 4 years ago - 1 dependent repository - 9 downloads last month - 19 stars on GitHub - 1 maintainer
pytorch_optimizer 2.12.0
Top 9.7% on pypi.org
optimizer & lr scheduler & objective function collections in PyTorch
66 versions - Latest release: 8 months ago - 1 dependent repository - 46.6 thousand downloads last month - 194 stars on GitHub - 1 maintainer
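Libraries like this one bundle optimizers with learning-rate schedulers such as cosine annealing. As a hedged illustration of what such a schedule computes (plain Python, hypothetical function and parameter names, not this package's API):

```python
import math

def cosine_annealing_lr(step, total_steps, base_lr, min_lr=0.0):
    """Cosine annealing: decay base_lr toward min_lr over total_steps.

    Follows the usual half-cosine shape: fast decay in the middle of
    training, gentle decay near the start and end.
    """
    cosine = 0.5 * (1.0 + math.cos(math.pi * step / total_steps))
    return min_lr + (base_lr - min_lr) * cosine

# Starts at base_lr, passes through the midpoint at half training,
# and reaches min_lr on the final step.
print(cosine_annealing_lr(0, 100, 0.1))    # start of training: 0.1
print(cosine_annealing_lr(50, 100, 0.1))   # midpoint: 0.05
print(cosine_annealing_lr(100, 100, 0.1))  # end of training: 0.0
```

In a real training loop the returned value would be written into the optimizer's learning-rate field once per step.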
optschedule 0.1.0
Flexible parameter scheduler that can be implemented with proprietary and open source optimizers ...
1 version - Latest release: about 1 year ago - 11 downloads last month - 0 stars on GitHub - 1 maintainer
abel-pytorch 0.0.1
ABEL Scheduler
1 version - Latest release: about 3 years ago - 48 downloads last month - 3 stars on GitHub - 1 maintainer
pytorch-warmup 0.1.1
Top 3.4% on pypi.org
A PyTorch Extension for Learning Rate Warmup
4 versions - Latest release: over 1 year ago - 12 dependent packages - 81 dependent repositories - 137 thousand downloads last month - 363 stars on GitHub - 1 maintainer
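Learning-rate warmup, which this package implements for PyTorch, ramps the rate up from zero at the start of training to stabilize early updates. A minimal sketch of the linear variant in plain Python (hypothetical names, not this package's API):

```python
def linear_warmup_lr(step, warmup_steps, base_lr):
    """Linear warmup: ramp from 0 to base_lr over warmup_steps, then hold."""
    if step >= warmup_steps:
        return base_lr
    return base_lr * step / warmup_steps

# Ramp up over the first 5 steps, then hold the base rate.
print([round(linear_warmup_lr(s, 5, 0.1), 3) for s in range(8)])
# [0.0, 0.02, 0.04, 0.06, 0.08, 0.1, 0.1, 0.1]
```

In practice warmup is composed with a decay schedule (such as cosine annealing) that takes over after the warmup phase ends.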
Related Keywords
deep-learning (3), pytorch (3), Yogi (1), BCE (1), BCEFocal (1), Focal (1), FocalCosine (1), SoftF1 (1), Dice (1), LDAM (1), Jaccard (1), Bi-Tempered (1), Tversky (1), FocalTversky (1), LovaszHinge (1), bitsandbytes (1), adabelief (1), adabound (1), WSAM (1), Tiger (1), SWATS (1), SRMM (1), SopihaH (1), SM3 (1), SignSGD (1), SGDW (1), ScalableShampoo (1), Shampoo (1), SGDP (1), ScheduleFreeAdamW (1), ScheduleFreeSGD (1), SAM (1), bSAM (1), warmup (1), adam (1), learning rate (1), scheduling (1), scheduler (1), python (1), numerical-methods (1), numerical-analysis (1), machine-learning (1), gradient-descent (1), artificial-intelligence (1), training (1), parameters (1), learning (1), decay (1), optimization (1), schedule (1), sam (1), ranger (1), radam (1), nero (1), madgrad (1), loss-functions (1), lookahead (1), gradient-centralization (1), diffgrad (1), chebyshev (1), adan (1), adamp (1), adamd (1), adai (1), AvaGrad (1), Apollo (1), Amos (1), AliG (1), Aida (1), AggMo (1), Adan (1), AdamS (1), AdamP (1), Adalite (1), Adai (1), AdaHessian (1), AdaSmooth (1), AdaPNM (1), AdaNorm (1), AdaMod (1), AdaMax (1), AdaFactor (1), AdaDelta (1), AdaBound (1), AdaBelief (1), AccSGD (1), ASGD (1), A2Grad (1), lr scheduler (1), optimizer (1), learning-rate-benchmarking (1), DEEP LEARNING (1), TRAINING (1), LEARNING RATE (1), RotoGrad (1), Ranger21 (1), Ranger (1), RAdam (1), QHM (1), QHAdam (1)
1