pypi.org "data-parallelism" keyword
Packages on the pypi.org package registry tagged with the "data-parallelism" keyword.
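Data parallelism replicates the model on every worker, feeds each replica a different shard of the batch, and averages gradients after the backward pass. As a minimal, generic sketch of the pattern with PyTorch DistributedDataParallel (not tied to any package below; the gloo backend and single-process fallbacks are assumptions chosen so it also runs locally without a launcher):

```python
# Minimal data-parallel training step with PyTorch DDP.
# Normally launched with: torchrun --nproc_per_node=N script.py
import os
import torch
import torch.distributed as dist
import torch.nn.functional as F
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets RANK/WORLD_SIZE; fall back to a 1-process group for a local demo.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group(
        "gloo",
        rank=int(os.environ.get("RANK", 0)),
        world_size=int(os.environ.get("WORLD_SIZE", 1)),
    )

    model = DDP(torch.nn.Linear(10, 1))             # gradients are all-reduced across ranks
    opt = torch.optim.SGD(model.parameters(), lr=0.1)

    x, y = torch.randn(32, 10), torch.randn(32, 1)  # each rank sees its own shard
    loss = F.mse_loss(model(x), y)
    loss.backward()                                 # DDP averages gradients here
    opt.step()
    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```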
colossalai 0.4.9 (Top 1.7% on pypi.org)
An integrated large-scale model training system with efficient parallelization techniques
38 versions - Latest release: about 2 months ago - 9 dependent packages - 63 dependent repositories - 15.1 thousand downloads last month - 37,740 stars on GitHub - 1 maintainer
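ColossalAI exposes data parallelism (among other strategies) through its Booster/plugin API. A minimal sketch following that documented pattern; exact signatures vary across 0.x releases, so treat this as an assumption-laden outline rather than tested code:

```python
# Sketch of ColossalAI's Booster API with plain DDP data parallelism.
# Assumes launch via torchrun / colossalai run; signatures may differ by version.
import torch
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

colossalai.launch_from_torch()                 # reads rank/world size from the launcher

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = torch.nn.MSELoss()

booster = Booster(plugin=TorchDDPPlugin())     # swap the plugin for ZeRO/Gemini variants
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

x, y = torch.randn(32, 10), torch.randn(32, 1)
loss = criterion(model(x), y)
booster.backward(loss, optimizer)              # plugin-aware backward pass
optimizer.step()
```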
deepspeed 0.16.7 (Top 0.3% on pypi.org)
DeepSpeed library
106 versions - Latest release: 4 days ago - 109 dependent packages - 3,724 dependent repositories - 674 thousand downloads last month - 32,440 stars on GitHub - 3 maintainers
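DeepSpeed wraps the model in a training engine that handles the data-parallel machinery (gradient all-reduce and, optionally, ZeRO partitioning). A minimal sketch; the config values are illustrative only, and the script is meant to be launched with the `deepspeed` CLI:

```python
# Minimal DeepSpeed engine setup; launch with: deepspeed train.py
import torch
import torch.nn.functional as F
import deepspeed

model = torch.nn.Linear(10, 1)
ds_config = {
    "train_batch_size": 32,                    # illustrative values, not a recommendation
    "optimizer": {"type": "Adam", "params": {"lr": 1e-3}},
}

# Returns (engine, optimizer, dataloader, lr_scheduler); the engine is the
# data-parallel wrapper around the model.
engine, optimizer, _, _ = deepspeed.initialize(
    model=model, model_parameters=model.parameters(), config=ds_config
)

x = torch.randn(32, 10).to(engine.device)
y = torch.randn(32, 1).to(engine.device)
loss = F.mse_loss(engine(x), y)
engine.backward(loss)                          # engine handles gradient synchronization
engine.step()
```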
dist-keras 0.2.1 (Top 7.8% on pypi.org)
Distributed deep learning with Apache Spark and Keras.
3 versions - Latest release: over 7 years ago - 1 dependent repository - 568 downloads last month - 622 stars on GitHub - 1 maintainer
colossalai-nightly 2025.4.12
An integrated large-scale model training system with efficient parallelization techniques
109 versions - Latest release: 11 days ago - 2.91 thousand downloads last month - 40,772 stars on GitHub - 1 maintainer
custom-colossalai 0.4.5
An integrated large-scale model training system with efficient parallelization techniques
16 versions - Latest release: 6 months ago - 445 downloads last month - 40,772 stars on GitHub - 1 maintainer
adeepspeed 0.9.2
DeepSpeed library
1 version - Latest release: over 1 year ago - 71 downloads last month - 37,891 stars on GitHub - 1 maintainer
veloce 0.0.1rc2
Veloce: an instant distributed computing library based on the Ray stack
3 versions - Latest release: about 3 years ago - 1 dependent repository - 76 downloads last month - 18 stars on GitHub - 1 maintainer
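Veloce's own API is not shown here, but since it builds on Ray, here is a generic Ray sketch of the underlying data-parallel idea: shard the input and fan the shards out to remote workers:

```python
# Generic data-parallel map with Ray (not veloce's own API).
import ray

ray.init()

@ray.remote
def process_shard(shard):
    # Stand-in for real per-shard work (e.g., a forward/backward pass).
    return sum(x * x for x in shard)

data = list(range(1_000))
shards = [data[i::4] for i in range(4)]        # split into 4 data-parallel shards
results = ray.get([process_shard.remote(s) for s in shards])
print(sum(results))
ray.shutdown()
```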
fleet-x 0.0.8
PaddlePaddle large-model development suite, providing a full-pipeline development toolchain for large language models, cross-modal large models, biocomputing large models, and related domains.
9 versions - Latest release: over 4 years ago - 1 dependent repository - 333 downloads last month - 464 stars on GitHub - 1 maintainer
pipegoose 0.2.0
Large scale 4D parallelism pre-training for 🤗 transformers in Mixture of Experts (still work in progress)
2 versions - Latest release: over 1 year ago - 101 downloads last month - 68 stars on GitHub - 1 maintainer
paddle-fleet 0.1.1
Distributed training package based on PaddlePaddle
1 version - Latest release: over 5 years ago - 1 dependent repository - 61 downloads last month - 464 stars on GitHub - 1 maintainer
fleet-lightning 0.0.3
PaddlePaddle large-model development suite, providing a full-pipeline development toolchain for large language models, cross-modal large models, biocomputing large models, and related domains.
4 versions - Latest release: over 4 years ago - 1 dependent repository - 156 downloads last month - 461 stars on GitHub - 1 maintainer
Related Keywords
model-parallelism (10)
deep-learning (9)
pipeline-parallelism (9)
large-scale (6)
inference (5)
distributed-computing (4)
unsupervised-learning (3)
self-supervised-learning (3)
pretraining (3)
paddlecloud (3)
lightning (3)
paddlepaddle (3)
distributed-training (3)
benchmark (3)
cloud (3)
distributed-algorithm (3)
elastic (3)
fleet-api (3)
ai (3)
big-model (3)
foundation-models (3)
heterogeneous-training (3)
hpc (3)
machine-learning (3)
mixture-of-experts (3)
pytorch (3)
billion-parameters (2)
compression (2)
gpu (2)
trillion-parameters (2)
distributed-optimizers (2)
zero (2)
large-scale-language-modeling (1)
huggingface-transformers (1)
3d-parallelism (1)
megatron (1)
megatron-lm (1)
moe (1)
sequence-parallelism (1)
tensor-parallelism (1)
transformers (1)
zero-1 (1)
online-training (1)
batch-training (1)
easy-to-use (1)
Keras (1)
Deep Learning (1)
Machine Learning (1)
Theano (1)
Tensorflow (1)
Distributed (1)
Apache Spark (1)
apache-spark (1)
data-science (1)
hadoop (1)
keras (1)
optimization-algorithms (1)
tensorflow (1)
distributed (1)
heterogeneity (1)
parameter-server (1)
ray (1)
sparsity (1)