Ecosyste.ms: Packages
An open API service providing package, version and dependency metadata of many open source software ecosystems and registries.
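The listings below can also be fetched programmatically from the service's REST API. As a minimal sketch, the helpers below build plausible lookup URLs; the exact endpoint paths are assumptions modeled on common REST conventions, not verified routes, so check the Ecosyste.ms API documentation before relying on them.

```python
# Sketch: building lookup URLs for the Ecosyste.ms packages API.
# NOTE: the /registries/... paths are assumptions based on typical REST
# layouts; consult the official API docs for the authoritative routes.
from urllib.parse import quote

BASE = "https://packages.ecosyste.ms/api/v1"

def keyword_url(registry: str, keyword: str) -> str:
    """URL listing packages in `registry` tagged with `keyword`."""
    return f"{BASE}/registries/{quote(registry)}/packages?keyword={quote(keyword)}"

def package_url(registry: str, name: str) -> str:
    """URL for a single package's version and dependency metadata."""
    return f"{BASE}/registries/{quote(registry)}/packages/{quote(name)}"

print(keyword_url("pypi.org", "pipeline-parallelism"))
```

Fetching either URL with any HTTP client would return JSON metadata of the kind shown in the entries below (versions, dependents, download counts).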
Packages on pypi.org with the "pipeline-parallelism" keyword
petals 2.2.0 (Top 4.8% on pypi.org)
Easy way to efficiently run 100B+ language models without high-end GPUs
18 versions - Latest release: 9 months ago - 4 dependent packages - 2 dependent repositories - 516 downloads last month - 8,730 stars on GitHub - 3 maintainers
test-petals 2.2.0.post1
Easy way to efficiently run 100B+ language models without high-end GPUs
1 version - Latest release: 6 months ago - 19 downloads last month - 8,203 stars on GitHub - 1 maintainer
colossalai-nightly 2024.5.18
An integrated large-scale model training system with efficient parallelization techniques
70 versions - Latest release: 16 days ago - 519 downloads last month - 37,740 stars on GitHub - 1 maintainer
paddle-fleet 0.1.1
Distributed Training Package Based on PaddlePaddle
1 version - Latest release: over 4 years ago - 1 dependent repository - 26 downloads last month - 425 stars on GitHub - 1 maintainer
colossalai 0.3.7 (Top 1.7% on pypi.org)
An integrated large-scale model training system with efficient parallelization techniques
26 versions - Latest release: about 1 month ago - 9 dependent packages - 63 dependent repositories - 10.2 thousand downloads last month - 37,740 stars on GitHub - 1 maintainer
fleet-x 0.0.8
PaddlePaddle large-model development suite, providing a full-pipeline development toolchain for large language models, cross-modal large models, bio-computing large models, and related domains.
9 versions - Latest release: over 3 years ago - 1 dependent repository - 70 downloads last month - 422 stars on GitHub - 1 maintainer
pipegoose 0.2.0
Large scale 4D parallelism pre-training for 🤗 transformers in Mixture of Experts (still a work in progress)
2 versions - Latest release: 7 months ago - 12 downloads last month - 68 stars on GitHub - 1 maintainer
deepspeed 0.14.2 (Top 0.3% on pypi.org)
DeepSpeed library
90 versions - Latest release: about 1 month ago - 109 dependent packages - 3,724 dependent repositories - 509 thousand downloads last month - 32,440 stars on GitHub - 3 maintainers
torchgpipe 0.0.7 (Top 5.7% on pypi.org)
GPipe for PyTorch
8 versions - Latest release: over 3 years ago - 1 dependent package - 4 dependent repositories - 363 downloads last month - 779 stars on GitHub - 2 maintainers
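Several entries above (torchgpipe, deepspeed, colossalai) implement GPipe-style pipeline parallelism: a model is split into sequential stages on different devices, and the input batch is split into micro-batches so stages overlap instead of idling. The scheduling idea can be sketched without PyTorch; the following is an illustrative simulation, not torchgpipe's actual code.

```python
# Pure-Python sketch of a GPipe-style forward schedule: stage s processes
# micro-batch m at clock tick s + m, so different stages work on different
# micro-batches concurrently instead of waiting for one big batch.
def gpipe_forward_schedule(num_stages: int, num_microbatches: int):
    """Return a list of clock ticks; each tick lists the (stage, microbatch)
    pairs that run in parallel on separate devices at that tick."""
    ticks = []
    for t in range(num_stages + num_microbatches - 1):
        ticks.append([(s, t - s) for s in range(num_stages)
                      if 0 <= t - s < num_microbatches])
    return ticks

# 3 stages, 4 micro-batches: 3 + 4 - 1 = 6 ticks, versus 3 * 4 = 12 steps
# if each micro-batch had to traverse the whole pipeline alone.
for t, work in enumerate(gpipe_forward_schedule(3, 4)):
    print(t, work)
```

In torchgpipe itself, the same overlap is obtained by wrapping an `nn.Sequential` model, along the lines of `GPipe(model, balance=[...], chunks=8)` per its documentation, where `chunks` is the micro-batch count simulated above.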
adeepspeed 0.9.2
DeepSpeed library
1 version - Latest release: 10 months ago - 32 downloads last month - 32,440 stars on GitHub - 1 maintainer
fleet-lightning 0.0.3
PaddlePaddle large-model development suite, providing a full-pipeline development toolchain for large language models, cross-modal large models, bio-computing large models, and related domains.
4 versions - Latest release: almost 4 years ago - 1 dependent repository - 36 downloads last month - 422 stars on GitHub - 1 maintainer
Related Keywords
deep-learning (9)
model-parallelism (9)
data-parallelism (8)
large-scale (5)
pytorch (5)
inference (4)
machine-learning (4)
distributed-training (3)
benchmark (3)
cloud (3)
distributed-algorithm (3)
elastic (3)
tensor-parallelism (3)
fleet-api (3)
lightning (3)
paddlecloud (3)
paddlepaddle (3)
pretraining (3)
self-supervised-learning (3)
unsupervised-learning (3)
mixture-of-experts (3)
zero (2)
trillion-parameters (2)
gpu (2)
compression (2)
billion-parameters (2)
bloom (2)
chatbot (2)
distributed-systems (2)
falcon (2)
gpt (2)
guanaco (2)
language-models (2)
large-language-models (2)
llama (2)
llama2 (2)
neural-networks (2)
nlp (2)
pretrained-models (2)
transformer (2)
volunteer-computing (2)
ai (2)
big-model (2)
distributed-computing (2)
foundation-models (2)
heterogeneous-training (2)
hpc (2)
gpipe (1)
checkpointing (1)
parallelism (1)
zero-1 (1)
transformers (1)
sequence-parallelism (1)
moe (1)
megatron-lm (1)
megatron (1)
large-scale-language-modeling (1)
huggingface-transformers (1)
distributed-optimizers (1)
3d-parallelism (1)
easy-to-use (1)
batch-training (1)
online-training (1)