Topic: data-parallelism Goto Github
Something interesting about data-parallelism
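As context for the repositories below: data parallelism splits a dataset into shards, runs the same computation on each shard in parallel, and then combines (reduces) the partial results. A minimal stdlib-only Python sketch of this map/reduce pattern (the shard count and workload are illustrative, not from any listed project):

```python
from multiprocessing import Pool

def partial_sum(chunk):
    # Every worker runs the same computation on its own shard of the data.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000))
    # Scatter: split the data into 4 shards, one per worker process.
    chunks = [data[i::4] for i in range(4)]
    with Pool(4) as pool:
        partials = pool.map(partial_sum, chunks)
    # Reduce: combine the per-worker partial results.
    total = sum(partials)
    print(total)  # same result as the sequential sum of squares
```

The same scatter/compute/reduce shape underlies the distributed-training frameworks listed here; they differ mainly in how the reduce step (e.g. gradient aggregation) is implemented and communicated.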
data-parallelism,Easy Parallel Library (EPL) is a general and efficient deep learning framework for distributed model training.
Organization: alibaba
data-parallelism,Fast and easy distributed model training examples.
Organization: alibabapai
data-parallelism,The project focuses on parallelising pre-processing, measurement, and machine learning in the cloud, as well as the evaluation and analysis of cloud performance.
User: anvesham
data-parallelism,SIMD multithreaded Monte Carlo options pricer in Rust 🦀
User: ashayp22
data-parallelism,Sequential and Parallel Implementation of the Hodgkin-Huxley Neuron model.
User: axr6077
data-parallelism,A complex ray-tracing algorithm optimized through parallelization over different partitioning schemes, exploring the performance gains from grain size and number of processing units relative to the sequential algorithm when rendering a high-resolution image.
User: axr6077
data-parallelism,Distributed Deep Learning, with a focus on distributed training, using Keras and Apache Spark.
Organization: cerndb
Home Page: http://joerihermans.com/work/distributed-keras/
data-parallelism,This repository provides hands-on labs on PyTorch-based distributed training and SageMaker Distributed Training. It is written so beginners can get started easily and guides you step by step through code modifications based on the most basic BERT use cases.
User: daekeun-ml
data-parallelism,Distributed Keras engine: make Keras faster with only one line of code.
Organization: dkeras-project
data-parallelism,A decentralized and distributed framework for training DNNs
Organization: dscpesu
data-parallelism,Example of Distributed pyTorch
User: eunjuyang
data-parallelism,pipeDejavu: Hardware-aware Latency Predictable, Differentiable Search for Faster Config and Convergence of Distributed ML Pipeline Parallelism
User: explcre
data-parallelism,Distributed training (multi-node) of a Transformer model
User: hkproj
Home Page: https://www.youtube.com/watch?v=toUSzwR0EV8
data-parallelism,Making large AI models cheaper, faster and more accessible
Organization: hpcaitech
Home Page: https://www.colossalai.org
data-parallelism,OpenCL powered Merklization using BLAKE3
User: itzmeanjan
data-parallelism,A fully distributed hyperparameter optimization tool for PyTorch DNNs
User: joelrorseth
data-parallelism,Multi-GPU training for Keras
User: kuixu
data-parallelism,Development of Project HPGO | Hybrid Parallelism Global Orchestration
User: ler0ever
Home Page: https://HPGO.rongyi.io
data-parallelism,DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Organization: microsoft
Home Page: https://www.deepspeed.ai/
data-parallelism,A state-of-the-art multithreading runtime: message-passing based, fast, scalable, ultra-low overhead
User: mratsim
data-parallelism,Single-node data parallelism in Julia with CUDA
Organization: murrellgroup
data-parallelism,A C#-based download manager that uses task-based programming with data parallelism and the Task Parallel Library for scheduling, controlling, and managing tasks.
User: nadeemfazloon
data-parallelism,Understanding the effects of data parallelism and sparsity on neural network training
User: namhoonlee
data-parallelism,MapReduceSimulator for Scheduling and Provisioning Algorithms
Organization: ncl-teu
data-parallelism,SC23 Deep Learning at Scale Tutorial Material
Organization: nersc
data-parallelism,Torch Automatic Distributed Neural Network (TorchAD-NN) training library. Built on top of TorchMPI, this module automatically parallelizes neural network training.
User: ngrabaskas
data-parallelism,Scaling Unet in Pytorch
User: oekosheri
data-parallelism,Scaling Unet in Tensorflow
User: oekosheri
data-parallelism,LiBai(李白): A Toolbox for Large-Scale Distributed Parallel Training
Organization: oneflow-inc
Home Page: https://libai.readthedocs.io
data-parallelism,Official Repository for the paper: Distributing Deep Learning Hyperparameter Tuning for 3D Medical Image Segmentation
User: oriolaranda
data-parallelism,PaddlePaddle large-model development suite, providing a full-pipeline development toolchain for large language models, cross-modal large models, biocomputing large models, and other domains.
Organization: paddlepaddle
data-parallelism,A mostly POSIX-compliant utility that scans a given interval for vampire numbers.
User: plerros
data-parallelism,WIP. Veloce is a low-code Ray-based parallelization library that makes machine learning computation novel, efficient, and heterogeneous.
User: ryantd
data-parallelism,The project uses OpenMP to parallelize work over a large dataset, leveraging multicore processor architectures to execute code sections concurrently and improve performance and scalability for efficient database processing.
User: shruthimohan03
data-parallelism,Batch Partitioning for Multi-PE Inference with TVM (2020)
User: sjlee25
data-parallelism,Data parallel and stream parallel skeletons implemented in erlang
User: sre990
Home Page: https://ske-pi.sourceforge.io/
data-parallelism,Binary data classification using TensorFlow and Keras in Python, achieving data parallelism with MPI.
User: sujith013
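Several entries above (MPI-based training, Horovod, TorchAD-NN) rely on the same core idea: each worker computes gradients on its local data shard, and an allreduce averages them so every replica applies an identical update. A toy stdlib-only Python sketch of that pattern, using process workers in place of MPI ranks and a quadratic loss in place of a neural network (all names and values are illustrative):

```python
from multiprocessing import Pool

def local_gradient(args):
    # Per-worker step: gradient of the mean loss (w - t)^2 over this
    # worker's local shard of target values.
    w, shard = args
    return sum(2 * (w - t) for t in shard) / len(shard)

if __name__ == "__main__":
    targets = [1.0, 2.0, 3.0, 4.0]           # toy dataset
    shards = [targets[0::2], targets[1::2]]  # scatter: one shard per worker
    w = 0.0
    with Pool(2) as pool:
        for _ in range(50):
            grads = pool.map(local_gradient, [(w, s) for s in shards])
            g = sum(grads) / len(grads)      # "allreduce": average gradients
            w -= 0.1 * g                     # identical update on every replica
    print(round(w, 3))                       # converges to the data mean, 2.5
```

In real MPI-based training the `pool.map` plus Python-side averaging is replaced by a collective such as `MPI_Allreduce`, so the averaged gradient is computed and delivered to all ranks without a central coordinator.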
data-parallelism,CUDA C parallel implementations of some well-known algorithms.
User: t0re199
data-parallelism,CUDA C parallel implementation of the Merge operation.
User: t0re199
data-parallelism,Towards Rehearsal-based Continual Learning at Scale: distributed CL with Horovod + PyTorch
User: thomas-bouvier
data-parallelism,Orkhon: ML Inference Framework and Server Runtime
User: vertexclique
data-parallelism,Ternary Gradients to Reduce Communication in Distributed Deep Learning (TensorFlow)
User: wenwei202
data-parallelism,Large scale 4D parallelism pre-training for 🤗 transformers in Mixture of Experts *(still work in progress)*
User: xrsrke
data-parallelism,:coffee:Implementation of parallel matrix multiplication methods using the Fox algorithm on Peking University's high-performance computing system
User: yangyang14641
data-parallelism,Dependence-Based Code Transformation for Coarse-Grained Parallelism
User: zbjob