Topic: distributed-deep-learning Goto Github
Something interesting about distributed-deep-learning
distributed-deep-learning,This repository contains the implementation of a wide variety of Deep Learning Projects in different applications of computer vision, NLP, federated, and distributed learning. These projects include university projects and projects implemented due to interest in Deep Learning.
User: amirhosein-mesbah
distributed-deep-learning,Distributed Deep Reinforcement Learning for Large-Scale Robotic Simulations
User: amrmkayid
distributed-deep-learning,Yelp review classification using a CNN model with Horovod on an HPC cluster
User: bilalsp
distributed-deep-learning,Simultaneous Multi-Party Learning Framework
User: ch3njust1n
distributed-deep-learning,TensorFlow (1.8+) Datasets, Feature Columns, Estimators and Distributed Training using Google Cloud Machine Learning Engine
User: christianramsey
Home Page: http://dyadxmachina.com
distributed-deep-learning,Java-based Convolutional Neural Network package running on the Apache Spark framework
Organization: deepspark
distributed-deep-learning,Distributed Keras engine: make Keras faster with only one line of code.
Organization: dkeras-project
distributed-deep-learning,Learn applied deep learning from zero to deployment using TensorFlow 1.8+
Organization: dyadxmachina
distributed-deep-learning,SHUKUN Technology Co., Ltd. algorithm internship (2020/12-2021/5): multi-GPU, multi-node training for deep learning models with Horovod and the NVIDIA Clara Train SDK, including configuration tutorials and performance testing.
User: explcre
distributed-deep-learning,Prediction of the Resource Consumption of Distributed Deep Learning Systems
User: gsyang33
distributed-deep-learning,sensAI: ConvNets Decomposition via Class Parallelism for Fast Inference on Live Data
User: guanhuawang
Home Page: https://rise.cs.berkeley.edu/projects/sensai/
distributed-deep-learning,Distributed Tensorflow, Keras and PyTorch on Apache Spark/Flink & Ray
Organization: intel-analytics
Home Page: https://analytics-zoo.readthedocs.io/
distributed-deep-learning,Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, Baichuan, Mixtral, Gemma, Phi, etc.) on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max); seamlessly integrate with llama.cpp, Ollama, HuggingFace, LangChain, LlamaIndex, DeepSpeed, vLLM, FastChat, Axolotl, etc.
Organization: intel-analytics
Home Page: https://ipex-llm.readthedocs.io
distributed-deep-learning,Intel® End-to-End AI Optimization Kit
Organization: intel
distributed-deep-learning,Distributed deep learning framework based on PyTorch/Numba/NCCL and ZeroMQ.
User: lancelee82
distributed-deep-learning,MNIST training using Caffe and OpenMPI
User: pierric
distributed-deep-learning,SHADE: Enable Fundamental Cacheability for Distributed Deep Learning Training
User: r-i-s-khan
Home Page: https://www.usenix.org/conference/fast23/presentation/khan
distributed-deep-learning,Scalable NLP model fine-tuning and batch inference with Ray and Anyscale
Organization: ray-project
Home Page: https://partners.awscloud.com/MLOps--Distributed-Training-Workshops.html
distributed-deep-learning,RocketML Deep Neural Networks
Organization: rocketmlhq
Home Page: https://rocketmlhq.github.io/rmldnn/
distributed-deep-learning,Chimera: Efficiently Training Large-Scale Neural Networks with Bidirectional Pipelines.
User: shigangli
distributed-deep-learning,Eager-SGD is a decentralized asynchronous SGD variant. It uses novel partial collective operations to accumulate gradients across all processes.
User: shigangli
distributed-deep-learning,Ok-Topk is a scheme for distributed training with sparse gradients. It integrates a novel sparse allreduce algorithm (with less than 6k communication volume, which is asymptotically optimal) with a decentralized parallel Stochastic Gradient Descent (SGD) optimizer, and its convergence is proven both theoretically and empirically.
User: shigangli
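The top-k gradient sparsification that sparse-allreduce schemes like the one above build on can be sketched in a few lines. This is a minimal pure-NumPy illustration, not code from the Ok-Topk repository; the function names are hypothetical:

```python
import numpy as np

def topk_sparsify(grad: np.ndarray, k: int):
    """Keep only the k largest-magnitude gradient entries.

    Returns (indices, values) -- the pairs a sparse allreduce
    would actually communicate instead of the dense gradient.
    """
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # top-k by magnitude
    return idx, flat[idx]

def densify(idx, vals, shape):
    """Rebuild a dense gradient from the sparse (index, value) pairs."""
    out = np.zeros(int(np.prod(shape)))
    out[idx] = vals
    return out.reshape(shape)

grad = np.array([0.1, -2.0, 0.03, 1.5, -0.2, 0.7])
idx, vals = topk_sparsify(grad, k=2)
dense = densify(idx, vals, grad.shape)
# only the two largest-magnitude entries (-2.0 and 1.5) survive
```

Communicating k (index, value) pairs per worker instead of the full gradient is what drives the communication volume down to O(k); the hard part Ok-Topk addresses is doing the reduction of these sparse vectors efficiently across processes.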
distributed-deep-learning,WAGMA-SGD is a decentralized asynchronous SGD based on wait-avoiding group model averaging. Synchronization is relaxed by making the collectives externally triggerable, i.e., a collective can be initiated without requiring that all processes enter it. It partially reduces the data within non-overlapping groups of processes, improving parallel scalability.
User: shigangli
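The group model averaging step that WAGMA-SGD relaxes can be illustrated with a small single-process sketch: workers are partitioned into non-overlapping groups and each group's parameters are averaged within the group only. This is a conceptual NumPy illustration with hypothetical names, not code from the repository (and it omits the wait-avoiding, externally triggered collectives that are the paper's actual contribution):

```python
import numpy as np

def group_average(params: list, group_size: int):
    """Average parameter vectors within non-overlapping worker groups.

    params: one 1-D parameter vector per worker.
    Workers [0..g-1] form group 0, [g..2g-1] form group 1, and so on;
    every worker in a group receives the group's mean parameters.
    """
    out = []
    for start in range(0, len(params), group_size):
        group = params[start:start + group_size]
        mean = np.mean(group, axis=0)
        out.extend(mean.copy() for _ in group)
    return out

# four workers, groups of two
workers = [np.array([0.0, 0.0]), np.array([2.0, 4.0]),
           np.array([1.0, 1.0]), np.array([3.0, 5.0])]
averaged = group_average(workers, group_size=2)
# group 0 converges to [1.0, 2.0]; group 1 to [2.0, 3.0]
```

Because each reduction involves only a group rather than all processes, a group can proceed as soon as its own members are ready, which is the source of the improved parallel scalability the description mentions.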
distributed-deep-learning,Training strategies that alleviate bottlenecks and improve training speed while maintaining GAN output quality.
User: siddhanthiyer-99
distributed-deep-learning,An implementation of a distributed ResNet model for classifying the CIFAR-10 and MNIST datasets.
User: sotheanithsok
distributed-deep-learning,A blockchain-based neural architecture search project.
User: sqaz91819
distributed-deep-learning,Collection of resources for automatic deployment of distributed deep learning jobs on a Kubernetes cluster
User: stefanofioravanzo
distributed-deep-learning,PyTorch Examples for Beginners
User: trilliwon
distributed-deep-learning,Horovod tutorial for PyTorch using NVIDIA Docker.
User: veritas9872
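The data-parallel pattern that Horovod implements can be sketched without the library itself: each worker computes a gradient on its own data shard, the per-worker gradients are averaged (the role of Horovod's ring-allreduce), and every replica applies the identical update. A minimal single-process NumPy simulation for a one-parameter least-squares model, with hypothetical names:

```python
import numpy as np

def simulated_allreduce_step(w, shards, lr=0.1):
    """One data-parallel SGD step for the model y = w * x.

    Each (x, y) shard plays the role of one worker; averaging the
    per-shard gradients stands in for the allreduce collective.
    """
    grads = []
    for x, y in shards:
        pred = w * x
        grads.append(np.mean(2 * (pred - y) * x))  # dL/dw on this shard
    g = np.mean(grads)   # "allreduce": average gradients across workers
    return w - lr * g    # identical update applied on every replica

# two "workers", data generated by the true slope w = 2
shards = [(np.array([1.0, 2.0]), np.array([2.0, 4.0])),
          (np.array([3.0]), np.array([6.0]))]
w = 0.0
for _ in range(200):
    w = simulated_allreduce_step(w, shards)
# w converges toward 2.0, the slope that generated the data
```

In an actual Horovod run the loop body would stay the same on each process; the library replaces the explicit `np.mean(grads)` with an allreduce over the network, which is what the tutorial above walks through.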
distributed-deep-learning,A Portable C Library for Distributed CNN Inference on IoT Edge Clusters
User: zoranzhao