Topic: sparsity (Goto Github)
Interesting repositories about sparsity.
sparsity,Attention-Based Guided Structured Sparsity of Deep Neural Networks
User: astorfi
sparsity,Sparse Optimisation Research Code
User: bwohlberg
Home Page: http://brendt.wohlberg.net/software/SPORCO/
sparsity,MoMA: Modern Multivariate Analysis in R
Organization: dataslingers
Home Page: https://DataSlingers.github.io/MoMA
sparsity,Always sparse. Never dense. But never say never. A Sparse Training repository for the Adaptive Sparse Connectivity concept and its algorithmic instantiation, i.e. Sparse Evolutionary Training, to boost Deep Learning scalability on various aspects (e.g. memory and computational time efficiency, representation and generalization power).
User: dcmocanu
Home Page: https://www.nature.com/articles/s41467-018-04316-3
sparsity,Official PyTorch (Lightning) implementation of the NeurIPS 2020 paper "Efficient Marginalization of Discrete and Structured Latent Variables via Sparsity".
Organization: deep-spin
sparsity,[ICLR 2023] Pruning Deep Neural Networks from a Sparsity Perspective
User: diaoenmao
sparsity,Network Slimming (Pytorch) (ICCV 2017)
User: eric-mingjie
sparsity,[NeurIPS'23] H2O: Heavy-Hitter Oracle for Efficient Generative Inference of Large Language Models.
Organization: fminference
sparsity,Model compression and optimization for deployment in PyTorch, including knowledge distillation, quantization, and pruning.
User: huangcongqing
sparsity,SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
Organization: intel
Home Page: https://intel.github.io/neural-compressor/
sparsity,An innovative library for efficient LLM inference via low-bit quantization
Organization: intel
Home Page: https://github.com/intel/neural-speed
sparsity,Learning both Weights and Connections for Efficient Neural Networks https://arxiv.org/abs/1506.02626
User: jack-willturner
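The entry above points at the classic magnitude-based pruning idea from "Learning both Weights and Connections for Efficient Neural Networks": remove the smallest-magnitude weights and keep a binary mask. A minimal numpy sketch of that core step (the function name, toy matrix, and 50% sparsity target are illustrative, not taken from the repository):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries of `weights` so that a
    `sparsity` fraction of them become exactly zero; return the pruned
    weights and the surviving-connection mask."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)          # number of weights to remove
    if k == 0:
        return weights.copy(), np.ones_like(weights, dtype=bool)
    # k-th smallest magnitude acts as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask
```

In the paper's pipeline this prune step alternates with retraining the surviving weights; the sketch covers only the masking.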
sparsity,[ICCV2023 Official PyTorch code] for Iterative Soft Shrinkage Learning for Efficient Image Super-Resolution
User: jiamian-wang
sparsity,Official PyTorch code for "APP: Anytime Progressive Pruning" (DyNN @ ICML 2022; CLL @ ACML 2022; SNN @ ICML 2022; SlowDNN 2023)
Organization: landskape-ai
Home Page: https://arxiv.org/abs/2204.01640
sparsity,A research library for pytorch-based neural network pruning, compression, and more.
User: lucaslie
Home Page: https://people.csail.mit.edu/lucasl/
sparsity,Official Pytorch Implementation of "Outlier Weighed Layerwise Sparsity (OWL): A Missing Secret Sauce for Pruning LLMs to High Sparsity"
User: luuyin
Home Page: https://arxiv.org/pdf/2310.05175.pdf
sparsity,Reference ImageNet implementation of SelecSLS CNN architecture proposed in the SIGGRAPH 2020 paper "XNect: Real-time Multi-Person 3D Motion Capture with a Single RGB Camera". The repository also includes code for pruning the model based on implicit sparsity emerging from adaptive gradient descent methods, as detailed in the CVPR 2019 paper "On implicit filter level sparsity in Convolutional Neural Networks".
User: mehtadushy
sparsity,[Preprint] Why is the State of Neural Network Pruning so Confusing? On the Fairness, Comparison Setup, and Trainability in Network Pruning
User: mingsun-tse
sparsity,Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models
Organization: neuralmagic
sparsity,Zero-label image classification via OpenCLIP knowledge distillation
Organization: nvidia-ai-iot
sparsity,Iterative hard thresholding for l0 penalized regression
Organization: openmendel
sparsity,Neural Network Compression Framework for enhanced OpenVINO™ inference
Organization: openvinotoolkit
sparsity,PaddleSlim is an open-source library for deep model compression and architecture search.
Organization: paddlepaddle
Home Page: https://paddleslim.readthedocs.io/zh_CN/latest/
sparsity,Implementation of the Sparse ICP algorithm
User: palanglois
sparsity,Robust Single Sample Face Recognition by Sparsity-Driven Sub-Dictionary Learning Using Deep Features
Organization: phuselab
Home Page: https://www.mdpi.com/1424-8220/19/1/146
sparsity,MATLAB implementation of gait cycle validation and segmentation using inertial sensors.
User: prateekgv
sparsity,A Library for Denoising Single-Cell Data with Random Matrix Theory
Organization: rabadanlab
sparsity,Soft Threshold Weight Reparameterization for Learnable Sparsity
Organization: raivnlab
Home Page: https://homes.cs.washington.edu/~kusupati/#Kusupati20
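The STR entry above builds sparsity around the soft-thresholding operator, applied to weights with a learnable per-layer threshold. A minimal numpy sketch of just the operator (the learnable-threshold reparameterization from the paper is not shown; the function name and values are illustrative):

```python
import numpy as np

def soft_threshold(w, t):
    """S_t(w) = sign(w) * max(|w| - t, 0): shrink every entry toward zero
    and exactly zero any weight whose magnitude falls below t."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)
```

In STR the threshold t is itself a trainable parameter (passed through a sigmoid), so the sparsity level of each layer is learned rather than hand-set.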
sparsity,Codes and data coming with article "A Survey and an Extensive Evaluation of Popular Audio Declipping Methods", and others closely related
User: rajmic
sparsity,WIP. Veloce is a low-code Ray-based parallelization library that makes machine learning computation novel, efficient, and heterogeneous.
User: ryantd
sparsity,Model Compression/Inference Made Easy
User: satabios
Home Page: https://sconce.readthedocs.io/en/latest/
sparsity,[ICML 2023] UPop: Unified and Progressive Pruning for Compressing Vision-Language Transformers.
User: sdc17
Home Page: https://dachuanshi.com/UPop-Project/
sparsity,Contains a wide-ranging collection of compressed sensing and feature selection algorithms. Examples include matching pursuit algorithms, forward and backward stepwise regression, sparse Bayesian learning, and basis pursuit.
User: sebastianament
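Among the algorithm families the collection above covers, matching pursuit is the simplest: greedily pick the dictionary atom most correlated with the residual and subtract its contribution. A hedged numpy sketch (not the repository's API; function name and toy dictionary are illustrative, and columns of D are assumed unit-norm):

```python
import numpy as np

def matching_pursuit(D, y, n_atoms):
    """Greedy matching pursuit: at each step select the column of D most
    correlated with the current residual, record its coefficient, and
    remove its contribution from the residual."""
    residual = np.asarray(y, dtype=float).copy()
    coef = np.zeros(D.shape[1])
    for _ in range(n_atoms):
        corr = D.T @ residual              # correlation with each atom
        j = np.argmax(np.abs(corr))        # best-matching atom
        coef[j] += corr[j]
        residual -= corr[j] * D[:, j]
    return coef, residual
```

Orthogonal matching pursuit, also in the collection, additionally re-fits all selected coefficients by least squares at every step.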
sparsity,A package for AFM image reconstruction and compressed sensing in general
Organization: sip-aau
Home Page: http://magni.readthedocs.io
sparsity,A toolkit to optimize ML models for deployment for Keras and TensorFlow, including quantization and pruning.
Organization: tensorflow
Home Page: https://www.tensorflow.org/model_optimization
sparsity,[CVPR 2021] Exploring Sparsity in Image Super-Resolution for Efficient Inference
Organization: the-learning-and-vision-atelier-lava
sparsity,Ordered Weighted L1 regularization for classification and regression in Python
User: vene
sparsity,Sparse and structured neural attention mechanisms
User: vene
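The sparse-attention repository above centers on sparsemax, a softmax alternative that projects the scores onto the probability simplex and can assign exact zeros. A minimal numpy sketch of the closed-form projection (illustrative, not the repository's implementation):

```python
import numpy as np

def sparsemax(z):
    """Sparsemax: Euclidean projection of z onto the probability simplex.
    Unlike softmax, low-scoring entries receive probability exactly zero."""
    z = np.asarray(z, dtype=float)
    z_sorted = np.sort(z)[::-1]                 # descending
    k = np.arange(1, z.size + 1)
    cum = np.cumsum(z_sorted)
    support = 1 + k * z_sorted > cum            # entries kept in the support
    k_max = k[support][-1]                      # support size
    tau = (cum[k_max - 1] - 1) / k_max          # threshold
    return np.maximum(z - tau, 0.0)
```

The output always sums to one, so it drops into attention mechanisms wherever softmax would go, yielding sparse attention maps.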
sparsity,[CVPR 2022] "Quarantine: Sparsity Can Uncover the Trojan Attack Trigger for Free" by Tianlong Chen*, Zhenyu Zhang*, Yihua Zhang*, Shiyu Chang, Sijia Liu, and Zhangyang Wang
Organization: vita-group
sparsity,[ICLR 2023] "Sparsity May Cry: Let Us Fail (Current) Sparse Neural Networks Together!" Shiwei Liu, Tianlong Chen, Zhenyu Zhang, Xuxi Chen, Tianjin Huang, Ajay Kumar Jaiswal, Zhangyang Wang
Organization: vita-group
Home Page: https://openreview.net/forum?id=J6F3lLg4Kdp
sparsity,[ICLR 2022] "Sparsity Winning Twice: Better Robust Generalization from More Efficient Training" by Tianlong Chen*, Zhenyu Zhang*, Pengjun Wang*, Santosh Balachandra*, Haoyu Ma*, Zehao Wang, Zhangyang Wang
Organization: vita-group
sparsity,[ICML 2022] Training Your Sparse Neural Network Better with Any Mask. Ajay Jaiswal, Haoyu Ma, Tianlong Chen, Ying Ding, and Zhangyang Wang
Organization: vita-group
sparsity,Image denoising code using STROLLR learning; MATLAB implementation of the ICASSP 2017 paper
User: wenbihan
sparsity,Caffe for Sparse and Low-rank Deep Neural Networks
User: wenwei202
sparsity,Sparse Recurrent Neural Networks -- Pruning Connections and Hidden Sizes (TensorFlow)
User: wenwei202
sparsity,[Machine Learning Journal (ECML-PKDD 2022 journal track)] Quick and Robust Feature Selection: the Strength of Energy-efficient Sparse Training for Autoencoders
User: zahraatashgahi
Home Page: https://arxiv.org/abs/2012.00560