
pruning's Introduction

GAN-pruning

A PyTorch implementation of our ICCV 2019 paper, Co-Evolutionary Compression for Unpaired Image Translation, which proposes a co-evolutionary approach for simultaneously reducing the memory usage and FLOPs of generators on image-to-image translation tasks while maintaining their performance.

Performance

Performance on Cityscapes compared with a conventional pruning method:

SCOP

A PyTorch implementation of our NeurIPS 2020 paper, SCOP: Scientific Control for Reliable Neural Network Pruning, which proposes a reliable neural network pruning algorithm by setting up a scientific control.

Performance

Comparison of the pruned networks with different methods on ImageNet.

ManiDP

A PyTorch implementation of our CVPR 2021 paper, Manifold Regularized Dynamic Network Pruning, which proposes a dynamic pruning paradigm that maximally excavates the network redundancy specific to each input instance.

Performance

Comparison of the pruned networks with different methods on ImageNet.

pruning's People

Contributors

yehuitang


pruning's Issues

ManiDP loss is nan

No matter how I train, the accuracy stays at 10% and the loss is always NaN. I tried training from scratch and still got the same result. Why does this happen?

Hardware setting and GPU hours of ManiDP

Hi, I'm a researcher working on lightweight CNNs, and I'm building a filter-pruning leaderboard for better comparison.
I think the GPU hours spent on search and training are important for evaluating pruning algorithms, but they are not reported.
Would you let me know the GPU hours for each ImageNet configuration?

Pre-trained model in SCOP

Hello. How can I get the ResNet-56 model pre-trained on CIFAR-10 that is used in the SCOP paper? I was not able to find it on PyTorch Hub.

Could `pred_fake = netD_B(fake_B.detach())` be negative

Hello, I was running search.py, and the value of pred_fake = netD_B(fake_B.detach()) was positive on Windows 10 but negative on Ubuntu.
Can pred_fake be negative? Is that normal?
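For context: in CycleGAN-style code the discriminator's final layer is typically a plain convolution with no sigmoid, so its raw output is unbounded and negative values are normal; under an LSGAN-style objective only the squared distance to the target label matters. A minimal numpy sketch of that arithmetic (the values are hypothetical, not from the repo):

```python
import numpy as np

# Hypothetical raw PatchGAN outputs (no sigmoid applied):
# negative entries are perfectly normal.
pred_fake = np.array([[-0.3, 0.8], [1.2, -1.5]])

# An LSGAN-style generator loss pushes raw outputs toward the "real" label 1.0.
target_real = 1.0
loss_g = np.mean((pred_fake - target_real) ** 2)

# The loss itself is non-negative even though some predictions are negative.
print(loss_g)  # 2.005
```

So a sign difference between platforms in the raw discriminator output is not by itself a problem, as long as the losses behave sensibly.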

About training using resnet18 with imagenet

Hi. Thanks for your work. I am currently running your training code on ImageNet. It runs, but when validate_ontrain is called:

train_acc_1, train_los = validate_ontrain(train_loader, net, criterion, log)

the loss becomes enormous.

The loss returns to normal after the first epoch, but the displayed remain rate stays large even after training for hundreds of epochs.

Is this normal? I have loaded the pretrained model.

Details on calculating FID score

Hi,
Could you please provide more information on how you calculated the FID score? For example, which code did you use, and how many images were used to compute it?
Thanks in advance.
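For reference (the paper does not specify this, so the following is an assumption about common practice): FID is usually computed from Inception-v3 pool3 activations, e.g. with the pytorch-fid package over the full test split. The underlying Fréchet distance itself can be sketched with numpy/scipy, assuming the activation means `mu` and covariances `sigma` have already been extracted:

```python
import numpy as np
from scipy import linalg

def fid(mu1, sigma1, mu2, sigma2):
    """Frechet distance between two Gaussians fitted to Inception activations."""
    diff = mu1 - mu2
    # Matrix square root of the covariance product; keep only the real part
    # to discard tiny imaginary components caused by numerical error.
    covmean = linalg.sqrtm(sigma1 @ sigma2)
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    return diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean)

# Identical distributions give FID = 0.
mu = np.zeros(4)
sigma = np.eye(4)
print(fid(mu, sigma, mu, sigma))  # ~0.0
```

Since FID depends on the number of images and the feature extractor weights, both should be fixed when comparing against reported numbers.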

Knockoff Generator for Custom Dataset

Hello,
Thank you for your work. It looks amazing and I'd like to try it on some custom dataset.
Could you please share the script for training the knockoff generator on ImageNet or on another dataset?

ManiDP does not work

After running the command:
python ManiDP/main.py --ngpu=1 --arch=dyresnet56 --dataset=cifar10 --target_remain_rate=0.6 --lambda_lasso 5e-3 --lambda_graph 1.0 --pretrain_path='ManiDP/pretrain_path/' --data_path='...'
the loss is NaN.
I think this error is caused by the lasso loss. Can you give some advice? Thanks.

Pretrained dense model and pruned model

Hi,

I'm really interested in your work. Could you please release the pretrained dense model and the pruned model? I'm not sure whether your training setting is the same as the default one in the code. Thanks!

About ManiDP: the loss is NaN

I have found that some entries of the mask grow extremely large and become NaN when certain samples pass through the MaskBlock, even though clamp_max is set to 1000.
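One thing worth checking here (a hedged sketch, not the repo's actual code): clamping alone does not remove NaN, because NaN propagates through np.clip / torch.clamp unchanged, so non-finite entries have to be replaced explicitly. Illustrated with numpy; the function name and values are hypothetical:

```python
import numpy as np

def safe_mask(gate_out, clamp_max=1000.0):
    """Hypothetical guard: clamp a channel gate and zero out non-finite entries.

    clamp_max alone does not stop NaN, because NaN survives clipping;
    non-finite values must be replaced explicitly.
    """
    mask = np.clip(gate_out, 0.0, clamp_max)
    mask = np.where(np.isfinite(mask), mask, 0.0)
    return mask

gate = np.array([0.5, np.nan, 2e6, -1.0])
print(safe_mask(gate))  # [0.5, 0.0, 1000.0, 0.0]
```

If the NaN originates earlier (e.g. in the lasso term's gradient), gradient clipping or a smaller lambda_lasso may be the actual fix.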

No acceleration for ManiDP

Hello. Thanks for your work. I have some confusion about how ManiDP achieves acceleration.
I found that the convolution is forwarded without masks:

out = self.conv1(x)

The mask only takes effect after the convolution, which simply multiplies the result:
out = out * mask1.unsqueeze(-1).unsqueeze(-1)

In this situation, the FLOPs will always be the sum of the original model and the gate module, which is slower than the original model alone.
So how is the FLOPs reduction calculated? Can you provide the code for the FLOPs evaluation?
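For what it's worth, dynamic-pruning papers usually report *theoretical* FLOPs: for each input, only the channels whose mask is non-zero are counted, even if the released implementation multiplies the mask after an unmasked convolution. A small sketch of that accounting (the layer sizes are hypothetical):

```python
def conv_flops(c_in, c_out, k, h, w):
    """Multiply-accumulate count of a k x k convolution over an h x w output map."""
    return c_in * c_out * k * k * h * w

# Full layer: 64 -> 64 channels, 3x3 kernel, 32x32 feature map.
full = conv_flops(64, 64, 3, 32, 32)

# Dynamic pruning: suppose the gate keeps 40 of the 64 output channels
# for this particular input; only those channels are counted.
kept = conv_flops(64, 40, 3, 32, 32)

print(kept / full)  # 0.625 -- the per-input remain rate
```

Realizing this theoretical saving in wall-clock time would require gathering the active channels before the convolution, which is a separate engineering question from the FLOPs metric.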

About pre-trained-prune models

Excuse me, could you release the pruned pre-trained models for the different values of the hyper-parameter γ (0.1, 1, 10), such as netG_A2B_prune_200.pth and netG_B2A_prune_200.pth? I want to study them, but training on my computer would take a long time. Thank you!

About pre-train model

Hello. I am using the ManiDP code and want to train on CIFAR-10, but I don't know where to find the pre-trained model. Could you provide the pre-trained model, or the code for the original training?

train errors

When I run search.py, the error shown in the attached screenshot appears. What should I set to fix it?
