
Comments (6) on gan-compression

lmxyy commented on August 16, 2024

> Thanks for your reply! By the way, should I run distill.py and then train_supernet.py, or can I just run train_supernet.py after training the teacher model? And what is the difference between the two processes?

> I think the distillation will work, but you may need to add some code to specify the distillation layers for the MobileNet or UNet architecture.

To reproduce our results, you need to run distill.py and then train_supernet.py. distill.py makes the model compact before the "once-for-all" network training.
In fact, you could run train_supernet.py directly after training a teacher model if you change the candidate subnet set in resnet_configs. We will release a lite pipeline later that skips the distillation step.
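
To make the difference between the two stages concrete, here is a minimal sketch (not the repository's actual code) of what each script's training step does, assuming a generic PyTorch teacher/student setup; `set_active_channels` and `configurable_layers` are hypothetical names used only for illustration.

```python
import random
import torch

def distill_step(teacher, student, x, criterion):
    # Stage 1 (distill.py): compress the teacher into a compact student of
    # fixed width by matching its outputs (and, in practice, intermediate
    # activations as well).
    with torch.no_grad():
        t_out = teacher(x)
    s_out = student(x)
    return criterion(s_out, t_out)

def supernet_step(teacher, supernet, x, criterion, channel_choices):
    # Stage 2 (train_supernet.py): "once-for-all" training. Each step samples
    # one sub-network (a channel configuration from the candidate set, cf.
    # resnet_configs) and distills the teacher into it, so that every subnet
    # in the search space gets trained.
    config = [random.choice(channel_choices) for _ in supernet.configurable_layers]
    supernet.set_active_channels(config)  # hypothetical API for illustration
    with torch.no_grad():
        t_out = teacher(x)
    s_out = supernet(x)
    return criterion(s_out, t_out)
```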


lmxyy commented on August 16, 2024

Sorry, I do not quite understand your question. Currently, our codebase only supports the ResNet generator, because the generators in pix2pix and CycleGAN use the ResNet architecture (on pix2pix, we observe that the ResNet generator is better than the UNet generator; see "Network architecture for pix2pix" in Appendix 6.2). We do not modify the backbone of the original generator.


leonardodora commented on August 16, 2024

> Sorry, I do not quite understand your question. Currently, our codebase only supports the ResNet generator, because the generators in pix2pix and CycleGAN use the ResNet architecture (on pix2pix, we observe that the ResNet generator is better than the UNet generator; see "Network architecture for pix2pix" in Appendix 6.2). We do not modify the backbone of the original generator.

Thanks for your reply!
I want to know whether your codebase is compatible with different architectures. That is, if I train my teacher model with a MobileNet or UNet generator instead of ResNet, would the distillation still work?


lmxyy commented on August 16, 2024

I think the distillation will work, but you may need to add some code to specify the distillation layers for the MobileNet or UNet architecture.
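
As a starting point for a different backbone, one simple approach is to enumerate the generator's submodule names and pick a few evenly spaced intermediate blocks to distill from. A minimal sketch, assuming a generic torch.nn.Module generator (not any class from the repository):

```python
import torch.nn as nn

def list_candidate_layers(generator: nn.Module):
    """Print submodule names; the chosen names become the distillation points."""
    for name, module in generator.named_modules():
        if name:  # skip the root module's empty name
            print(name, module.__class__.__name__)
```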


junyanz commented on August 16, 2024

I think mapping_layers defines the distillation layers (see this line). We should try to turn it into a flag and make it applicable to other architectures as well.
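
To make the mapping_layers idea concrete, the sketch below shows one way feature distillation over named submodules can be wired up in PyTorch. The ResNet layer names are an assumption based on this discussion, and the hook/projection plumbing is illustrative, not gan-compression's actual implementation.

```python
import torch
import torch.nn as nn

# Per-architecture distillation points, keyed by submodule name (the flag
# junyanz suggests). The ResNet names below are an assumption; for
# MobileNet/UNet you would list your own after inspecting named_modules().
MAPPING_LAYERS = {
    'resnet': ['model.9', 'model.12', 'model.15', 'model.18'],
    'unet': [],  # to be filled in for your architecture
}

def register_activation_hooks(net, layer_names):
    """Capture intermediate activations from the named submodules."""
    acts = {}
    for name, module in net.named_modules():
        if name in layer_names:
            def hook(mod, inp, out, key=name):
                acts[key] = out
            module.register_forward_hook(hook)
    return acts

def feature_distill_loss(t_acts, s_acts, projections):
    # projections: per-layer 1x1 convs mapping student channels to teacher
    # channels so that the per-layer MSE is well-defined.
    loss = 0.0
    for key, proj in projections.items():
        loss = loss + nn.functional.mse_loss(proj(s_acts[key]), t_acts[key])
    return loss
```

Exposing the layer list through a command-line flag would then let the same distiller be reused across backbones.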


leonardodora commented on August 16, 2024

Thanks for your reply! By the way, should I run distill.py and then train_supernet.py, or can I just run train_supernet.py after training the teacher model? And what is the difference between the two processes?

> I think the distillation will work, but you may need to add some code to specify the distillation layers for the MobileNet or UNet architecture.

