First of all, wonderful implementation of "Progressive Distillation for Fast Sampling of Diffusion Models"!
However, the link to the pre-trained weights in diffusion_distiller (https://cloud.mail.ru/public/mQGz/k1pNzg2ng) cannot be reached from China.
I'm wondering if there is any other way to share these weights, perhaps through Google Drive?
Hello @Hramchenko! Thank you for your work on this repo.
I am trying to implement distillation for the v-parametrization and used your code. I am having an issue with this line, because sigma_s can be very small (e.g. 1e-2), which causes eps_2 to become very large, and thus v_2 as well, even though it should remain close to the [-1, 1] range.
Would you have any insights on that? Did you derive the equations yourself?
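For reference, here is a minimal sketch of what I mean, using the standard v-parametrization identities from the paper (z = alpha*x + sigma*eps, v = alpha*eps - sigma*x, with alpha^2 + sigma^2 = 1). The function names are mine, and the first function is only how I read the line in question:

```python
import torch

def eps_via_division(z, x, sigma, alpha):
    # How I read the line in question: invert z = alpha*x + sigma*eps.
    # If x is an imperfect prediction, the numerator does not shrink
    # with sigma, so eps blows up as sigma_s -> 0 (e.g. sigma_s ~ 1e-2).
    return (z - alpha * x) / sigma

def eps_via_identity(z, v, sigma, alpha):
    # Algebraically equivalent whenever x comes from the same v prediction
    # (x = alpha*z - sigma*v), but with no division by sigma:
    #   (z - alpha*(alpha*z - sigma*v)) / sigma = sigma*z + alpha*v
    return sigma * z + alpha * v

# Sanity check that the two agree for a consistent (x, v) pair:
z, v = torch.randn(4), torch.randn(4)
alpha = torch.tensor(0.99995)
sigma = torch.sqrt(1.0 - alpha ** 2)  # sigma ~ 1e-2
x = alpha * z - sigma * v
assert torch.allclose(eps_via_division(z, x, sigma, alpha),
                      eps_via_identity(z, v, sigma, alpha), atol=1e-3)
```

So my question is essentially whether the division by sigma_s is intended here, or whether an equivalent form without it should be used.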
Hi! Thanks so much for the implementation. I have one question about the quality of the generated samples, though: why do the generated samples look "distorted" in some way? The original model trained here (https://github.com/rosinality/denoising-diffusion-pytorch) does not have this problem. Is it caused by the distiller, or is something wrong with the training of the original model?
Also, I'm interested in the GPU memory and training time required by the distiller. Thanks a lot for your help!