Evaluation code and pre-trained models for the WACV 2024 paper "Improving the Fairness of the Min-Max Game in GANs Training"
We provide pre-trained IGGAN models on the CIFAR-10 and CIFAR-100 datasets so that the results reported in the paper can be reproduced. The code in this module builds on the test code of DiffAug-GAN [link] and NDA-GAN [link].
PyTorch==1.7.1.
TensorFlow 1.14 or 1.15 with GPU support (required for the IS and FID calculations).
Other Python libraries listed in DiffAug-GAN [link] and NDA-GAN [link].
Pre-trained IGGAN on Unconditional CIFAR-10 dataset [link]
Pre-trained IGGAN on Conditional CIFAR-10 dataset [link]
Pre-trained IGGAN on Conditional 10% CIFAR-10 dataset [link]
Pre-trained IGGAN on Unconditional CIFAR-100 dataset [link]
Pre-trained IGGAN on Conditional CIFAR-100 dataset [link]
Pre-trained IGGAN on Conditional 10% CIFAR-100 dataset [link]
To evaluate a pre-trained IGGAN model on unconditional CIFAR-10, run the following command:
python eval.py --dataset=C10U --network='./pretrained/IGGAN_C10U.pth'
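For reference, the FID reported by such evaluation scripts is the Fréchet distance between Gaussian fits of Inception features from real and generated images. Below is a minimal, self-contained sketch of the final distance computation only (the function name `frechet_distance` and the toy statistics are illustrative, not part of this repository's code):

```python
import numpy as np
from scipy import linalg

def frechet_distance(mu1, sigma1, mu2, sigma2):
    """Frechet distance between N(mu1, sigma1) and N(mu2, sigma2):
    ||mu1 - mu2||^2 + Tr(sigma1 + sigma2 - 2 * sqrtm(sigma1 @ sigma2))."""
    diff = mu1 - mu2
    covmean, _ = linalg.sqrtm(sigma1 @ sigma2, disp=False)
    covmean = covmean.real  # drop tiny imaginary parts from numerical error
    return float(diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean))

# Toy statistics standing in for Inception features of two image sets.
rng = np.random.default_rng(0)
feats_a = rng.normal(size=(500, 8))
feats_b = rng.normal(loc=0.5, size=(500, 8))
mu_a, sig_a = feats_a.mean(0), np.cov(feats_a, rowvar=False)
mu_b, sig_b = feats_b.mean(0), np.cov(feats_b, rowvar=False)
print(frechet_distance(mu_a, sig_a, mu_a, sig_a))  # identical stats, approximately 0
print(frechet_distance(mu_a, sig_a, mu_b, sig_b))  # positive for differing stats
```

In practice the actual scores come from the TensorFlow-based IS/FID code mentioned in the requirements above; this sketch only illustrates the distance formula.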
@inproceedings{zhang2024improving,
title={Improving the Fairness of the Min-Max Game in GANs Training},
author={Zhang, Zhaoyu and Hua, Yang and Wang, Hui and McLoone, Se{\'a}n},
booktitle={Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision},
pages={2910--2919},
year={2024}
}