
Elastic-InfoGAN

This repository provides the official PyTorch implementation of Elastic-InfoGAN, which allows disentangling the discrete factors of variation in class-imbalanced data without access to the ground-truth distribution.

Elastic-InfoGAN: Unsupervised Disentangled Representation Learning in Class-Imbalanced Data Utkarsh Ojha, Krishna Kumar Singh, Cho-jui Hsieh, Yong Jae Lee
UC Davis, UCLA, and Adobe Research
In NeurIPS 2020

System requirements

  • Linux
  • Python 2.7
  • NVIDIA GPU + CUDA CuDNN
  • PyTorch 1.3.1
  • Imageio
  • Torchvision 0.4.2
  • Augmentor
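
A quick way to check that the environment matches the versions above (a minimal sketch; only the version numbers come from the list, nothing here is taken from the repository's scripts):

```python
# Environment check: confirm the pinned PyTorch/Torchvision versions and GPU availability.
import torch
import torchvision
import imageio
import Augmentor  # data-augmentation package listed above

print(torch.__version__)          # expected: 1.3.1
print(torchvision.__version__)    # expected: 0.4.2
print(torch.cuda.is_available())  # should be True for CUDA training
```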

Creating the dataset

  • Get the original MNIST dataset from this link. Move it to the ./splits directory.
  • ./splits/50_data_imbalance.npy contains 50 randomly generated class-imbalance configurations.
  • Run bash data.sh to create the 50 imbalanced MNIST datasets, which will be stored in the splits directory.
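
The snippet below is a minimal sketch of how one imbalanced split could be assembled from the imbalance file. It assumes the .npy file stores per-class keep-ratios of shape [50, 10]; that layout is an illustrative assumption, not the actual format consumed by data.sh:

```python
# Sketch: build one imbalanced MNIST subset from an assumed [50, 10] keep-ratio array.
import numpy as np
from torchvision import datasets

imbalances = np.load('./splits/50_data_imbalance.npy')   # 50 imbalance configurations (assumed layout)
mnist = datasets.MNIST('./splits', train=True, download=True)

labels = mnist.targets.numpy()
proportions = imbalances[0]                              # pick the first configuration
keep = []
for cls, p in enumerate(proportions):
    idx = np.where(labels == cls)[0]
    keep.extend(idx[: int(p * len(idx))])                # keep only a fraction of each class
subset_images = mnist.data.numpy()[keep]
subset_labels = labels[keep]
print(subset_images.shape, np.bincount(subset_labels))   # class counts now follow the imbalance
```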

Elastic-InfoGAN training

  • Train the model on all 50 random splits: bash run.sh
  • Intermediate generated images (different rows correspond to different discrete latent codes) will be stored in the results directory.
  • Trained models will be stored in the saved_models directory.
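
The core training idea is that the prior over the discrete latent code is not fixed to uniform: it is parameterized by learnable logits and sampled with Gumbel-softmax, so the prior can adapt to the unknown class imbalance. The sketch below is only an illustration of that idea; it uses PyTorch's built-in gumbel_softmax, whereas the repository uses the implementation credited in the acknowledgments, and the names (prior_logits, sample_discrete_code) are placeholders:

```python
# Sketch: learnable categorical prior over the discrete latent code, sampled via Gumbel-softmax.
import torch
import torch.nn.functional as F

n_classes = 10
prior_logits = torch.zeros(n_classes, requires_grad=True)   # learnable class-prior logits

def sample_discrete_code(batch_size, temperature=0.1):
    logits = prior_logits.unsqueeze(0).expand(batch_size, -1)
    # Gumbel-softmax gives a differentiable, approximately one-hot sample per row
    return F.gumbel_softmax(logits, tau=temperature, hard=True)

c = sample_discrete_code(64)   # [64, 10] one-hot latent codes; gradients flow back to prior_logits
```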

Elastic-InfoGAN evaluation

  • The 50 pre-trained generator models, one trained on each of the 50 imbalanced splits, are available at this link.
  • Unzip and extract all the models into the mnist_pretrained directory, and run bash eval.sh.
  • This will compute the Normalized Mutual Information (NMI) and Average Entropy (ENT).
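
For reference, here is a minimal sketch of the two metrics, assuming ground-truth labels and the Q-network's predicted codes for a held-out set are available as arrays. The exact protocol (including the precise definition of ENT) is the one implemented in eval.sh; treat this only as an illustration, with placeholder data:

```python
# NMI between ground-truth labels and predicted latent codes, plus average per-code
# entropy of the ground-truth label distribution (one plausible reading of ENT).
import numpy as np
from sklearn.metrics import normalized_mutual_info_score

true_labels = np.random.randint(0, 10, size=1000)   # placeholder ground-truth digits
pred_codes = np.random.randint(0, 10, size=1000)    # placeholder argmax of Q(c|x)

nmi = normalized_mutual_info_score(true_labels, pred_codes)

entropies = []
for code in np.unique(pred_codes):
    counts = np.bincount(true_labels[pred_codes == code], minlength=10)
    p = counts / counts.sum()
    p = p[p > 0]
    entropies.append(-(p * np.log(p)).sum())
avg_entropy = float(np.mean(entropies))

print('NMI: %.3f, ENT: %.3f' % (nmi, avg_entropy))
```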

Example results

Imbalanced MNIST

Imbalanced 3D Cars

Imbalanced 3D Chairs

Imbalanced ShapeNet

Imbalanced YouTube-Faces

Citation

If you find our work/code useful in your research, please cite our paper.

@inproceedings{elastic-infogan2020,
  title={Elastic-InfoGAN: Unsupervised Disentangled Representation Learning in Class-Imbalanced Data},
  author={Ojha, Utkarsh and Singh, Krishna Kumar and Hsieh, Cho-Jui and Lee, Yong Jae},
  booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
  year={2020}
}
  • The Gumbel-softmax implementation was taken from this wonderful work by Eric Jang et al.
  • The implementation for Normalized Temperature-Scaled Cross Entropy loss was taken from this repository by Thalles Silva.
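
For context, a generic SimCLR-style implementation of the NT-Xent loss named above is sketched here; it is not the exact code used in this repository, only an illustration of the loss for a batch where rows i of z1 and z2 are two augmented views of the same image:

```python
# Sketch: normalized temperature-scaled cross entropy (NT-Xent) contrastive loss.
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """z1, z2: [N, D] projections of two augmentations of the same N images."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)       # [2N, D], unit-norm rows
    sim = z @ z.t() / temperature                             # cosine similarities / tau
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float('-inf'))                     # drop self-similarities
    # positive for row i is row i + n (and vice versa)
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

loss = nt_xent(torch.randn(8, 128), torch.randn(8, 128))
```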

For any queries related to this work, please contact Utkarsh Ojha.
