A clean, commented GAN implementation for MNIST.
Usage:

```shell
python gan2d_train.py -j config.json
```
As you can tell from the last frame of the GIF below, the results are definitely not perfect. That said, the GAN does not collapse into reproducing a single image, the generator and discriminator losses remain quite stable, and the generated images do resemble digits.
Takeaways from my training experiments:
- Use tanh as the output activation on the generator.
- Train the discriminator on single-class batches (i.e. only real or only fake images).
- Draw latent samples from a uniform distribution on [-1, 1] instead of [0, 1].
- Reduce the default momentum rate (`beta_1`) of the Keras Adam optimizer.
- Lower the learning rate as training progresses.
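A few of these tips can be sketched in plain NumPy, independent of the Keras model code. The function names below are illustrative, not the names used in `gan2d_train.py`: rescaling real images to [-1, 1] so they match the generator's tanh output range, drawing latent vectors from a uniform distribution on [-1, 1], and assembling single-class discriminator batches.

```python
import numpy as np

def rescale_images(images):
    """Map pixel values from [0, 255] to [-1, 1] to match a tanh generator output."""
    return images.astype(np.float32) / 127.5 - 1.0

def sample_latent(batch_size, latent_dim, rng=None):
    """Draw latent vectors uniformly from [-1, 1] rather than [0, 1]."""
    rng = rng or np.random.default_rng()
    return rng.uniform(-1.0, 1.0, size=(batch_size, latent_dim)).astype(np.float32)

def single_class_batches(real_images, fake_images):
    """Build separate all-real (label 1) and all-fake (label 0) batches,
    so each discriminator update sees only one class."""
    real_labels = np.ones((len(real_images), 1), dtype=np.float32)
    fake_labels = np.zeros((len(fake_images), 1), dtype=np.float32)
    return (real_images, real_labels), (fake_images, fake_labels)
```

For the momentum tip, the Keras equivalent is lowering Adam's `beta_1` from its default of 0.9, e.g. `Adam(learning_rate=2e-4, beta_1=0.5)`, a common choice in DCGAN-style training.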
Instead of trying to perfectly generate MNIST, I'm going to take what I've learned and move to more interesting datasets!