
Introduction

This repository is the PyTorch implementation of the paper:

Latent Normalizing Flows for Many-to-Many Cross Domain Mappings (ICLR 2020)

Shweta Mahajan, Iryna Gurevych, Stefan Roth

Getting started

This code has been developed under Python 3.5, PyTorch 1.0.0, and CUDA 10.0.

  1. Please run requirements.py to check whether all required packages are installed.
  2. The dataset used in this project is COCO 2014. The dataset is available here.

Training

The script train.py is used for training. The parameters are listed in params.json. Note that there are two separate configurations, one for best performance on image captioning and one for text-to-image synthesis.

Example usage to train a model on COCO 2014 for captioning:

python train.py --config params_i2t

Example usage to train a model on COCO 2014 for the text-to-image synthesis task:

python train.py --config params_t2i

Note that CUDA 10.0 and GPU devices are required for training. The number of GPUs used can be set in params.json. We use one Nvidia Volta V100 GPU for the captioning task and three Nvidia Volta V100 GPUs with 32GB each for the text-to-image synthesis task.
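The exact keys in params.json are defined by the repository; the fragment below is only a hypothetical sketch of what a multi-GPU configuration entry might look like (key names are assumptions, not the repo's actual schema):

```json
{
  "num_gpus": 3,
  "batch_size": 128
}
```

Consult the shipped params_i2t and params_t2i configurations for the real parameter names and values.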

Generation and Validation

For evaluation we use the following repositories:

  1. Oracle - We use the version of pycocoevalcap which supports Python 3, available here.
  2. Consensus Reranking - We use the repo of mRNN-CR.
  3. Diversity - We use the repo of DiversityMetrics (requires Python 2.7).

Checkpoints are available for text-to-image synthesis and for image captioning.

Bibtex

@inproceedings{mahajan2020latent,
title = {Latent Normalizing Flows for Many-to-Many Cross-Domain Mappings},
author = {Mahajan, Shweta and Gurevych, Iryna and Roth, Stefan},
booktitle = {International Conference on Learning Representations},
year = {2020},
}


lnfmm's Issues

Does reducing batch size affect convergence?

Due to an 11GB GPU memory limit, I am forced to reduce the batch size for image-to-text to 8 instead of 128.
After training for a while, I was unable to obtain a converged model that replicated the results in the paper.

Hence, my question is:

  1. How much GPU memory did you use?
  2. Does using a smaller batch size lead to failure in training?
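A common workaround for tight GPU memory (a generic technique, not something this repository documents) is gradient accumulation: run several small micro-batches and average their losses before each optimizer step. The arithmetic behind it is that the mean of per-micro-batch mean losses equals the full-batch mean loss when the micro-batches have equal size, as this framework-free sketch shows (the micro-batch size 8 and batch size 128 match the numbers in the question above):

```python
# Demonstrate that accumulating 16 micro-batches of size 8 reproduces
# the mean loss of one full batch of 128. Pure Python, no DL framework;
# the per-sample "losses" are placeholder values for illustration.
losses = [float(i) for i in range(128)]  # one hypothetical loss per sample

full_batch_mean = sum(losses) / len(losses)

micro = 8  # micro-batch size that fits in limited GPU memory
chunk_means = [
    sum(losses[i:i + micro]) / micro
    for i in range(0, len(losses), micro)
]
accumulated_mean = sum(chunk_means) / len(chunk_means)

print(full_batch_mean, accumulated_mean)  # 63.5 63.5
```

Note this only reproduces the gradient of the large batch; it does not reproduce batch-dependent statistics such as batch normalization, so convergence with very small micro-batches may still differ.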

Error while Loading Released Model Weights

Hi @sroth-visinf ,

Thanks for the implementation of this excellent work. I ran into the error RuntimeError: unexpected EOF, expected 170260432 more bytes. The file might be corrupted. while loading the released pre-trained model weights. It seems that the checkpoint file is corrupted, so torch.load() raises the error. Could you please check or re-upload the model weights? Thank you very much!

Regards,
Chang
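An "unexpected EOF" from torch.load() usually means the download was truncated. Before re-downloading, one generic sanity check (not specific to this repository; the file name and sizes below are hypothetical) is to compare the on-disk byte count against the size reported by the download source:

```python
# Check whether a downloaded checkpoint has the expected byte count.
# A mismatch indicates a truncated or corrupted download.
import os
import tempfile


def check_size(path: str, expected_bytes: int) -> bool:
    """Return True if the file exists and has exactly `expected_bytes` bytes."""
    return os.path.isfile(path) and os.path.getsize(path) == expected_bytes


# Self-contained demo with a temporary 1024-byte file:
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x00" * 1024)
    tmp_path = f.name

print(check_size(tmp_path, 1024))  # True
print(check_size(tmp_path, 2048))  # False: file is shorter than expected
os.remove(tmp_path)
```

A stronger check is to compare a SHA-256 digest, if the release publishes one.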
