
NMT-Keras


Neural Machine Translation with Keras (Theano and TensorFlow).

Library documentation: nmt-keras.readthedocs.io





If you use this toolkit in your research, please cite:

@misc{nmt-keras2017,
	author = {Peris, {\'A}lvaro},
	title = {{NMT}-{K}eras},
	year = {2017},
	publisher = {GitHub},
	note = {GitHub repository},
	howpublished = {\url{https://github.com/lvapeab/nmt-keras}},
}

Features (in addition to the full Keras cosmos):

  • Tensorboard integration.
  • Online learning and Interactive neural machine translation (INMT). See the interactive NMT branch.
  • Attention model over the input sequence of annotations.
  • Peeked decoder: The previously generated word is an input of the current timestep.
  • Beam search decoding.
  • Ensemble decoding (sample_ensemble.py).
    • Featuring length and source coverage normalization (reference).
  • Translation scoring (score.py).
  • Support for GRU/LSTM networks:
    • Regular GRU/LSTM units.
    • Conditional GRU/LSTM units in the decoder.
    • Multilayered residual GRU/LSTM networks (and their Conditional version).
  • N-best list generation (as a byproduct of the beam search process).
  • Unknown words replacement (see Section 3.3 of this paper).
  • Use of pretrained (GloVe or Word2Vec) word embedding vectors.
  • MLPs for initializing the RNN hidden and memory state.
  • Spearmint wrapper for hyperparameter optimization.
  • Client-server architecture for web demos.

Installation

Assuming that you have pip installed, run:

git clone https://github.com/lvapeab/nmt-keras
cd nmt-keras
pip install -r requirements.txt

to obtain the packages required to run this library.

Requirements

NMT-Keras requires the libraries listed in requirements.txt, which are installed by the pip command shown above.

Usage

Training

  1. Set a training configuration in the config.py script. Each parameter is commented. See the documentation file for further information about each specific hyperparameter. You can also specify parameters when calling the main.py script, using the syntax Key=Value (see the example after step 2).

  2. Train!:

python main.py
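
For example, a run that overrides a couple of hyperparameters from the command line could look like the following (BATCH_SIZE and MAX_EPOCH are illustrative names; check config.py for the parameters actually defined there):

# Override hyperparameters without editing config.py (parameter names are illustrative)
python main.py BATCH_SIZE=50 MAX_EPOCH=10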

Decoding

Once our model is trained, we can translate new text using the sample_ensemble.py script. Please refer to the ensembling_tutorial for more details about this script. In short, if we want to use the models from the first two epochs to translate the examples/EuTrans/test.en file, just run:

 python sample_ensemble.py \
             --models trained_models/tutorial_model/epoch_1 \
                      trained_models/tutorial_model/epoch_2 \
             --dataset datasets/Dataset_tutorial_dataset.pkl \
             --text examples/EuTrans/test.en

Scoring

The score.py script can be used to obtain the (-log) probabilities of a parallel corpus. Its syntax is the following:

python score.py --help
usage: Use several translation models for scoring source--target pairs
       [-h] -ds DATASET [-src SOURCE] [-trg TARGET] [-s SPLITS [SPLITS ...]]
       [-d DEST] [-v] [-c CONFIG] --models MODELS [MODELS ...]
optional arguments:
    -h, --help            show this help message and exit
    -ds DATASET, --dataset DATASET
                            Dataset instance with data
    -src SOURCE, --source SOURCE
                            Text file with source sentences
    -trg TARGET, --target TARGET
                            Text file with target sentences
    -s SPLITS [SPLITS ...], --splits SPLITS [SPLITS ...]
                            Splits to sample. Should be already included into the
                            dataset object.
    -d DEST, --dest DEST  File to save scores in
    -v, --verbose         Be verbose
    -c CONFIG, --config CONFIG
                            Config pkl for loading the model configuration. If not
                            specified, hyperparameters are read from config.py
    --models MODELS [MODELS ...]
                            path to the models
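
For example, a hypothetical invocation that scores the tutorial test set with a single model and writes the scores to a file could look like this (all paths, including the target-side file name, are illustrative):

 # Score source--target pairs with one model; paths are only examples
 python score.py --models trained_models/tutorial_model/epoch_1 \
                 --dataset datasets/Dataset_tutorial_dataset.pkl \
                 --source examples/EuTrans/test.en \
                 --target examples/EuTrans/test.es \
                 --dest scores.txt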

Advanced features

Other features such as online learning or interactive NMT protocols are implemented in the interactiveNMT branch.

Resources

  • In examples/documentation/neural_machine_translation.pdf you'll find an overview of an attentional NMT system.

  • In the examples folder you'll find some tutorials for running this library. They are expected to be followed in order:

    1. Dataset set up: Shows how to invoke and configure a Dataset instance for a translation problem.

    2. Training tutorial: Shows how to call a translation model, link it with the dataset object and construct callbacks for monitoring the training.

    3. Decoding tutorial: Shows how to call a trained translation model and use it to translate new text.

    4. NMT model tutorial: Shows how to build a state-of-the-art NMT model with Keras in a few (~50) lines.

Acknowledgement

Much of this library has been developed together with Marc Bolaños (web page) for other sequence-to-sequence problems.

To see other projects following the philosophy of NMT-Keras, take a look here:

  • TMA for egocentric captioning based on temporally-linked sequences.

  • VIBIKNet for visual question answering.

  • ABiViRNet for video description.

  • Sentence SelectioNN for sentence classification and selection.

Contact

Álvaro Peris (web page): [email protected]
