This repository contains the implementation of the paper: "POINTER: Constrained Text Generation via Insertion-based Generative Pre-training", a progressive and non-autoregressive text generation pre-training approach.
Figure: Illustration of the generation process (blue arrow) of the proposed POINTER model. At each stage, the module generates either a regular token or a special NOI token for each gap between two existing tokens. The generation stops when all the gaps predict NOI. The data preparation process (orange arrow) reverses the above generative process.
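The stage-wise insertion loop described in the caption can be sketched as follows. This is a toy illustration only, not the repository's implementation: the hypothetical `predict_gaps` function stands in for the trained insertion Transformer and here uses a hard-coded rule for demonstration.

```python
NOI = "<NOI>"  # special "no insertion" token

def predict_gaps(tokens):
    """Hypothetical stand-in for the insertion model: predicts one token
    (or NOI) for each gap between adjacent existing tokens.
    Here it inserts "very" between "is" and "fast" and NOI elsewhere;
    a real model would score each gap with a Transformer."""
    preds = []
    for left, right in zip(tokens, tokens[1:]):
        preds.append("very" if (left, right) == ("is", "fast") else NOI)
    return preds

def generate(tokens, max_stages=10):
    """Progressively insert tokens into every gap; stop once all gaps
    predict NOI (as in the figure's blue-arrow direction)."""
    for _ in range(max_stages):
        gap_preds = predict_gaps(tokens)
        if all(p == NOI for p in gap_preds):
            return tokens  # every gap predicted NOI: generation stops
        merged = [tokens[0]]
        for pred, tok in zip(gap_preds, tokens[1:]):
            if pred != NOI:
                merged.append(pred)  # fill this gap with the predicted token
            merged.append(tok)
        tokens = merged
    return tokens

print(generate(["training", "is", "fast"]))
# → ['training', 'is', 'very', 'fast']
```

Note that all gaps are filled in parallel within a stage, which is what makes the approach non-autoregressive: the number of stages grows roughly logarithmically in the sequence length rather than linearly.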
A live demo can be found here. Please expect delays and occasional crashes, as it is running on a single-GPU machine.
If you use this code in your research, please cite our paper:
@article{zhang2020pointer,
title={POINTER: Constrained Text Generation via Insertion-based Generative Pre-training},
author={Zhang, Yizhe and Wang, Guoyin and Li, Chunyuan and Gan, Zhe and Brockett, Chris and Dolan, Bill},
journal={arXiv preprint arXiv:2005.00558},
year={2020}
}