This repository contains the source code for our papers "Joint Attention Strategies for Paraphrase Generation" (NLDB 2019) and "Every Layer Counts: Multi-layer Multi-head Attention for Neural Machine Translation" (under review at PBML).
Requirements: TensorFlow 2.0 and tensor2tensor.
The script below runs the MLMHA model on the IWSLT En-Vi task. The preprocessed files are in the dataset folder.

./trainscript_mlmha_envi.sh
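Before launching training, it can help to verify that the expected files are in place. The sketch below is a hypothetical pre-flight check, not part of the repository; it only assumes the two names the README mentions (the dataset folder and trainscript_mlmha_envi.sh).

```shell
#!/usr/bin/env bash
# Hypothetical pre-flight check (assumed paths: dataset/ and trainscript_mlmha_envi.sh).
set -u

check_prereqs() {
  local ok=0
  # The README states the preprocessed files live in the dataset folder.
  [ -d dataset ] || { echo "missing dataset/ folder"; ok=1; }
  # The training entry point named in the README.
  [ -f trainscript_mlmha_envi.sh ] || { echo "missing trainscript_mlmha_envi.sh"; ok=1; }
  return "$ok"
}

if check_prereqs; then
  echo "prerequisites ok"
  # bash trainscript_mlmha_envi.sh   # uncomment to actually start training
else
  echo "fix the issues above before training" >&2
fi
```

Running it from the repository root prints "prerequisites ok" when both paths exist, and names whatever is missing otherwise.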
The full source code will be made available soon.