
MLMHA

This repository contains the source code for our papers "Joint Attention Strategies for Paraphrase Generation" (NLDB 2019) and "Every Layer Counts: Multi-layer Multi-head Attention for Neural Machine Translation" (under review at PBML).

Requirements

TensorFlow 2.0, tensor2tensor
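A minimal environment setup might look like the following; the exact package versions are assumptions, since the repository does not pin them:

```shell
# Hypothetical setup sketch; the pinned versions below are
# assumptions, not specified by this repository.
pip install tensorflow==2.0.0
pip install tensor2tensor
```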

Sample

The script below runs the MLMHA model on the IWSLT En-Vi task. The preprocessed files are in the dataset folder:

./trainscript_mlmha_envi.sh
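For orientation, a tensor2tensor training run of this kind typically invokes the standard t2t-trainer entry point. The sketch below is a guess at what such a script might contain; the problem, model, and hparams names are assumptions, not taken from this repository:

```shell
# Hypothetical sketch of a t2t training invocation; the problem,
# model, and hparams_set values are assumptions, not this repo's
# actual configuration.
t2t-trainer \
  --data_dir=./dataset \
  --output_dir=./checkpoints \
  --problem=translate_envi_iwslt32k \
  --model=transformer \
  --hparams_set=transformer_base \
  --train_steps=100000
```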

Full source code will be made available soon.
