I may be slow to respond.
The code for the ACL 2020 paper "Attend, Translate and Summarize: An Efficient Method for Neural Cross-Lingual Summarization"
Datasets for the EMNLP-IJCNLP 2019 paper "NCLS: Neural Cross-Lingual Summarization"
An implementation of the Transformer (Attention Is All You Need) in DyNet