
Commit

update bib entry
ymcui committed Sep 21, 2020
1 parent 963d801 commit e8196c6
Showing 2 changed files with 28 additions and 23 deletions.
README.md (27 changes: 15 additions & 12 deletions)
@@ -28,7 +28,9 @@ Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang, Shijin Wang, Guoping
### The HIT·iFLYTEK Joint Laboratory (HFL) 2021 early-batch campus recruitment has started! Everyone is welcome to [submit a resume](https://wj.qq.com/s2/6730642/762d)!

## News
**2020/8/27 The HIT·iFLYTEK Joint Laboratory (HFL) topped the GLUE benchmark for general natural language understanding. See the [GLUE leaderboard](https://gluebenchmark.com/leaderboard) and the [news](http://dwz.date/ckrD).**
**2020/9/15 Our paper "Revisiting Pre-Trained Models for Chinese Natural Language Processing" has been accepted as a long paper at [Findings of EMNLP](https://2020.emnlp.org).**

2020/8/27 The HIT·iFLYTEK Joint Laboratory (HFL) topped the GLUE benchmark for general natural language understanding. See the [GLUE leaderboard](https://gluebenchmark.com/leaderboard) and the [news](http://dwz.date/ckrD).

2020/3/23 The models released in this repository can now be loaded through [PaddlePaddle PaddleHub](https://github.com/PaddlePaddle/PaddleHub); see [Quick Load](#快速加载).
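
As context for quick loading, here is a minimal sketch using the Hugging Face `transformers` library (an alternative to the PaddleHub route mentioned above, not the repository's documented procedure). The model identifier `hfl/chinese-bert-wwm-ext` is an assumption for illustration; check the Quick Load section for the names actually published here.
```python
# Minimal sketch: load a Chinese BERT-wwm checkpoint with Hugging Face transformers.
# The model identifier below is assumed for illustration only.
from transformers import BertTokenizer, BertModel

MODEL_ID = "hfl/chinese-bert-wwm-ext"  # assumed identifier

tokenizer = BertTokenizer.from_pretrained(MODEL_ID)
model = BertModel.from_pretrained(MODEL_ID)

inputs = tokenizer("哈工大讯飞联合实验室", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768) for a base-size model
```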

@@ -449,8 +451,18 @@ A: We combined the advantages of RoBERTa and BERT-wwm, carrying out a natural


## Citation
If the content in this repository helps your research, please cite the following technical reports in your paper:
https://arxiv.org/abs/1906.08101
If the content in this repository helps your research, please cite the following technical reports in your paper:
- Primary: https://arxiv.org/abs/2004.13922
```
@inproceedings{cui-etal-2020-revisiting,
title={Revisiting Pre-Trained Models for Chinese Natural Language Processing},
author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Wang, Shijin and Hu, Guoping},
booktitle={Findings of EMNLP},
year={2020},
publisher={Association for Computational Linguistics}
}
```
- Secondary: https://arxiv.org/abs/1906.08101
```
@article{chinese-bert-wwm,
title={Pre-Training with Whole Word Masking for Chinese BERT},
author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Yang, Ziqing and Wang, Shijin and Hu, Guoping},
journal={arXiv preprint arXiv:1906.08101},
year={2019}
}
```
@@ -460,15 +472,6 @@ https://arxiv.org/abs/1906.08101

https://arxiv.org/abs/2004.13922
```
@article{cui-2020-revisiting,
title={Revisiting Pre-Trained Models for Chinese Natural Language Processing},
author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Wang, Shijin and Hu, Guoping},
journal={arXiv preprint arXiv:2004.13922},
year={2020}
}
```

## Acknowledgments
The first author is partially supported by the [**Google TensorFlow Research Cloud**](https://www.tensorflow.org/tfrc) program.
README_EN.md (24 changes: 13 additions & 11 deletions)
@@ -18,6 +18,8 @@ More resources by HFL: https://github.com/ymcui/HFL-Anthology


## News
**2020/9/15 Our paper "Revisiting Pre-Trained Models for Chinese Natural Language Processing" has been accepted as a long paper at [Findings of EMNLP](https://2020.emnlp.org).**

2020/8/27 We are happy to announce that our model is on top of the GLUE benchmark; check the [leaderboard](https://gluebenchmark.com/leaderboard).

2020/3/23 The models in this repository can now be easily accessed through [PaddleHub](https://github.com/PaddlePaddle/PaddleHub); check [Quick Load](#Quick-Load).
@@ -414,7 +416,17 @@ A: We integrate whole word masking (wwm) into the RoBERTa model, specifically:

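As background on the whole word masking (wwm) strategy referenced above: when any sub-token of a word is selected for masking, all sub-tokens of that word are masked together. The sketch below is a generic illustration under the assumption of WordPiece-style "##" continuation markers; the Chinese models discussed here obtain word boundaries from a Chinese word segmenter instead, so the helper and its details are illustrative only, not the repository's exact procedure.
```python
import random

def whole_word_mask(tokens, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Mask whole words: if any WordPiece of a word is chosen, mask all of its pieces."""
    # Group WordPiece indices into words ("##" marks a continuation piece).
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])

    rng = random.Random(seed)
    rng.shuffle(words)

    # Mask randomly chosen whole words until roughly mask_rate of tokens are covered.
    budget = max(1, int(round(len(tokens) * mask_rate)))
    masked = list(tokens)
    covered = 0
    for word in words:
        if covered >= budget:
            break
        for i in word:
            masked[i] = mask_token
        covered += len(word)
    return masked

# Example: a multi-piece word is always masked as one unit.
print(whole_word_mask(["the", "trans", "##former", "archi", "##tecture", "is", "power", "##ful"]))
```
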
## Citation
If you find the technical reports or resources useful, please cite the following technical reports in your paper.
https://arxiv.org/abs/1906.08101
- Primary: https://arxiv.org/abs/2004.13922
```
@inproceedings{cui-etal-2020-revisiting,
title={Revisiting Pre-Trained Models for Chinese Natural Language Processing},
author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Wang, Shijin and Hu, Guoping},
booktitle={Findings of EMNLP},
year={2020},
publisher={Association for Computational Linguistics}
}
```
- Secondary: https://arxiv.org/abs/1906.08101
```
@article{chinese-bert-wwm,
title={Pre-Training with Whole Word Masking for Chinese BERT},
author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Yang, Ziqing and Wang, Shijin and Hu, Guoping},
journal={arXiv preprint arXiv:1906.08101},
year={2019}
}
```
@@ -424,16 +436,6 @@ https://arxiv.org/abs/1906.08101

or https://arxiv.org/abs/2004.13922
```
@article{cui-2020-revisiting,
title={Revisiting Pre-Trained Models for Chinese Natural Language Processing},
author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Wang, Shijin and Hu, Guoping},
journal={arXiv preprint arXiv:2004.13922},
year={2020}
}
```

## Disclaimer
**This is NOT an official project by Google. Nor is it an official product of HIT or iFLYTEK.**
The experiments only represent empirical results under certain conditions and should not be regarded as inherent properties of the respective models. Results may vary with different random seeds, computing devices, etc.
