update bibtex
ymcui committed May 18, 2020
1 parent b14422a commit 0bcd9cc
Showing 2 changed files with 21 additions and 2 deletions.
README.md (11 changes: 10 additions & 1 deletion)
@@ -440,7 +440,7 @@ A: We integrate the strengths of RoBERTa and BERT-wwm, combining the two in a natural


## Citation
- If the content in this repository helps your research, please cite the following technical report in your publications:
+ If the content in this repository helps your research, you are welcome to cite the following technical report in your paper:
https://arxiv.org/abs/1906.08101
```
@article{chinese-bert-wwm,
@@ -451,6 +451,15 @@ https://arxiv.org/abs/1906.08101
}
```

https://arxiv.org/abs/2004.13922
```
@article{cui-2020-revisiting,
title={Revisiting Pre-Trained Models for Chinese Natural Language Processing},
author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Wang, Shijin and Hu, Guoping},
journal={arXiv preprint arXiv:2004.13922},
year={2020}
}
```

## Acknowledgments
The first author is partially funded by the [**Google TensorFlow Research Cloud**](https://www.tensorflow.org/tfrc) program.
README_EN.md (12 changes: 11 additions & 1 deletion)
@@ -401,7 +401,7 @@ A: integrate whole word masking (wwm) into the RoBERTa model, specifically:
3) directly use data generated with `max_len=512` (rather than training on `max_len=128` data for several steps and then switching to `max_len=512`)
4) extended training steps (1M steps)
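
For context on what wwm means above, here is a minimal, illustrative Python sketch of whole word masking over English WordPiece tokens, where `##` marks a continuation piece. This is a toy under stated assumptions, not the repository's actual masking code: the Chinese models group tokens by word segmentation rather than `##` markers, and the 15% rate and function name are made up for illustration.

```python
# Minimal sketch of whole word masking (wwm) -- an illustrative assumption,
# not this repository's implementation. WordPiece pieces beginning with
# "##" continue the previous word, and a word is masked all-or-nothing.
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]"):
    # Group token indices into whole words: "##" pieces join the prior word.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])
    masked = list(tokens)
    for word in words:
        if random.random() < mask_prob:
            for i in word:          # mask every piece of the chosen word
                masked[i] = mask_token
    return masked

# "embedding" -> ["em", "##bed", "##ding"] is masked as a single unit.
print(whole_word_mask(["the", "em", "##bed", "##ding", "layer"]))
```

In real pretraining pipelines the 15% budget is typically enforced over the token count rather than per word, but the all-or-nothing grouping is the essential idea.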

- ## Reference
+ ## Citation
If you find the technical report or resources useful, please cite the following technical report in your paper.
https://arxiv.org/abs/1906.08101
```
@@ -413,6 +413,16 @@ https://arxiv.org/abs/1906.08101
}
```

or https://arxiv.org/abs/2004.13922
```
@article{cui-2020-revisiting,
title={Revisiting Pre-Trained Models for Chinese Natural Language Processing},
author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Wang, Shijin and Hu, Guoping},
journal={arXiv preprint arXiv:2004.13922},
year={2020}
}
```

## Disclaimer
**This is NOT an official project by Google, nor an official product of HIT and iFLYTEK.**
The experiments only represent empirical results under certain conditions and should not be regarded as intrinsic properties of the respective models. The results may vary with different random seeds, computing devices, etc.
