fix lang style
ymcui committed Aug 1, 2019
1 parent 96d4bc0 commit 28b24f0
Showing 2 changed files with 4 additions and 2 deletions.
README.md: 4 changes (2 additions, 2 deletions)
@@ -1,6 +1,6 @@
- ## 中文预训练BERT-wwm(Pre-Trained Chinese BERT with Whole Word Masking)
+ [**中文说明**](https://github.com/ymcui/Chinese-BERT-wwm/) | [**English**](https://github.com/ymcui/Chinese-BERT-wwm/blob/master/README_EN.md)

- **For English description, please read [README_EN.md](https://github.com/ymcui/Chinese-BERT-wwm/blob/master/README_EN.md) or our technical report on arXiv: https://arxiv.org/abs/1906.08101**
+ ## 中文预训练BERT-wwm(Pre-Trained Chinese BERT with Whole Word Masking)

To further advance research in Chinese natural language processing, we release a Chinese pre-trained model based on the Whole Word Masking technique: BERT-wwm.
Our technical report also compares today's popular Chinese pre-trained models in detail: [BERT](https://github.com/google-research/bert), [ERNIE](https://github.com/PaddlePaddle/LARK/tree/develop/ERNIE), and [BERT-wwm](https://github.com/ymcui/Chinese-BERT-wwm).
README_EN.md: 2 changes (2 additions, 0 deletions)
@@ -1,3 +1,5 @@
+ [**中文说明**](https://github.com/ymcui/Chinese-BERT-wwm/) | [**English**](https://github.com/ymcui/Chinese-BERT-wwm/blob/master/README_EN.md)
+
## Chinese BERT with Whole Word Masking
To further accelerate Chinese natural language processing, we provide **Chinese pre-trained BERT with Whole Word Masking**. We also compare the state-of-the-art Chinese pre-trained models in depth, including [BERT](https://github.com/google-research/bert), [ERNIE](https://github.com/PaddlePaddle/LARK/tree/develop/ERNIE), and [BERT-wwm](https://github.com/ymcui/Chinese-BERT-wwm).
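As a hedged illustration of how readers typically consume such a release (not something this commit adds), the sketch below loads a Chinese BERT-wwm checkpoint with the Hugging Face `transformers` library; the hub identifier `hfl/chinese-bert-wwm` is assumed for the example and does not appear in this diff.

```python
# Minimal sketch (assumption): load a Chinese BERT-wwm checkpoint via the
# Hugging Face `transformers` library. The hub id "hfl/chinese-bert-wwm" is
# not mentioned in this commit and is used here only for illustration.
from transformers import BertTokenizer, BertModel

MODEL_ID = "hfl/chinese-bert-wwm"  # hypothetical choice of checkpoint

tokenizer = BertTokenizer.from_pretrained(MODEL_ID)
model = BertModel.from_pretrained(MODEL_ID)

# Encode a short Chinese sentence and run it through the encoder.
inputs = tokenizer("使用全词遮掩技术的中文预训练模型", return_tensors="pt")
outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size=768).
print(outputs.last_hidden_state.shape)
```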
