update banner
ymcui committed Mar 6, 2020
1 parent 3013846 commit 7f2ee29
Showing 1 changed file, README.md, with 10 additions and 6 deletions.
```diff
@@ -1,14 +1,18 @@
 [**Chinese**](https://github.com/ymcui/Chinese-BERT-wwm/) | [**English**](https://github.com/ymcui/Chinese-BERT-wwm/blob/master/README_EN.md)

 ## Chinese Pre-trained BERT-wwm (Pre-Trained Chinese BERT with Whole Word Masking)
-Pre-trained models have become a very important foundational technology in natural language processing.
-To further advance research on Chinese information processing, we release the Chinese pre-trained model BERT-wwm, based on the Whole Word Masking technique, along with the closely related models BERT-wwm-ext, RoBERTa-wwm-ext, and RoBERTa-wwm-ext-large.
+<p align="center">
+    <br>
+    <img src="./pics/banner.png" width="500"/>
+    <br>
+</p>

-**For more details, please refer to our technical report: https://arxiv.org/abs/1906.08101**
+Pre-trained models have become a very important foundational technology in natural language processing.
+To further advance research on Chinese information processing, we release the Chinese pre-trained model BERT-wwm, based on the Whole Word Masking technique, along with the closely related models BERT-wwm-ext, RoBERTa-wwm-ext, RoBERTa-wwm-ext-large, RBT3, and RBTL3.

-![./pics/header.png](https://github.com/ymcui/Chinese-BERT-wwm/raw/master/pics/header.png)
+**[Pre-Training with Whole Word Masking for Chinese BERT](https://arxiv.org/abs/1906.08101)**
+Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang, Shijin Wang, Guoping Hu

-This project is based on the official Google BERT: https://github.com/google-research/bert
+This project is based on Google's official BERT: https://github.com/google-research/bert


 ## News
```
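The Whole Word Masking technique the README refers to can be sketched in a few lines: whenever a word is selected for masking, all of its WordPiece sub-tokens (`##`-prefixed continuations) are masked together instead of independently. The tokens and helper names below are illustrative, not the project's actual pre-training code:

```python
import random

def whole_word_spans(tokens):
    """Group WordPiece indices so '##' continuation pieces stay with their word."""
    spans = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and spans:
            spans[-1].append(i)  # continuation piece: attach to previous word
        else:
            spans.append([i])    # start of a new whole word
    return spans

def wwm_mask(tokens, mask_prob=0.15, seed=0):
    """If a word is picked for masking, replace ALL of its pieces with [MASK]."""
    rng = random.Random(seed)
    out = list(tokens)
    for span in whole_word_spans(tokens):
        if rng.random() < mask_prob:
            for i in span:
                out[i] = "[MASK]"
    return out

# Toy English example: "harmonica" and "played" are split into pieces.
tokens = ["the", "har", "##monica", "was", "play", "##ed"]
print(whole_word_spans(tokens))  # [[0], [1, 2], [3], [4, 5]]
```

For the Chinese models in this repository, word boundaries cannot come from `##` pieces (Chinese WordPiece splits into single characters), so the released models instead derive whole-word spans from a Chinese word segmenter, as described in the linked technical report.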
