
Commit

style fix
ymcui committed Jan 21, 2020
1 parent e1ee19d commit 5ce14c1
Showing 1 changed file with 1 addition and 6 deletions.
7 changes: 1 addition & 6 deletions README.md
@@ -8,11 +8,6 @@

![./pics/header.png](https://github.com/ymcui/Chinese-BERT-wwm/raw/master/pics/header.png)

-**WeChat official account articles**
-
-- Joint Laboratory of HIT and iFLYTEK Research (HFL): https://mp.weixin.qq.com/s/EE6dEhvpKxqnVW_bBAKrnA
-- Synced (机器之心): https://mp.weixin.qq.com/s/88OwaHqnrVMQ7vH98INA3w
-
This project is based on Google's official BERT: https://github.com/google-research/bert


@@ -330,7 +325,7 @@ model = BertModel.from_pretrained("MODEL_NAME")
- RBT3: initialized from 3 layers of RoBERTa-wwm-ext and further pre-trained for 1M steps
- RBTL3: initialized from 3 layers of RoBERTa-wwm-ext-large and further pre-trained for 1M steps
- The name RBT is formed from the initial letters of the three syllables of RoBERTa, and the L denotes the large model
-- Directly initializing a model from the first three layers of RoBERTa-wwm-ext-large and then training it on downstream tasks degrades performance significantly; on the CMRC 2018 test set, for example, it reaches only 42.9/65.3, while RBTL3 reaches 63.3 / 83.4
+- Directly initializing a model from the first three layers of RoBERTa-wwm-ext-large and then training it on downstream tasks degrades performance significantly; on the CMRC 2018 test set, for example, it reaches only 42.9/65.3, while RBTL3 reaches 63.3/83.4
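As a quick illustration of how these compact models fit the `BertModel.from_pretrained("MODEL_NAME")` pattern shown in the hunk above, here is a minimal sketch using the transformers library. The hub identifier "hfl/rbt3" is an assumption for illustration; substitute the actual model name from this repository's download section.

```python
# Minimal sketch: loading a 3-layer RBT model through transformers.
# "hfl/rbt3" is an assumed model identifier; replace it with the real
# MODEL_NAME listed in this repository's download tables.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("hfl/rbt3")
model = BertModel.from_pretrained("hfl/rbt3")

inputs = tokenizer("哈工大讯飞联合实验室发布中文预训练模型", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# RBT3 keeps the base hidden size (768), so the final hidden states have
# shape (batch_size, sequence_length, 768).
print(outputs.last_hidden_state.shape)
```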


## Usage Suggestions
