
Commit

update gcp link to google drive
ymcui committed Jun 25, 2019
1 parent 43ef67b commit 31315a5
Showing 2 changed files with 6 additions and 6 deletions.
README.md — 8 changes: 4 additions & 4 deletions

@@ -46,18 +46,18 @@


## Chinese Model Download
- * [**`BERT-base, Chinese (Whole Word Masking)`**](https://storage.googleapis.com/hfl-rc/chinese-bert/chinese_wwm_L-12_H-768_A-12.zip):
+ * [**`BERT-base, Chinese (Whole Word Masking)`**](https://drive.google.com/open?id=1RoTQsXp2hkQ1gSRVylRIJfQxJUgkfJMW):
12-layer, 768-hidden, 12-heads, 110M parameters

#### TensorFlow version (tested on 1.12, 1.13, and 1.14)
- - Google: [download_link_for_google_storage](https://storage.googleapis.com/hfl-rc/chinese-bert/chinese_wwm_L-12_H-768_A-12.zip)
+ - Google: [download_link_for_google_storage](https://drive.google.com/open?id=1RoTQsXp2hkQ1gSRVylRIJfQxJUgkfJMW)
- iFLYTEK Cloud: [download_link_password_mva8](https://pan.iflytek.com:443/link/4B172939D5748FB1A3881772BC97A898)

#### PyTorch version (please use 🤗's [PyTorch-BERT](https://github.com/huggingface/pytorch-pretrained-BERT) > 0.6; convert other versions yourself)
- - Google: [download_link_for_google_storage](https://storage.googleapis.com/hfl-rc/chinese-bert/chinese_wwm_pytorch.zip)
+ - Google: [download_link_for_google_storage](https://drive.google.com/open?id=1NlMd5GRG97N5BYJHDQR79EU41fEfzMCv)
- iFLYTEK Cloud: [download_link_password_m1CE](https://pan.iflytek.com:443/link/F23B12B39A3077CF1ED7A08DDAD081E3)

- For users in mainland China, we recommend downloading via iFLYTEK Cloud; users elsewhere should use the Google Cloud download link. The file is about **400M**.
+ For users in mainland China, we recommend downloading via iFLYTEK Cloud; users elsewhere should use the Google download link. The file is about **400M**.
Taking the TensorFlow version as an example, unzip the downloaded file to get:
```
chinese_wwm_L-12_H-768_A-12.zip
```
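The "110M parameters" figure quoted for this BERT-base model can be sanity-checked from the architecture numbers in the diff (12 layers, 768 hidden, 12 heads). This is a rough count, assuming the original BERT release's conventions — a 30,522-token vocabulary, 512 positions, a 3072-dim feed-forward layer, and a pooler, excluding the MLM/NSP heads — none of which are stated in this commit:

```python
def bert_base_params(vocab=30522, hidden=768, layers=12, ffn=3072, max_pos=512):
    # Embeddings: token + position + segment tables, plus one LayerNorm
    emb = (vocab + max_pos + 2) * hidden + 2 * hidden
    # Per encoder layer: Q/K/V/output projections (weights + biases)
    attn = 4 * (hidden * hidden + hidden)
    # Feed-forward: up-projection and down-projection (weights + biases)
    ffn_p = (hidden * ffn + ffn) + (ffn * hidden + hidden)
    # Two LayerNorms per layer (scale + bias each)
    layer = attn + ffn_p + 2 * (2 * hidden)
    # Pooler on top of the final layer
    pooler = hidden * hidden + hidden
    return emb + layers * layer + pooler

print(round(bert_base_params() / 1e6))  # -> 109, i.e. ~110M as quoted above
```

The count is dominated by the 12 encoder layers (~85M) plus the embedding table (~24M); a Chinese-only vocabulary would shrink the embedding share accordingly.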
README_EN.md — 4 changes: 2 additions & 2 deletions

@@ -52,11 +52,11 @@ In this repository, we utilize [Language Technology Platform (LTP)](http://ltp.a
## Download
We mainly provide the pre-trained weights on TensorFlow.

- * [**`BERT-base, Chinese (Whole Word Masking)`**](https://storage.googleapis.com/hfl-rc/chinese-bert/chinese_wwm_L-12_H-768_A-12.zip):
+ * [**`BERT-base, Chinese (Whole Word Masking)`**](https://drive.google.com/open?id=1RoTQsXp2hkQ1gSRVylRIJfQxJUgkfJMW):
12-layer, 768-hidden, 12-heads, 110M parameters

#### PyTorch Version (please use [PyTorch-BERT by 🤗](https://github.com/huggingface/pytorch-pretrained-BERT) > 0.6, otherwise you need to convert it yourself)
- - Google: [download_link_for_google_storage](https://storage.googleapis.com/hfl-rc/chinese-bert/chinese_wwm_pytorch.zip)
+ - Google: [download_link_for_google_storage](https://drive.google.com/open?id=1NlMd5GRG97N5BYJHDQR79EU41fEfzMCv)

The whole zip package is roughly 400M.
The ZIP package (TensorFlow version) includes the following files:
