English | 简体中文
This folder stores the pretrained model files, such as those for BERT.
For Chinese BERT, we recommend the whole word masking pretrained models released by Harbin Institute of Technology (BERT-wwm). Xunfei Cloud download links are also provided, which are fast within mainland China; see the site above for the specific links.
The folder must contain three files; a minimal loading sketch follows the list:
- config.json: configuration file describing the BERT model structure
- pytorch_model.bin: pretrained model weights
- vocab.txt: BERT vocabulary
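
As a minimal sketch, these three files are exactly what a `from_pretrained` call reads when pointed at this folder. The example below assumes the Hugging Face `transformers` library and a hypothetical folder path `./bert_pretrain`; the repo's actual loading code and folder name may differ.

```python
from transformers import BertConfig, BertModel, BertTokenizer

# Hypothetical path to this folder containing the three required files.
pretrain_dir = "./bert_pretrain"

config = BertConfig.from_pretrained(pretrain_dir)        # reads config.json
tokenizer = BertTokenizer.from_pretrained(pretrain_dir)  # reads vocab.txt
model = BertModel.from_pretrained(pretrain_dir)          # reads pytorch_model.bin

# Quick sanity check: tokenize a short Chinese sentence and run a forward pass.
inputs = tokenizer("你好，世界", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, hidden_size)
```

If any of the three files is missing or misnamed, `from_pretrained` will fail, so verify the folder contents match the list above before training.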