add squadv1.1 notebook & update readme
airaria committed Jul 8, 2021
1 parent d5e89e3 commit e35332b
Showing 3 changed files with 895 additions and 883 deletions.
7 changes: 4 additions & 3 deletions README.md
@@ -31,7 +31,7 @@ Check our paper through [ACL Anthology](https://www.aclweb.org/anthology/2020.ac
**Jul 8, 2021**

* **New examples with Transformers 4**
* The current examples (examples/) were written for older versions of Transformers and may cause confusion and bugs. We have rewritten the examples with Transformers 4 as jupyter notebooks, which are easier to follow and learn from.
* The current examples (examples/) were written for older versions of Transformers and may cause confusion and bugs. We have rewritten the examples with Transformers 4 as Jupyter notebooks, which are easier to follow and learn from.
* The new examples can be found at [examples/notebook_examples](examples/notebook_examples/). See [Examples](#examples) for details.

**Mar 1, 2021**
@@ -265,8 +265,9 @@ with distiller:
### **Examples**

* **Notebook examples with Transformers 4**
* [examples/notebook\_examples/sst2.ipynb](examples/notebook\_examples/sst2.ipynb) (English): Training and distilling BERT on SST-2, an English sentence classification task.
* [examples/notebook\_examples/msra_ner.ipynb](examples/notebook\_examples/msra_ner.ipynb) (Chinese): Training and distilling BERT on MSRA NER, a sequence labeling task.
* [examples/notebook\_examples/sst2.ipynb](examples/notebook\_examples/sst2.ipynb) (English): training and distilling BERT on SST-2, an English sentence classification task.
* [examples/notebook\_examples/msra_ner.ipynb](examples/notebook\_examples/msra_ner.ipynb) (Chinese): training and distilling BERT on MSRA NER, a Chinese sequence labeling task.
* [examples/notebook\_examples/sqaudv1.1.ipynb](examples/notebook\_examples/sqaudv1.1.ipynb) (English): training and distilling BERT on SQuAD 1.1, an English MRC task.

* [examples/random_token_example](examples/random_token_example): a simple runnable toy example that demonstrates the usage of TextBrewer. This example performs distillation on a text classification task with random tokens as input.
* [examples/cmrc2018\_example](examples/cmrc2018_example) (Chinese): distillation on CMRC 2018, a Chinese MRC task, using DRCD as data augmentation.
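All of the examples above distill a teacher model into a student via temperature-scaled soft labels (Hinton-style knowledge distillation). As a minimal, stdlib-only sketch of that loss term (an illustration only, not TextBrewer's actual implementation, which operates on PyTorch tensors):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher temperature yields a softer
    # (more uniform) distribution over classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def soft_label_kd_loss(teacher_logits, student_logits, temperature=4.0):
    # Cross-entropy between the softened teacher and student distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    return -sum(pt * math.log(ps) for pt, ps in zip(p_t, p_s)) * temperature ** 2

teacher_logits = [3.0, 1.0, 0.2]
student_logits = [2.5, 0.8, 0.4]
loss = soft_label_kd_loss(teacher_logits, student_logits)
```

The loss is minimized when the student's softened distribution matches the teacher's; the `temperature=4.0` default here is an illustrative choice, not a value prescribed by the notebooks.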
3 changes: 2 additions & 1 deletion README_ZH.md
@@ -30,7 +30,7 @@
**Jul 8, 2021**

* **New Transformers 4 examples**
* The existing examples are based on an earlier version of Transformers, whose usage differs from the current Transformers. To avoid confusion and bugs, we have added Jupyter notebook examples based on Transformers 4, which are easier to learn and use.
* The existing examples are based on an earlier version of Transformers, whose usage differs from the current Transformers. To reduce confusion and bugs, we have added notebook examples based on Transformers 4, which are easier to learn and use.
* The new examples are located at [examples/notebook_examples](examples/notebook_examples/). See the [蒸馏任务示例](#蒸馏任务示例) (distillation task examples) section for details.

**Mar 1, 2021**
@@ -265,6 +265,7 @@ with distiller:
* **Transformers 4 examples**
* [examples/notebook\_examples/sst2.ipynb](examples/notebook\_examples/sst2.ipynb) (English): training and distilling BERT on SST-2, a text classification task.
* [examples/notebook\_examples/msra_ner.ipynb](examples/notebook\_examples/msra_ner.ipynb) (Chinese): training and distilling BERT on MSRA NER, a Chinese named entity recognition task.
* [examples/notebook\_examples/sqaudv1.1.ipynb](examples/notebook\_examples/sqaudv1.1.ipynb) (English): training and distilling BERT on SQuAD 1.1, an English reading comprehension task.

* [examples/random_token_example](examples/random_token_example): a simple runnable example that demonstrates the usage of TextBrewer on a text classification task with random tokens as input.
* [examples/cmrc2018\_example](examples/cmrc2018_example) (Chinese): distillation on CMRC 2018, a Chinese reading comprehension task, using the DRCD dataset for data augmentation.