
Commit

edit readme.md
王嘉宁 authored and committed Apr 6, 2023
1 parent 8651c6c commit b2d9e0b
Showing 2 changed files with 3 additions and 3 deletions.
README.md: 4 changes (2 additions & 2 deletions)
@@ -125,8 +125,8 @@ We demonstrate all pre-built applications in HugNLP. You can choose one application
| | run_clue_cmrc.sh | **Goal**: Standard **Fine-tuning** for CLUE CMRC2018 task. <br> **Path**: applications/benchmark/cluemrc | BERT, RoBERTa, DeBERTa | |
| | run_clue_c3.sh | **Goal**: Standard **Fine-tuning** for CLUE C3 task. <br> **Path**: applications/benchmark/cluemrc | BERT, RoBERTa, DeBERTa | |
| | run_clue_chid.sh | **Goal**: Standard **Fine-tuning** for CLUE CHID task. <br> **Path**: applications/benchmark/cluemrc | BERT, RoBERTa, DeBERTa | |
-| **Instruction-Prompting** | run_causal_instruction.sh | **Goal**: **Cross-task training** via generative Instruction-tuning based on causal PLM. <font color='red'>**You can use it to train a small ChatGPT**</font>. <br> **Path**: applications/instruction_prompting/instruction_tuning | GPT2 | [click](./instruction_prompting/generative_instruction_tuning.md) |
-| | run_zh_extract_instruction.sh | **Goal**: **Cross-task training** via extractive Instruction-tuning based on Global Pointer model. <br> **Path**: applications/instruction_prompting/chinese_instruction | BERT, RoBERTa, DeBERTa | [click](./documents/instruction_prompting/instruction_tuning.md) |
+| **Instruction-Prompting** | run_causal_instruction.sh | **Goal**: **Cross-task training** via generative Instruction-tuning based on causal PLM. <font color='red'>**You can use it to train a small ChatGPT**</font>. <br> **Path**: applications/instruction_prompting/instruction_tuning | GPT2 | [click](./documents/instruction_prompting/generative_instruction_tuning.md) |
+| | run_zh_extract_instruction.sh | **Goal**: **Cross-task training** via extractive Instruction-tuning based on Global Pointer model. <br> **Path**: applications/instruction_prompting/chinese_instruction | BERT, RoBERTa, DeBERTa | [click](./documents/instruction_prompting/extractive_instruction_tuning.md) |
| | run_causal_incontext_cls.sh | **Goal**: **In-context learning** for user-defined classification tasks. <br> **Path**: applications/instruction_prompting/incontext_learning | GPT-2 | [click](./documents/instruction_prompting/incontext_learning_for_cls.md) |
| **Information Extraction** | run_extractive_unified_ie.sh | **Goal**: **HugIE**: training a unified chinese information extraction via extractive instruction-tuning. <br> **Path**: applications/information_extraction/HugIE | BERT, RoBERTa, DeBERTa | [click](./documents/information_extraction/HugIE.md) |
| | api_test.py | **Goal**: HugIE: API test. <br> **Path**: applications/information_extraction/HugIE | - | [click](./documents/information_extraction/HugIE.md) |
documents/README.md: 2 changes (1 addition & 1 deletion)
@@ -15,7 +15,7 @@ We demonstrate all pre-built applications in HugNLP.
| | run_clue_c3.sh | **Goal**: Standard **Fine-tuning** for CLUE C3 task. <br> **Path**: applications/benchmark/cluemrc | BERT, RoBERTa, DeBERTa | |
| | run_clue_chid.sh | **Goal**: Standard **Fine-tuning** for CLUE CHID task. <br> **Path**: applications/benchmark/cluemrc | BERT, RoBERTa, DeBERTa | |
| **Instruction-Prompting** | run_causal_instruction.sh | **Goal**: **Cross-task training** via generative Instruction-tuning based on causal PLM. <font color='red'>**You can use it to train a small ChatGPT**</font>. <br> **Path**: applications/instruction_prompting/instruction_tuning | GPT2 | [click](./instruction_prompting/generative_instruction_tuning.md) |
-| | run_zh_extract_instruction.sh | **Goal**: **Cross-task training** via extractive Instruction-tuning based on Global Pointer model. <br> **Path**: applications/instruction_prompting/chinese_instruction | BERT, RoBERTa, DeBERTa | [click](./instruction_prompting/instruction_tuning.md) |
+| | run_zh_extract_instruction.sh | **Goal**: **Cross-task training** via extractive Instruction-tuning based on Global Pointer model. <br> **Path**: applications/instruction_prompting/chinese_instruction | BERT, RoBERTa, DeBERTa | [click](./instruction_prompting/extractive_instruction_tuning.md) |
| | run_causal_incontext_cls.sh | **Goal**: **In-context learning** for user-defined classification tasks. <br> **Path**: applications/instruction_prompting/incontext_learning | GPT-2 | [click](./instruction_prompting/incontext_learning_for_cls.md) |
| **Information Extraction** | run_extractive_unified_ie.sh | **Goal**: **HugIE**: training a unified chinese information extraction via extractive instruction-tuning. <br> **Path**: applications/information_extraction/HugIE | BERT, RoBERTa, DeBERTa | [click](./information_extraction/HugIE.md) |
| | api_test.py | **Goal**: HugIE: API test. <br> **Path**: applications/information_extraction/HugIE | - | [click](./information_extraction/HugIE.md) |
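
For context, the entries in both tables are launch scripts located under the directories given in the **Path** column of the HugNLP repository. Below is a minimal usage sketch, assuming the scripts are invoked from the repository root with their default arguments; the exact working directory and any required flags are not specified in the tables and should be checked against the repository itself.

```bash
# Minimal sketch: paths are taken from the "Path" column above; running from
# the repository root and the bash/python entry points are assumptions.
cd HugNLP  # repository root (directory name assumed)

# Generative instruction-tuning application (run_causal_instruction.sh)
bash applications/instruction_prompting/instruction_tuning/run_causal_instruction.sh

# HugIE API smoke test (api_test.py)
python applications/information_extraction/HugIE/api_test.py
```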
