forked from facebookresearch/ParlAI

Commit c9fc234 (1 parent: 9ba405c): fix examples and add readme for them (facebookresearch#39)

17 changed files with 91 additions and 391 deletions.
# ParlAI examples

This directory contains a few particular examples of basic loops.

- base_train.py: _a very simple example showing the outline of a training/validation loop using the default Agent parent class_
- display_data.py: _uses agent.repeat_label to display data from a particular task provided on the command line_
- display_model.py: _shows the predictions of a provided model on a particular task provided on the command line_
- eval_model.py: _uses the named agent to compute evaluation metrics for a particular task provided on the command line_
- build_dict.py: _builds a dictionary from a particular task provided on the command line using core.dict.DictionaryAgent_
- memnn_luatorch_cpu: _shows a few examples of training an end-to-end memory network on a few datasets_
- drqa: _shows how to train the attentive LSTM DrQA model of [Chen et al.](https://arxiv.org/abs/1704.00051) on SQuAD_
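
The observe/act outline that base_train.py follows can be sketched in plain Python. This is a minimal, self-contained sketch: the `Teacher` and `Agent` classes below are simplified stand-ins that mirror the ParlAI pattern, not the real classes from `parlai.core`.

```python
# Sketch of a base_train.py-style loop: the teacher serves examples,
# the default-style agent observes them and just says hello back.

class Teacher:
    """Serves (text, labels) examples in order, cycling over its data."""
    def __init__(self, data):
        self.data = data
        self.index = 0

    def act(self):
        example = {'text': self.data[self.index % len(self.data)],
                   'labels': ['hello teacher']}
        self.index += 1
        return example

class Agent:
    """Default-style agent: stores what it receives, then says hello."""
    def __init__(self):
        self.observation = None

    def observe(self, observation):
        self.observation = observation

    def act(self):
        return {'text': 'hello teacher'}

def train_loop(teacher, agent, num_examples):
    """One pass of the train/valid outline: teacher acts, agent observes and replies."""
    replies = []
    for _ in range(num_examples):
        query = teacher.act()
        agent.observe(query)
        replies.append(agent.act())
    return replies

replies = train_loop(Teacher(['1 Where is the ball?']), Agent(), 3)
```

The real loop additionally tracks metrics and switches the teacher between train and valid data, but the observe/act handshake is the core of it.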

## Running These Examples

Most of them can be run simply by typing `python {example}.py -t {task_name}`. Here are some examples:

Display 10 random examples from task 1 of the "1k training examples" bAbI task:
```bash
python display_data.py -t babi:task1k:1
```

Run a train/valid loop with the basic agent (which prints what it receives and then says hello to the teacher, rather than learning anything) on the bAbI task:
```bash
python base_train.py -t babi:task1k:1
```

Display 100 random examples from multi-tasking on the bAbI task and the SQuAD dataset at the same time:
```bash
python display_data.py -t babi:task1k:1,squad -n 100
```
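
The comma-separated task string above selects both datasets at once; a multi-task teacher can then interleave them, for example round-robin. The parsing and scheduling below are an illustrative, self-contained sketch of that idea, not ParlAI's actual implementation.

```python
from itertools import cycle

def parse_task_spec(task_spec):
    """Split a ParlAI-style task string, e.g. 'babi:task1k:1,squad',
    into one (dataset, *subargs) tuple per task."""
    return [tuple(part.split(':')) for part in task_spec.split(',')]

def round_robin(task_spec, num_examples):
    """Yield dataset names alternately, as a multi-task teacher might."""
    tasks = cycle(parse_task_spec(task_spec))
    return [next(tasks)[0] for _ in range(num_examples)]

tasks = parse_task_spec('babi:task1k:1,squad')
# tasks == [('babi', 'task1k', '1'), ('squad',)]
schedule = round_robin('babi:task1k:1,squad', 4)
# schedule == ['babi', 'squad', 'babi', 'squad']
```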

Evaluate an IR baseline model on the validation set of the Movies Subreddit dataset:
```bash
python eval_model.py -m ir_baseline -t "#moviedd-reddit" -dt valid
```
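
An IR baseline of this kind ranks candidate replies by how many words they share with the query. The toy scorer below is an illustrative sketch of that idea only; ParlAI's actual `ir_baseline` agent uses weighted word overlap rather than this raw fraction.

```python
def overlap_score(query, candidate):
    """Score a candidate reply by the fraction of query words it shares."""
    q, c = set(query.lower().split()), set(candidate.lower().split())
    return len(q & c) / max(len(q), 1)

def rank_candidates(query, candidates):
    """Return candidates sorted best-first by overlap with the query."""
    return sorted(candidates, key=lambda c: overlap_score(query, c), reverse=True)

best = rank_candidates('what movie won best picture',
                       ['the weather is nice',
                        'moonlight won best picture'])[0]
# best == 'moonlight won best picture'
```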

Display the predictions of that same IR baseline model:
```bash
python display_model.py -m ir_baseline -t "#moviedd-reddit" -dt valid
```

Build a dictionary on task 1 of the bAbI "1k training examples" task and save it to /tmp/dict.tsv:
```bash
python build_dict.py -t babi:task1k:1 --dict-savepath /tmp/dict.tsv
```
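
At its core, dictionary building amounts to counting token frequencies over the task's text and writing word/count pairs to a TSV file. A self-contained sketch of that step follows; the function names and the simple whitespace tokenizer are illustrative, while the real logic lives in core.dict.DictionaryAgent.

```python
from collections import Counter

def build_dict(texts):
    """Count token frequencies across all example texts."""
    counts = Counter()
    for text in texts:
        counts.update(text.lower().split())
    return counts

def save_dict(counts, path):
    """Write one 'word<TAB>count' line per token, most frequent first."""
    with open(path, 'w') as f:
        for word, count in counts.most_common():
            f.write(f'{word}\t{count}\n')

counts = build_dict(['Where is the ball ?', 'The ball is in the garden .'])
save_dict(counts, '/tmp/dict_example.tsv')
# counts['the'] == 3, counts['ball'] == 2
```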

Train a simple CPU-based memory network on task 1 of the "10k training examples" bAbI task with 8 threads (Python processes) using Hogwild (requires zmq and Lua Torch):
```bash
python memnn_luatorch_cpu/full_task_train.py -t babi:task10k:1 -n 8
```
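
Hogwild training means several workers update one shared parameter vector without any locking; occasional stale reads and lost updates are tolerated because each update is small. The snippet below is a much-simplified, thread-based sketch of that pattern only; ParlAI's version runs separate Python processes coordinated over zmq.

```python
import threading
import random

def hogwild_train(params, num_workers, steps_per_worker):
    """Each worker repeatedly nudges a random shared parameter toward
    zero with no locks; races are accepted, as in Hogwild."""
    def worker():
        for _ in range(steps_per_worker):
            i = random.randrange(len(params))
            params[i] -= 0.1 * params[i]   # unsynchronized in-place update

    threads = [threading.Thread(target=worker) for _ in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

params = [1.0, -1.0, 2.0]
hogwild_train(params, num_workers=8, steps_per_worker=200)
# every parameter has shrunk toward zero despite the unsynchronized updates
```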

Train an attentive LSTM model on the SQuAD dataset with a batch size of 32 examples (requires PyTorch and regex):
```bash
python drqa/train.py -t squad -b 32
```
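
The `-b 32` flag groups examples so the model processes 32 at a time. A minimal batching helper shows the grouping step in isolation (illustrative only; the actual training script also handles padding and related bookkeeping):

```python
def batchify(examples, batch_size):
    """Split a list of examples into consecutive batches of at most batch_size."""
    return [examples[i:i + batch_size]
            for i in range(0, len(examples), batch_size)]

batches = batchify(list(range(70)), 32)
# batch sizes: 32, 32, 6 (the last batch holds the remainder)
```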