
Commit

changing code-block in docs
saransh-mehta committed Jun 9, 2020
1 parent 1d79a3b commit 991b649
Showing 4 changed files with 60 additions and 42 deletions.
docs/source/data_transformations.rst (4 changes: 3 additions & 1 deletion)
@@ -124,7 +124,9 @@ Running data transformations
 Once you have made the :ref:`transform file<Transform File>` with all the transform operations,
 you can run data transformations with the following terminal command.

->>> python data_transformations.py \
+.. code-block:: console
+
+    $ python data_transformations.py \
             --transform_file 'transform_file.yml'
docs/source/index.rst (14 changes: 10 additions & 4 deletions)
@@ -27,10 +27,12 @@ Installation
 To use multi-task-NLP, you can clone the repository into the desired location on your system
 with the following terminal command.

->>> cd /desired/location/
->>> git clone https://github.com/hellohaptik/multi-task-NLP.git
->>> cd multi-task-NLP
->>> pip install -r requirements.txt
+.. code-block:: console
+
+    $ cd /desired/location/
+    $ git clone https://github.com/hellohaptik/multi-task-NLP.git
+    $ cd multi-task-NLP
+    $ pip install -r requirements.txt

NOTE:- The library is built and tested using ``Python 3.7.3``. It is recommended to install the requirements in a virtual environment.
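Following that note, here is a minimal sketch of installing inside a virtual environment. The environment path `~/envs/multi-task-nlp` is a hypothetical choice, not something the repository prescribes; any location works.

```shell
# Sketch: create and activate a virtual environment before installing.
# The path ~/envs/multi-task-nlp is an arbitrary example, not from the repo.
python3 -m venv ~/envs/multi-task-nlp
source ~/envs/multi-task-nlp/bin/activate

# Run from inside the cloned multi-task-NLP directory.
pip install -r requirements.txt
```

Deactivate with `deactivate` when done; this keeps the pinned requirements from touching your system Python.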

@@ -42,6 +44,10 @@ and with **no requirement to code!!**
 .. toctree::
    quickstart

+Examples
+--------
+
+
 Step by Step Guide
 ------------------
 A complete guide explaining all the components of multi-task-NLP in sequential order.
docs/source/quickstart.rst (40 changes: 22 additions & 18 deletions)
@@ -49,10 +49,12 @@ Step 2 - Run data preparation

 After defining the task file in :ref:`Step 1<Step 1 - Define your task file>`, run the following command to prepare the data.

->>> python data_preparation.py \
-            --task_file 'sample_task_file.yml' \
-            --data_dir 'data' \
-            --max_seq_len 50
+.. code-block:: console
+
+    $ python data_preparation.py \
+                --task_file 'sample_task_file.yml' \
+                --data_dir 'data' \
+                --max_seq_len 50

 To learn about the ``data_preparation.py`` script and its arguments, refer :ref:`here<Running data preparation>`.

@@ -61,20 +63,22 @@ Step 3 - Run train

 Finally, you can start training with the following command.

->>> python train.py \
-            --data_dir 'data/bert-base-uncased_prepared_data' \
-            --task_file 'sample_task_file.yml' \
-            --out_dir 'sample_out' \
-            --epochs 5 \
-            --train_batch_size 4 \
-            --eval_batch_size 8 \
-            --grad_accumulation_steps 2 \
-            --log_per_updates 25 \
-            --save_per_updates 1000 \
-            --eval_while_train True \
-            --test_while_train True \
-            --max_seq_len 50 \
-            --silent True
+.. code-block:: console
+
+    $ python train.py \
+                --data_dir 'data/bert-base-uncased_prepared_data' \
+                --task_file 'sample_task_file.yml' \
+                --out_dir 'sample_out' \
+                --epochs 5 \
+                --train_batch_size 4 \
+                --eval_batch_size 8 \
+                --grad_accumulation_steps 2 \
+                --log_per_updates 25 \
+                --save_per_updates 1000 \
+                --eval_while_train True \
+                --test_while_train True \
+                --max_seq_len 50 \
+                --silent True

 To learn about the ``train.py`` script and its arguments, refer :ref:`here<Running train>`.
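One relationship worth keeping in mind when tuning the command above: under the usual gradient-accumulation scheme, the effective batch size is ``train_batch_size`` multiplied by ``grad_accumulation_steps``. This is the standard interpretation of these flags, not a claim about multi-task-NLP internals. A quick sketch with the values from the command:

```shell
# Effective batch size under gradient accumulation
# (general rule; values taken from the train.py command above).
train_batch_size=4
grad_accumulation_steps=2
echo $((train_batch_size * grad_accumulation_steps))   # prints 8
```

So the example command updates weights as if it trained with batches of 8, while only holding 4 samples in memory per forward pass.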

docs/source/training.rst (44 changes: 25 additions & 19 deletions)
@@ -25,10 +25,12 @@ The script takes the following arguments,

 You can use the following terminal command with your own argument values to run.

->>> python data_preparation.py \
-            --task_file 'sample_task_file.yml' \
-            --data_dir 'data' \
-            --max_seq_len 50
+.. code-block:: console
+
+    $ python data_preparation.py \
+                --task_file 'sample_task_file.yml' \
+                --data_dir 'data' \
+                --max_seq_len 50

 Running train
 -------------
@@ -54,27 +56,31 @@ available

 You can use the following terminal command with your own argument values to run.

->>> python train.py \
-            --data_dir 'data/bert-base-uncased_prepared_data' \
-            --task_file 'sample_task_file.yml' \
-            --out_dir 'sample_out' \
-            --epochs 5 \
-            --train_batch_size 4 \
-            --eval_batch_size 8 \
-            --grad_accumulation_steps 2 \
-            --log_per_updates 25 \
-            --save_per_updates 1000 \
-            --eval_while_train True \
-            --test_while_train True \
-            --max_seq_len 50 \
-            --silent True
+.. code-block:: console
+
+    $ python train.py \
+                --data_dir 'data/bert-base-uncased_prepared_data' \
+                --task_file 'sample_task_file.yml' \
+                --out_dir 'sample_out' \
+                --epochs 5 \
+                --train_batch_size 4 \
+                --eval_batch_size 8 \
+                --grad_accumulation_steps 2 \
+                --log_per_updates 25 \
+                --save_per_updates 1000 \
+                --eval_while_train True \
+                --test_while_train True \
+                --max_seq_len 50 \
+                --silent True

 Logs and tensorboard
 --------------------

 - Logs for the training should be saved in a time-stamp named directory (e.g. 05_05-17_30).
 - The tensorboard logs are also present in the same directory, and tensorboard can be started with the following command.

->>> tensorboard --logdir 05_05-17_30/tb_logs
+.. code-block:: console
+
+    $ tensorboard --logdir 05_05-17_30/tb_logs
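Since each run gets a fresh time-stamped directory, a small convenience sketch is to point tensorboard at the newest one. This assumes the run directories sit in your current working directory; adjust the glob to your layout.

```shell
# Launch tensorboard on the most recent time-stamped run directory.
# Assumes run directories (e.g. 05_05-17_30/) live in the current directory.
latest=$(ls -td -- */ | head -n 1)   # newest directory first, trailing slash kept
tensorboard --logdir "${latest}tb_logs"
```

`ls -td` sorts directories by modification time, newest first, so `latest` ends with a `/` and `${latest}tb_logs` resolves to the newest run's log directory.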
