forked from HadoopIt/rnn-nlu

A TensorFlow implementation of Recurrent Neural Networks for Sequence Classification and Sequence Labeling


oceanos74/rnn-nlu

 
 


Attention-based RNN model for Spoken Language Understanding (Intent Detection & Slot Filling)

TensorFlow implementation of attention-based LSTM models for sequence classification and sequence labeling.
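To make the two tasks concrete, here is a hypothetical ATIS-style example (illustrative only, not taken from the repository's sample data): slot filling assigns a BIO label to every token, while intent detection classifies the whole utterance; the joint task predicts both from one shared encoder.

```python
# Hypothetical ATIS-style utterance, for illustration only.
tokens = ["show", "flights", "from", "boston", "to", "denver"]

# Slot filling: one BIO label per token (sequence labeling).
slots = ["O", "O", "O", "B-fromloc.city_name", "O", "B-toloc.city_name"]

# Intent detection: one label for the whole utterance (sequence classification).
intent = "atis_flight"

# In the joint task, both outputs must align with the same input sequence.
assert len(tokens) == len(slots)
```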

Setup

Usage:

data_dir=data/ATIS_samples
model_dir=model_tmp
max_sequence_length=50  # max length for train/valid/test sequence
task=joint  # available options: intent; tagging; joint
bidirectional_rnn=True  # available options: True; False

python run_multi-task_rnn.py --data_dir $data_dir \
      --train_dir $model_dir \
      --max_sequence_length $max_sequence_length \
      --task $task \
      --bidirectional_rnn $bidirectional_rnn
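At the core of the model, attention weights the encoder's hidden states when producing each prediction. As a rough sketch only (the repository's actual model is a TensorFlow LSTM with a learned feed-forward scoring function, not the dot-product scoring shown here), a minimal NumPy version of the attention step:

```python
import numpy as np

def attention_context(hidden_states, query):
    """Dot-product attention over encoder hidden states.

    Illustrative sketch only: scores each time step against a query
    vector, normalizes with softmax, and returns the weighted sum.
    """
    scores = hidden_states @ query            # one score per time step, shape (T,)
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ hidden_states            # context vector, shape (d,)

# Toy dimensions: T tokens, hidden size d (arbitrary for the sketch).
T, d = 5, 8
rng = np.random.default_rng(0)
H = rng.normal(size=(T, d))   # encoder hidden states, one row per token
q = rng.normal(size=d)        # query, e.g. the current decoder state
c = attention_context(H, q)
assert c.shape == (d,)
```

The context vector `c` is what the joint model feeds into its intent and slot predictions alongside the RNN state.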

Reference

  • Bing Liu, Ian Lane, "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling", Interspeech, 2016 (PDF)
@inproceedings{Liu+2016,
  author={Bing Liu and Ian Lane},
  title={Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling},
  year={2016},
  booktitle={Interspeech 2016},
  doi={10.21437/Interspeech.2016-1352},
  url={http://dx.doi.org/10.21437/Interspeech.2016-1352},
  pages={685--689}
}

Contact

Feel free to email [email protected] with any questions or bug reports regarding the code.
