Added best LM model on wikitext 103
PiotrCzapla committed Sep 17, 2018
1 parent 26e720c commit 7a39ec3
Showing 2 changed files with 54 additions and 0 deletions.
46 changes: 46 additions & 0 deletions _data/language_modeling.yaml
@@ -63,6 +63,52 @@ Word_Level:
Validation perplexity: 68.6
Test perplexity: 65.8

WikiText_103:
- &Rae2018
model: LSTM + Hebbian + Cache + MbPA
authors: Rae et al.
year: 2018
Validation perplexity: 29.0
Test perplexity: 29.2
paper: Fast Parametric Learning with Activation Memorization
url: http://arxiv.org/abs/1803.10049
code: []
- <<: *Rae2018
model: LSTM + Hebbian
Validation perplexity: 34.1
Test perplexity: 34.3
- <<: *Rae2018
model: LSTM
Validation perplexity: 36.0
Test perplexity: 36.4
- &Dauphin2016
model: Gated CNN
authors: Dauphin et al.
year: 2016
Validation perplexity: -
Test perplexity: 37.2
paper: Language modeling with gated convolutional networks
url: https://arxiv.org/abs/1612.08083
code: []
- &Bai2018
model: Temporal CNN
authors: Bai et al.
year: 2018
Validation perplexity: -
Test perplexity: 45.2
paper: Convolutional sequence modeling revisited
url: https://openreview.net/forum?id=rk8wKk-R-
code: []
- &Graves2014
model: LSTM
authors: Graves et al.
year: 2014
Validation perplexity: -
Test perplexity: 48.7
paper: Neural Turing machines
url: https://arxiv.org/abs/1410.5401
code: []

Char_Level:
Hutter_Prize:
- &Al-Rfou2018_T64
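The new `WikiText_103` entries above use YAML anchors and merge keys: `&Rae2018` names the first Rae et al. entry, and `<<: *Rae2018` copies its fields into the following entries so that only the fields that differ (the model name and the perplexities) have to be restated. Merge keys are a YAML 1.1 feature handled by most common loaders. A minimal sketch of how such a merge expands, using trimmed values from the entries above:

```yaml
WikiText_103:
  # The first entry defines the anchor; every field is spelled out once.
  - &Rae2018
    model: LSTM + Hebbian + Cache + MbPA
    authors: Rae et al.
    year: 2018
    Test perplexity: 29.2
  # `<<: *Rae2018` merges every key from the anchored mapping,
  # then the keys written alongside it override the merged values.
  - <<: *Rae2018
    model: LSTM + Hebbian
    Test perplexity: 34.3
```

After loading, the second entry is equivalent to writing out `authors: Rae et al.` and `year: 2018` again, with only `model` and `Test perplexity` changed.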
8 changes: 8 additions & 0 deletions language_modeling.md
@@ -32,6 +32,14 @@ consists of around 2 million words extracted from Wikipedia articles.
results=site.data.language_modeling.Word_Level.WikiText_2
scores='Validation perplexity,Test perplexity' %}

### WikiText-103

The [WikiText-103](https://arxiv.org/abs/1609.07843) corpus contains 267,735 unique words, and each word occurs at least three times in the training set.

{% include table.html
results=site.data.language_modeling.Word_Level.WikiText_103
scores='Validation perplexity,Test perplexity' %}
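
Jekyll exposes `_data/language_modeling.yaml` under `site.data.language_modeling`, so the `results` argument above resolves to the `WikiText_103` list added in this commit; `scores` presumably names the YAML keys that the (not shown) `table.html` include renders as columns. A small sketch of how the Liquid data path maps onto the YAML structure, with the column mapping marked as an assumption:

```yaml
# site.data.language_modeling           <- _data/language_modeling.yaml
Word_Level:                             # .Word_Level
  WikiText_103:                         # .WikiText_103 -> passed as `results`
    - model: LSTM + Hebbian + Cache + MbPA
      Validation perplexity: 29.0       # assumed: rendered as a column named in `scores`
      Test perplexity: 29.2             # assumed: rendered as a column named in `scores`
```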

## Character Level Models

### Hutter Prize
