fix Embedding config property to include new input_length property, add input_length to all examples with Embedding layer
transcranial committed Oct 9, 2015
1 parent 9dbf04b commit 4c1a6fc
Showing 5 changed files with 6 additions and 6 deletions.
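For context: before this commit, `Embedding.get_config()` read a misspelled `max_lenght` attribute, so serializing a model containing an Embedding layer would break. A minimal sketch of the fixed behavior, assuming the 2015-era Keras API shown in this diff (layer sizes are illustrative, not from the commit):

```python
# Sketch assuming the 2015-era Keras API touched by this commit.
# Sizes below are illustrative, not taken from the diff.
from keras.models import Sequential
from keras.layers.embeddings import Embedding

model = Sequential()
model.add(Embedding(1000, 64, input_length=20))

# After this fix, the serialized layer config carries the correctly
# spelled input_length property instead of the broken max_lenght key.
config = model.layers[0].get_config()
print(config["input_length"])  # -> 20
```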
2 changes: 1 addition & 1 deletion README.md
@@ -117,7 +117,7 @@ from keras.layers.embeddings import Embedding
from keras.layers.recurrent import LSTM

model = Sequential()
- model.add(Embedding(max_features, 256))
+ model.add(Embedding(max_features, 256, input_length=maxlen))
model.add(LSTM(output_dim=128, activation='sigmoid', inner_activation='hard_sigmoid'))
model.add(Dropout(0.5))
model.add(Dense(1))
2 changes: 1 addition & 1 deletion docs/sources/examples.md
@@ -92,7 +92,7 @@ from keras.layers.embeddings import Embedding
from keras.layers.recurrent import LSTM

model = Sequential()
- model.add(Embedding(max_features, 256))
+ model.add(Embedding(max_features, 256, input_length=maxlen))
model.add(LSTM(output_dim=128, activation='sigmoid', inner_activation='hard_sigmoid'))
model.add(Dropout(0.5))
model.add(Dense(1))
4 changes: 2 additions & 2 deletions docs/sources/layers/embeddings.md
@@ -2,7 +2,7 @@
## Embedding

```python
- keras.layers.embeddings.Embedding(input_dim, output_dim, init='uniform', weights=None, W_regularizer=None, W_constraint=None, mask_zero=False, max_length=None)
+ keras.layers.embeddings.Embedding(input_dim, output_dim, init='uniform', input_length=None, weights=None, W_regularizer=None, W_constraint=None, mask_zero=False)
```

Turn positive integers (indexes) into dense vectors of fixed size,
@@ -27,7 +27,7 @@ e.g. `[[4], [20]] -> [[0.25, 0.1], [0.6, -0.2]]`
## WordContextProduct

```python
- keras.layers.embeddings.WordContextProduct(input_dim, proj_dim=128,
+ keras.layers.embeddings.WordContextProduct(input_dim, proj_dim=128,
init='uniform', activation='sigmoid', weights=None)
```

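The new `input_length` argument in the signature above is what lets downstream layers infer a fixed output shape. A minimal sketch, assuming the 2015-era Keras API shown in this diff (`vocab_size` and `maxlen` are hypothetical values, not from the commit):

```python
# Sketch assuming the Keras API of this commit; names are illustrative.
from keras.models import Sequential
from keras.layers.core import Flatten, Dense
from keras.layers.embeddings import Embedding

vocab_size = 1000  # hypothetical vocabulary size
maxlen = 20        # hypothetical padded sequence length

model = Sequential()
# Output shape: (nb_samples, maxlen, 64). Without input_length, the
# sequence dimension is unknown and Flatten cannot size its output.
model.add(Embedding(vocab_size, 64, input_length=maxlen))
model.add(Flatten())
model.add(Dense(1))
```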
2 changes: 1 addition & 1 deletion examples/imdb_lstm.py
@@ -48,7 +48,7 @@

print('Build model...')
model = Sequential()
- model.add(Embedding(max_features, 128))
+ model.add(Embedding(max_features, 128, input_length=maxlen))
model.add(LSTM(128)) # try using a GRU instead, for fun
model.add(Dropout(0.5))
model.add(Dense(1))
2 changes: 1 addition & 1 deletion keras/layers/embeddings.py
@@ -75,7 +75,7 @@ def get_config(self):
"input_dim": self.input_dim,
"output_dim": self.output_dim,
"init": self.init.__name__,
"max_lenght": self.max_lenght,
"input_length": self.input_length,
"mask_zero": self.mask_zero,
"activity_regularizer": self.activity_regularizer.get_config() if self.activity_regularizer else None,
"W_regularizer": self.W_regularizer.get_config() if self.W_regularizer else None,
Expand Down
