Commit
Avoid crash when linear activation does not have alpha and beta defined (dmlc#306)
thefiddler authored and tqchen committed Jan 15, 2018
1 parent a0a3a1d commit 00a88d4
Showing 1 changed file with 3 additions and 1 deletion.
4 changes: 3 additions & 1 deletion python/nnvm/frontend/keras.py
@@ -27,8 +27,10 @@ def _convert_activation(insym, keras_layer, _):
     if act_type == 'linear':
         if isinstance(keras_layer, str):
             return insym
+        alpha = keras_layer.alpha if hasattr(keras_layer, "alpha") else 1
+        beta = keras_layer.beta if hasattr(keras_layer, "beta") else 0
         return _sym.__add_scalar__(_sym.__mul_scalar__(insym, \
-            scalar=keras_layer.alpha), scalar=keras_layer.beta)
+            scalar=alpha), scalar=beta)
     elif act_type == 'softmax':
         return _sym.softmax(insym)
     elif act_type == 'sigmoid':
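The crash happened because a Keras linear activation layer need not define alpha and beta attributes, so reading keras_layer.alpha unconditionally raised an AttributeError. The patch falls back to alpha = 1 and beta = 0, which reduces the linear activation alpha * x + beta to the identity. A minimal standalone sketch of the same guard follows; the layer class and function here are hypothetical stand-ins for illustration, not nnvm's real symbols:

    # Sketch of the defaulting logic from the patch, using a stand-in
    # layer object rather than nnvm's _sym scalar ops.
    class FakeLinearLayer:
        pass  # deliberately defines no alpha or beta attribute

    def linear_activation(x, layer):
        # getattr with a default is equivalent to the hasattr ternary in
        # the patch: missing attributes fall back to the identity 1*x + 0.
        alpha = getattr(layer, "alpha", 1)
        beta = getattr(layer, "beta", 0)
        return alpha * x + beta

    assert linear_activation(3.0, FakeLinearLayer()) == 3.0  # no crash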
