
Commit

fix alpha
haoransh committed Mar 2, 2019
1 parent 908825d commit f79ab7c
Showing 2 changed files with 2 additions and 2 deletions.
examples/transformer/config_model.py (2 changes: 1 addition & 1 deletion)
@@ -5,7 +5,7 @@

 random_seed = 1234
 beam_width = 5
-alpha = 0.6
+length_penalty = 0.6
 hidden_dim = 512

 emb = {
examples/transformer/transformer_main.py (2 changes: 1 addition & 1 deletion)
@@ -128,7 +128,7 @@ def main():
 memory=encoder_output,
 memory_sequence_length=encoder_input_length,
 beam_width=beam_width,
-alpha=config_model.alpha,
+length_penalty=config_model.length_penalty,
 start_tokens=start_tokens,
 end_token=eos_token_id,
 max_decoding_length=config_data.max_decoding_length,
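The renamed parameter controls beam-search length normalization during decoding. As a point of reference (not part of this commit or repository), below is a minimal sketch of the GNMT-style penalty that a length_penalty exponent conventionally applies, assuming the decoder follows that convention; the helper name and the example score are hypothetical:

# Hypothetical illustration of GNMT-style length normalization; not code from
# the Texar transformer example.
def gnmt_length_normalizer(length, length_penalty=0.6):
    # Normalizer ((5 + length) / 6) ** length_penalty; 0.0 disables normalization.
    return ((5.0 + length) / 6.0) ** length_penalty

total_log_prob = -12.3  # assumed total log-probability of a 10-token hypothesis
score = total_log_prob / gnmt_length_normalizer(10, length_penalty=0.6)
print(score)

Under this convention, larger length_penalty values boost longer hypotheses in the final ranking, which is why the value (0.6 here) lives in config_model.py rather than being hard-coded in transformer_main.py.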
