Fix embedding initialization issue asyml#47 (asyml#48)
huzecong authored Jun 18, 2019
2 parents f7c79ad + 371b5ce commit 05a5f61
Showing 1 changed file with 2 additions and 2 deletions.
texar/modules/embedders/embedder_utils.py
```diff
@@ -148,9 +148,9 @@ def get_embedding(num_embeds: Optional[int] = None,
         embedding = torch.empty(size=[num_embeds] + dim)
         # initializer should be set by layers.get_initializer
         if initializer:
-            embedding = initializer(embedding)
+            initializer(embedding)
         else:
-            embedding = torch.nn.init.xavier_uniform_(embedding)
+            torch.nn.init.xavier_uniform_(embedding)
     else:
         if torch.is_tensor(init_value):
             embedding = init_value  # Do not copy the tensor.
```
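Why dropping the assignments plausibly fixes the issue (a reading of the change, not stated in the commit message itself): PyTorch's `torch.nn.init` functions such as `xavier_uniform_` modify their argument in place and also return it, so the assignment on that branch was merely redundant. An initializer obtained through `layers.get_initializer`, however, may mutate the tensor and return `None`, in which case `embedding = initializer(embedding)` silently rebinds `embedding` to `None`. A minimal sketch of that failure mode, using a hypothetical in-place initializer `my_initializer`:

```python
import torch

def my_initializer(tensor: torch.Tensor) -> None:
    # Hypothetical custom initializer: fills the tensor in place and,
    # like many such helpers, returns None rather than the tensor.
    with torch.no_grad():
        tensor.uniform_(-0.1, 0.1)

embedding = torch.empty(5, 3)

# Old pattern: the assignment rebinds `embedding` to None whenever the
# initializer returns nothing.
result = my_initializer(embedding)
assert result is None

# New pattern: call for the side effect only; `embedding` remains the
# same, now-initialized tensor.
my_initializer(embedding)
assert isinstance(embedding, torch.Tensor)
```

After the change, both branches rely uniformly on in-place initialization, so the result no longer depends on what the initializer happens to return.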
