
Commit 058bc89

untie embedding, after learning T5 switched to untied classifier weights in the latest version, aligning with my experience that it is better

lucidrains committed Jan 4, 2021
1 parent 399efcb commit 058bc89
Showing 2 changed files with 2 additions and 2 deletions.
setup.py (2 changes: 1 addition & 1 deletion)

```diff
@@ -3,7 +3,7 @@
 setup(
   name = 'x-transformers',
   packages = find_packages(exclude=['examples']),
-  version = '0.6.7',
+  version = '0.7.0',
   license='MIT',
   description = 'X-Transformers - Pytorch',
   author = 'Phil Wang',
```
x_transformers/x_transformers.py (2 changes: 1 addition & 1 deletion)

```diff
@@ -585,7 +585,7 @@ def __init__(
         max_mem_len = 0.,
         emb_dropout = 0.,
         num_memory_tokens = None,
-        tie_embedding = True,
+        tie_embedding = False,
         use_pos_emb = True
     ):
         super().__init__()
```
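
For context, a `tie_embedding` flag like this typically controls whether the output logits projection shares weights with the input token embedding (weight tying). The sketch below is a minimal illustration of the two configurations under that assumption; the `TinyLM` class and its trivial body are hypothetical stand-ins, not x-transformers' actual internals.

```python
import torch
from torch import nn

class TinyLM(nn.Module):
    # Hypothetical minimal model showing the effect of tie_embedding;
    # a real model would run attention layers between embed and project.
    def __init__(self, num_tokens, dim, tie_embedding = False):
        super().__init__()
        self.token_emb = nn.Embedding(num_tokens, dim)
        self.tie_embedding = tie_embedding
        if not tie_embedding:
            # untied (the new default): the classifier gets its own
            # separately learned weight matrix
            self.to_logits = nn.Linear(dim, num_tokens, bias = False)

    def forward(self, x):
        h = self.token_emb(x)  # (batch, seq, dim)
        if self.tie_embedding:
            # tied: reuse the embedding matrix as the output projection,
            # adding no extra parameters
            return h @ self.token_emb.weight.t()
        return self.to_logits(h)

logits = TinyLM(num_tokens = 256, dim = 64)(torch.randint(0, 256, (1, 8)))
assert logits.shape == (1, 8, 256)
```

Tying saves a `num_tokens x dim` weight matrix but forces the input and output token representations to coincide; flipping the default to untied trades those parameters for a classifier that can learn independently, in line with the commit message's observation about the newer T5.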
