
Commit

Merge pull request FlagAI-Open#390 from ftgreat/master
try to fix inf
BAAI-OpenPlatform authored Jun 14, 2023
2 parents 76a3a16 + c214d56 commit 36c083f
Showing 1 changed file with 2 additions and 0 deletions.
2 changes: 2 additions & 0 deletions flagai/model/layers/attentions.py
@@ -203,6 +203,8 @@ def forward(
     keys = keys.transpose(1, 2)
     values = values.transpose(1, 2)
     scores = torch.matmul(xq, keys.transpose(2, 3)) / math.sqrt(self.head_dim)
+    ## for corner cases, especially in the last layer and dtype of fp16
+    scores = torch.clamp(scores, min=-1024., max=1024.)
     if mask is not None:
         scores = scores + mask  # (bs, n_local_heads, slen, cache_len + slen)
     scores = F.softmax(scores.float(), dim=-1).type_as(xq)
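The motivation for the clamp: fp16 overflows above roughly 65504, so large attention logits can become `inf`, and a softmax over `inf` values yields `NaN`. Bounding the scores before the softmax keeps the output finite. A minimal illustrative sketch (not from the commit; the tensor values are made up):

```python
import torch

# fp16 overflows above ~65504: squaring a modest value already produces inf.
scores = torch.tensor([300.0, 1.0], dtype=torch.float16)
overflowed = scores * scores  # 90000 exceeds the fp16 range -> inf
assert torch.isinf(overflowed).any()

# Clamping as in the patch bounds the logits before the softmax,
# so the probabilities come out finite and sum to 1.
clamped = torch.clamp(overflowed.float(), min=-1024., max=1024.)
probs = torch.softmax(clamped, dim=-1)
assert torch.isfinite(probs).all()
```

The bound of 1024 is generous: since softmax is shift-invariant, clamping only distorts the result when the true logit spread exceeds the clamp range, which is rare in practice.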
