[train_unconditional] fix applying clip_grad_norm_ (huggingface#721)
fix clip_grad_norm_
patil-suraj authored Oct 4, 2022
1 parent 6b22192 commit 14b9754
Showing 1 changed file with 2 additions and 1 deletion.
@@ -143,7 +143,8 @@ def transforms(examples):
                 loss = F.mse_loss(noise_pred, noise)
                 accelerator.backward(loss)

-                accelerator.clip_grad_norm_(model.parameters(), 1.0)
+                if accelerator.sync_gradients:
+                    accelerator.clip_grad_norm_(model.parameters(), 1.0)
                 optimizer.step()
                 lr_scheduler.step()
                 if args.use_ema:
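The reason for gating the clipping call: under gradient accumulation, Accelerate only has the full (summed) gradient on the micro-batch that completes an accumulation cycle, which is when `accelerator.sync_gradients` is `True`. Clipping on intermediate micro-batches would clip a partial gradient with the wrong norm. A minimal sketch of that gating logic, using a mock in place of the real `Accelerator` (the mock class and its step counter are illustrative assumptions, not the library's actual implementation):

```python
# Mock standing in for accelerate.Accelerator, to illustrate when
# sync_gradients is True under gradient accumulation. Hypothetical
# helper, not the real library internals.
class MockAccelerator:
    def __init__(self, accumulation_steps):
        self.accumulation_steps = accumulation_steps
        self._step = 0
        self.clip_calls = 0

    @property
    def sync_gradients(self):
        # True only on the micro-batch that completes an accumulation cycle
        return (self._step + 1) % self.accumulation_steps == 0

    def backward(self):
        pass  # gradients would accumulate here

    def clip_grad_norm_(self):
        self.clip_calls += 1

    def end_step(self):
        self._step += 1


def train(num_micro_batches, accumulation_steps):
    acc = MockAccelerator(accumulation_steps)
    for _ in range(num_micro_batches):
        acc.backward()
        if acc.sync_gradients:  # the gating added in this commit
            acc.clip_grad_norm_()
        acc.end_step()
    return acc.clip_calls


# 8 micro-batches with 4-step accumulation: clipping fires only twice,
# once per completed accumulation cycle.
print(train(8, 4))  # → 2
```

With `accumulation_steps=1` the guard is always true, so the guarded version behaves identically to the unguarded one in the no-accumulation case.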
