Add nauka.utils.torch.optim.setLR() utility.
Useful to set optimizer LRs without having to rely on
torch.optim.lr_scheduler-derived objects.
obilaniu committed Jul 26, 2018
1 parent 2a52df9 commit fab48f6
Showing 1 changed file with 11 additions and 0 deletions.

src/nauka/utils/torch/optim/__init__.py
@@ -18,3 +18,14 @@ def fromSpec(params, spec, **kwargs):
                                **kwargs)
    else:
        raise NotImplementedError("Optimizer "+spec.name+" not implemented!")

def setLR(optimizer, lr):
    """Directly set the learning rate(s) of an optimizer's parameter groups."""
    if isinstance(lr, dict):
        # Look up each group's LR by its "name" key (assumes named param groups;
        # indexing by the group dict itself would raise TypeError: unhashable type).
        for paramGroup in optimizer.param_groups:
            paramGroup["lr"] = lr[paramGroup["name"]]
    elif isinstance(lr, list):
        # One LR per parameter group, matched in order.
        for paramGroup, lritem in zip(optimizer.param_groups, lr):
            paramGroup["lr"] = lritem
    else:
        # A single scalar LR applied to every parameter group.
        for paramGroup in optimizer.param_groups:
            paramGroup["lr"] = lr
