This repository has been archived by the owner on Oct 23, 2024. It is now read-only.

Add StokeScheduler #20

Open
zaksemenov opened this issue Oct 13, 2021 · 1 comment
Assignees
Labels
enhancement New feature or request

Comments

@zaksemenov

Feature

Add a class that wraps the torch scheduler in the same way that StokeOptimizer wraps the torch optimizer.

Motivation

Although a scheduler can be added and works in tandem with the StokeOptimizer, the stoke_obj.step() method already wraps several other calls, so it would be cleaner for the end user if scheduler.step() were also encapsulated in the stoke_obj.step() method.

Proposal

Extend the API to wrap a torch scheduler instance and call stoke_scheduler.step() inside the stoke_obj.step() method (if a scheduler was instantiated).
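A minimal sketch of the behavior this proposal asks for (class and attribute names here are hypothetical, not Stoke's actual API; stub classes stand in for the torch optimizer and scheduler):

```python
class _StubOptimizer:
    """Stands in for the wrapped torch optimizer."""
    def __init__(self):
        self.steps = 0

    def step(self):
        self.steps += 1


class _StubScheduler:
    """Stands in for a torch.optim.lr_scheduler instance."""
    def __init__(self):
        self.steps = 0

    def step(self):
        self.steps += 1


class StokeWithScheduler:
    """Hypothetical Stoke object that also owns an optional scheduler.

    The key idea: step() advances the scheduler internally when one was
    registered, so the end user never has to order the two calls.
    """
    def __init__(self, optimizer, scheduler=None):
        self.optimizer = optimizer
        self.scheduler = scheduler

    def step(self):
        self.optimizer.step()
        if self.scheduler is not None:
            self.scheduler.step()


stoke_obj = StokeWithScheduler(_StubOptimizer(), _StubScheduler())
stoke_obj.step()  # one call advances both optimizer and scheduler
```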

@ncilfone
Contributor

ncilfone commented Nov 9, 2021

Current way to handle LR Scheduler:

stoke_optimizer = StokeOptimizer(
    optimizer=AdamW,
    optimizer_kwargs={
        "lr": 1e-3,
        "betas": (0.9, 0.99),
        "eps": 1e-8,
        "weight_decay": 1e-4,
    },
)


stoke_model = Stoke(model, stoke_optimizer.......)

scheduler = optim.lr_scheduler.OneCycleLR(
    stoke_model.optimizer,
    max_lr=0.001,
    pct_start=0.9,
    steps_per_epoch=len(train_dataloader),
    epochs=epochs,
)


def train():
    ...
    ### PyTorch 1.10 -- they changed the order required
    stoke_model.step()
    scheduler.step()

    ### PyTorch < 1.10
    ...
    scheduler.step()
    stoke_model.step()
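The version-dependent ordering above could be selected at runtime from torch.__version__. A minimal sketch of just the comparison logic (helper name hypothetical; in real code the string would come from torch.__version__):

```python
def step_order(torch_version: str) -> str:
    """Pick the optimizer/scheduler call order for a given PyTorch version.

    Returns 'optimizer_first' for PyTorch >= 1.10 (stoke_model.step() then
    scheduler.step()), 'scheduler_first' otherwise, matching the two branches
    in the training loop above.
    """
    # Take only the numeric major/minor parts; local tags like '+cu113'
    # appear after the patch component and are ignored by [:2].
    major, minor = (int(part) for part in torch_version.split(".")[:2])
    return "optimizer_first" if (major, minor) >= (1, 10) else "scheduler_first"
```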
