
Stop wrapping optimizers to simplify use of torch optimizers #18

Open
havakv opened this issue Sep 7, 2020 · 0 comments
Labels
enhancement New feature or request

Comments

@havakv
Owner

havakv commented Sep 7, 2020

Right now all optimizers are wrapped, so to access a torch optimizer object we need to call model.optimizer.optimizer. It might make sense to be able to get the torch optimizer with model.optimizer.

If we continue to wrap torch optimizers, maybe expose the wrapper as a separate attribute instead, e.g. model.optimizer_wrapper.
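One way to get both behaviors is attribute delegation: the wrapper keeps the torch optimizer internally but forwards any attribute it does not define itself, so model.optimizer can be used like a plain torch optimizer. A minimal sketch (class and attribute names are hypothetical, and DummyOptimizer stands in for a torch.optim.Optimizer to keep the example self-contained):

```python
class DummyOptimizer:
    """Stand-in for a torch.optim.Optimizer."""
    def __init__(self):
        self.param_groups = [{"lr": 0.01}]

    def step(self):
        return "stepped"


class OptimizerWrapper:
    """Wraps an optimizer and forwards unknown attributes to it."""
    def __init__(self, optimizer):
        self.optimizer = optimizer  # the underlying torch optimizer

    def __getattr__(self, name):
        # Only called when the attribute is NOT found on the wrapper,
        # so wrapper-specific attributes still take precedence.
        return getattr(self.optimizer, name)


opt = OptimizerWrapper(DummyOptimizer())
opt.step()                    # delegated to the wrapped optimizer
opt.param_groups[0]["lr"]     # reads through to the torch optimizer
opt.optimizer                 # the raw optimizer is still reachable
```

This keeps model.optimizer.optimizer working for anyone who relies on it today, while letting most code treat the wrapper as if it were the torch optimizer directly.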

@havakv havakv added the enhancement New feature or request label Sep 7, 2020