Remove unneeded TODO (pytorch#80453)
This TODO is no longer needed, as we use `_register_fused_optim` to register the overlapped optimizer in DDP. Also, remove the comment about the API being experimental, as this API is no longer going to be used by end users.
Pull Request resolved: pytorch#80453
Approved by: https://github.com/awgu
rohan-varma authored and pytorchmergebot committed Jun 29, 2022
1 parent 7e34edf commit 5fc2d45
Showing 1 changed file with 0 additions and 4 deletions.
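As background for the change: the commit message says DDP now registers the overlapped optimizer through the private `_register_fused_optim` method. Below is a minimal, hedged sketch of what that registration can look like; the signature `_register_fused_optim(optim_cls, *args, **kwargs)` and its availability are assumptions based on PyTorch around the 1.12 release and may differ in other versions.

```python
# Hedged sketch (not part of this commit): registering an overlapped optimizer on a
# DDP model via the private _register_fused_optim API referenced in the commit message.
# Assumption: _register_fused_optim(optim_cls, *args, **kwargs) as in PyTorch ~1.12;
# verify against your installed version, since private APIs can change without notice.
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main() -> None:
    # Single-process process group just to keep the sketch self-contained.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=0, world_size=1)

    ddp_model = DDP(torch.nn.Linear(10, 10))

    # Register a fused (overlapped) optimizer: DDP runs the functional counterpart of
    # torch.optim.SGD on each bucket's parameters as soon as that bucket's gradients
    # have been reduced, instead of waiting for a separate optimizer.step() call.
    ddp_model._register_fused_optim(torch.optim.SGD, 1e-2, momentum=0.9)

    out = ddp_model(torch.randn(4, 10))
    out.sum().backward()  # the optimizer step overlaps with gradient communication here

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

With registration handled this way, `_hook_then_optimizer` (shown in the diff below) is an internal detail rather than something end users call directly, which is why the TODO asking for a usage example could be dropped.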
```diff
@@ -30,16 +30,12 @@ def _check_valid_functional_optim(self):
             )


-# TODO: Add an example to use such a wrapper.
 def _hook_then_optimizer(
     hook: Callable[[Any, dist.GradBucket], torch.futures.Future[torch.Tensor]],
     optimizer_state: _OptimizerHookState,
 ) -> Callable[[Any, dist.GradBucket], torch.futures.Future[torch.Tensor]]:
     r"""
     Runs optimizer in a functional fashion after DDP communication hook.
-
-    .. warning ::
-        This API is experimental adn subject to change.
     """
     has_set_params = (
         hasattr(optimizer_state, 'params_to_optimize')
```
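The removed TODO asked for a usage example of this wrapper. Purely as an illustration of the pattern the function implements (run a functional optimizer step once a bucket's communication future completes), here is a hedged sketch; `hook_then_sgd` is a hypothetical helper name, and the `GradBucket.parameters()`, `GradBucket.gradients()`, and `Future.then()` calls are assumed to follow the public ~1.12-era API rather than the exact internal implementation.

```python
# Hedged illustration of the "hook, then optimizer" pattern: wrap a DDP communication
# hook so that a plain SGD update runs on a bucket's parameters as soon as that
# bucket's gradients have been reduced. hook_then_sgd is a hypothetical helper, not
# the PyTorch implementation; GradBucket.parameters()/gradients() and Future.then()
# are assumed from the public API of this era.
from typing import Any, Callable

import torch
import torch.distributed as dist


def hook_then_sgd(
    hook: Callable[[Any, dist.GradBucket], torch.futures.Future[torch.Tensor]],
    lr: float = 1e-2,
) -> Callable[[Any, dist.GradBucket], torch.futures.Future[torch.Tensor]]:
    def wrapped(state: Any, bucket: dist.GradBucket) -> torch.futures.Future[torch.Tensor]:
        fut = hook(state, bucket)  # e.g. allreduce of this bucket's flattened gradients

        def optimizer_step(fut: torch.futures.Future) -> torch.Tensor:
            # The communication hook has finished by the time this callback runs, so the
            # bucket's per-parameter gradient views hold the reduced values (assuming an
            # in-place hook such as the default allreduce hook).
            for param, grad in zip(bucket.parameters(), bucket.gradients()):
                param.data.add_(grad, alpha=-lr)  # plain SGD step per parameter
            return bucket.buffer()

        return fut.then(optimizer_step)

    return wrapped
```

A wrapped hook like this would be installed with the public `register_comm_hook` API, e.g. `ddp_model.register_comm_hook(None, hook_then_sgd(allreduce_hook))`, with `allreduce_hook` imported from `torch.distributed.algorithms.ddp_comm_hooks.default_hooks`.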
