forked from pytorch/pytorch
Commit
Allow FSDP to have ignored modules out of wrapped root (pytorch#91079)
Motivations for this change:

1. TorchRec returns inconsistent results from `m.named_parameters()` and `m.m1.named_parameters()` when `m1` is a `ShardedModule`: the `ShardedModule` appears in `m.named_modules()`, but its parameters are not in `m.named_parameters()`. As a result, when we identify `ShardedModule`s and pass them as `ignored_modules` to FSDP, FSDP raises a key error in `_get_ignored_params`.
2. If users manually wrap submodules with FSDP, it is easier for them to keep one global set of ignored parameters instead of creating a new collection for every FSDP invocation.

Given the above two reasons, we allow FSDP to have ignored modules outside the wrapped root module.

Differential Revision: [D42132394](https://our.internmc.facebook.com/intern/diff/D42132394)

Pull Request resolved: pytorch#91079
Approved by: https://github.com/awgu
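The scenario in motivation (2) can be illustrated with a short sketch. This is a hypothetical example, not code from the PR: a model keeps one global set of ignored modules and manually wraps only a submodule with FSDP, so the ignored module sits outside the wrapped root.

```python
# Minimal sketch of motivation (2); module names and structure are illustrative,
# not taken from the PR. Assumes torch.distributed is already initialized
# (e.g. launched via torchrun), since FSDP requires an active process group.
import torch.nn as nn
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP


class Model(nn.Module):
    def __init__(self):
        super().__init__()
        # Stand-in for e.g. a TorchRec ShardedModule whose parameters should
        # not be managed by FSDP.
        self.sharded = nn.Linear(8, 8)
        self.dense = nn.Sequential(nn.Linear(8, 8), nn.Linear(8, 8))

    def forward(self, x):
        return self.dense(self.sharded(x))


model = Model()

# One global collection of ignored modules, reused for every FSDP call.
ignored = {model.sharded}

# Wrap only `model.dense`. `model.sharded` lies outside this wrapped root;
# before this change, passing it in `ignored_modules` hit a key error in
# `_get_ignored_params`, and with this change it is simply ignored.
model.dense = FSDP(model.dense, ignored_modules=ignored)
```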
commit e5a48da · 1 parent 6686e9b
Showing 2 changed files with 45 additions and 8 deletions.