Follow FSDP name change (#1097)
`FSDP` -> `FSDPModule`
kwen2501 authored Apr 30, 2024
1 parent dcfc617 · commit 5ac407b
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions pippy/PipelineStage.py
@@ -8,7 +8,7 @@
 import torch.distributed as dist
 import torch.fx as fx
 from torch._subclasses.fake_tensor import FakeTensor
-from torch.distributed._composable.fsdp.fully_shard import FSDP
+from torch.distributed._composable.fsdp.fully_shard import FSDPModule
 from torch.fx.node import map_aggregate
 from torch.nn.parallel import DistributedDataParallel

@@ -367,7 +367,7 @@ def _configure_data_parallel_mode(self, last_backward: bool):
         there are additional state-variables and performance considerations depending on the data parallelism used.
         This helper should adapt any pipeline parallel schedule to work with common/supported data parallel libraries.
         """
-        if isinstance(self.submod, FSDP):
+        if isinstance(self.submod, FSDPModule):
             self.submod.set_is_last_backward(last_backward)
             self.submod.set_requires_gradient_sync(last_backward)

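For context, here is a minimal sketch of why the `isinstance` check must follow the rename. It assumes a PyTorch build that ships the FSDP2 prototype under `torch.distributed._composable.fsdp`, and it must run with an initialized process group (e.g., under torchrun), typically on GPU; the `nn.Linear` model is purely illustrative:

    import torch.nn as nn
    from torch.distributed._composable.fsdp import fully_shard, FSDPModule

    model = nn.Linear(16, 16)
    # fully_shard swaps the module's class to a subclass that also
    # inherits FSDPModule, so the isinstance check below matches.
    fully_shard(model)

    assert isinstance(model, FSDPModule)  # the check PipelineStage now performs

    # The two knobs _configure_data_parallel_mode flips on non-last microbatches:
    model.set_requires_gradient_sync(False)  # skip gradient reduce-scatter this backward
    model.set_is_last_backward(False)        # defer FSDP's end-of-backward bookkeeping

Since the old `FSDP` name was removed upstream, keeping the previous import would raise an ImportError at module load time, which is why the import and the `isinstance` check change together in this commit.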
