
Commit bbada81
Revert "Move model to device before wrapping with FSDP (#1801)" (#1865)
1 parent: f3124e7

1 file changed: +0 −3 lines

optimum/habana/accelerate/accelerator.py (0 additions, 3 deletions)
```diff
@@ -476,9 +476,6 @@ def prepare_model(self, model: torch.nn.Module, device_placement: bool = None, e
                 "limit_all_gathers": fsdp_plugin.limit_all_gathers,
                 "device_id": torch.device("hpu", torch.hpu.current_device()),
             }
-            # There's issue with moving view tensors to device within FSDP class [See: https://github.com/pytorch/pytorch/issues/147321]
-            # Due to above issue, view tensor's may lead to silent incorrent behavior, while pretending to be view they're really not
-            model = model.to(kwargs["device_id"])
             model = FSDP(model, **kwargs)
             if fsdp_plugin.activation_checkpointing:
                 from torch.distributed.algorithms._checkpoint.checkpoint_wrapper import (
```
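For context on what is being reverted: the removed lines (from #1801) moved the model to the HPU before handing it to FSDP, citing pytorch/pytorch#147321, where copying a view tensor to another device silently yields a tensor that no longer aliases its base. A minimal sketch of that aliasing pitfall, simulated on CPU with `copy=True` standing in for a cross-device move (no HPU is assumed to be available):

```python
import torch

# A true view shares its base tensor's storage.
base = torch.zeros(4)
view = base[:2]
assert view.data_ptr() == base.data_ptr()

# A device-to-device copy (simulated here with copy=True) produces a tensor
# that only *looks* like the view: it no longer shares storage with base.
moved = view.to(base.device, copy=True)  # stand-in for view.to("hpu")
moved[0] = 1.0
assert base[0].item() == 0.0  # writes to the copy no longer reach base
```

With the revert, the model stays on its original device and FSDP performs the move itself via the `device_id` kwarg shown in the diff above.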
