Commit 44a8e62

bump pytorch dep and docker image to 2.8.0-rc3
1 parent 346a86a commit 44a8e62

File tree

1 file changed: +3 −2 lines

tests/test_model_parallel.py

Lines changed: 3 additions & 2 deletions
@@ -516,10 +516,11 @@ def __post_init__(self):
     ModParallelTestCfg(model_cfg_key="tp_lp_fp16", model_cls=cm_mod_parallel, precision_opts=fp16, model_cfg=tp_only,
                        strategy_cfg=dp1_tp2, runif_alias="alone",
                        expected_results=ExpectedResults(expected_state=path_tp)),
-    # TODO: temporarly disabling until issue addressed in upstream RC
+    # TODO: given that SDPBackend.MATH with lp causes (at least without special accommodation)
+    # https://github.com/pytorch/pytorch/pull/149764 as of PT 2.8, we are disabling this test until our
+    # infrastructure is upgraded and we can use SDPBackend.FLASH with bf16 on all our cards
     # ModParallelTestCfg(model_cfg_key="tp_lp_bf16", model_cls=cm_mod_parallel, precision_opts=bf16,
     #                    model_cfg=tp_lp_math_sdp_impl, strategy_cfg=dp1_tp2, runif_alias="bf16_alone"),
-
     # FSDP2 + TP (trivial submesh) tests
     # temporarily disabling this test as it triggers a hang in `fsdp_autocm_tp` about 10% of the time and
     # `fsdp_autocm_tp` and the marginal utility of `fsdp_tp` is minimal
