Conversation

@HosseinKaviani-H
Contributor

The _init_dist() method was setting up PyTorch distributed environment
variables manually. This is no longer needed because the provisioner's
get_proc_mesh() method now calls setup_env_for_distributed() from
Monarch, which handles all distributed environment setup. This
simplifies the code and removes redundant initialization.
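For context, a minimal sketch of what a manual `_init_dist()`-style helper typically does: export the environment variables that `torch.distributed` reads during initialization. The function name, defaults, and variable values below are illustrative assumptions, not the actual removed code; the point is that this bookkeeping is now owned by the provisioner via `setup_env_for_distributed()`.

```python
import os


def _init_dist_legacy(rank: int, world_size: int,
                      master_addr: str = "localhost",
                      master_port: str = "29500") -> None:
    """Hypothetical sketch of the removed manual setup: export the env
    vars that torch.distributed's env:// init method reads."""
    os.environ["MASTER_ADDR"] = master_addr
    os.environ["MASTER_PORT"] = master_port
    os.environ["RANK"] = str(rank)
    os.environ["WORLD_SIZE"] = str(world_size)


# Example: what rank 0 of a 2-process job would export.
_init_dist_legacy(rank=0, world_size=2)
print(os.environ["RANK"], os.environ["WORLD_SIZE"])
```

Since the provisioner performs this setup once per process mesh, keeping a second copy in the actor risked the two drifting apart, which is why removing it outright is safer than leaving it commented out.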

Hossein Kavianihamedani added 3 commits on November 12, 2025:

- The _init_dist() method was setting up PyTorch distributed environment
  variables manually. This is no longer needed because the provisioner's
  get_proc_mesh() method now calls setup_env_for_distributed() from
  Monarch, which handles all distributed environment setup. This
  simplifies the code and removes redundant initialization.
- Previously the method was just commented out. This commit fully removes
  it, since the provisioner now handles all distributed environment setup
  via setup_env_for_distributed(). This makes the code cleaner and easier
  to maintain.
@meta-cla bot added the "CLA Signed" label (managed by the Meta Open Source bot) on Nov 12, 2025.
@HosseinKaviani-H merged commit 6a0687c into meta-pytorch:main on Nov 12, 2025. 10 checks passed.