rank_zero only logic #19572
Unanswered
itzsimpl asked this question in DDP / multi-GPU / multi-node
Replies: 0
In pytorch-lightning/src/lightning/fabric/utilities/rank_zero.py, line 42 in b3c869f: is the use of `LOCAL_RANK` intentional? Will this not lead to different behaviour depending on the environment? `RANK`, `SLURM_PROCID`, and `JSM_NAMESPACE_RANK` are global ranks, whereas `LOCAL_RANK` is node-local. In a multi-node setup, assuming `LOCAL_RANK` is set, then depending on whether `RANK` is also set, `rank_zero_only` may execute on global rank 0 only, or on local rank 0 of every node.