rank_zero_only Callback in DDP

Hi all,
I use a Callback for logging things during training, and when training multi-GPU I’d like to log only on rank 0. Should I just put the @rank_zero_only decorator on every method of my callback, or is there a better way to specify that the callback should be active only on rank 0?

Thanks a lot!

Hi @aRI0U1

If you are using the logger from Lightning inside your callback, you don’t need to do anything.
Both

trainer.logger.experiment.anything()

and

pl_module.log(...)

will log only on rank 0.
If by logging you mean writing to a log file or something similar, you can either add the @rank_zero_only decorator or guard the code with an explicit check:

if trainer.global_rank == 0:
    # do something only on rank 0

Both ways are equivalent.
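To illustrate the decorator approach without needing a full DDP launch, here is a minimal, self-contained sketch. Note that `rank_zero_only` here is a simplified stand-in for Lightning's real decorator (an assumption for illustration; the real one tracks the rank internally), and `MyLoggingCallback` is a hypothetical callback. The stand-in reads the `RANK` environment variable that launchers like torchrun set, defaulting to 0:

```python
import functools
import os


def rank_zero_only(fn):
    """Simplified stand-in for Lightning's rank_zero_only decorator.

    Runs the wrapped function only on rank 0; on other ranks it is a
    no-op that returns None. Reads the RANK env var (set by torchrun
    and similar launchers), defaulting to 0 for single-process runs.
    """
    @functools.wraps(fn)
    def wrapped(*args, **kwargs):
        rank = int(os.environ.get("RANK", 0))
        if rank == 0:
            return fn(*args, **kwargs)
        return None  # skipped on non-zero ranks
    return wrapped


class MyLoggingCallback:
    """Hypothetical callback: expensive logging runs only on rank 0."""

    @rank_zero_only
    def on_train_epoch_end(self, trainer, pl_module):
        # Imagine some expensive extra computation here.
        return "expensive log computed"


cb = MyLoggingCallback()
os.environ["RANK"] = "0"
print(cb.on_train_epoch_end(None, None))  # → expensive log computed
os.environ["RANK"] = "1"
print(cb.on_train_epoch_end(None, None))  # → None (skipped)
```

The explicit `if trainer.global_rank == 0:` check behaves the same way; the decorator is just more convenient when a whole method should be rank-0-only.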
Hope this helps 🙂

Adrian

Yeah, I was actually logging something fancier than plain metrics, which involved quite a lot of extra computation, hence my wish to run it on only one GPU.

Your answer is super clear, thanks a lot!
