Multi-GPU training issue - DDP strategy. Training hangs upon distributed GPU initialisation

@soumickmj Glad you tracked this down!
Yes, this is a common mistake: on a SLURM cluster, launching the training script without `srun` means Lightning's distributed initialisation waits for processes that were never spawned, so training hangs. On the latest version of Lightning we show a warning that the launch command is not correct and suggest using `srun`.
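For anyone else hitting this, a minimal SLURM batch script sketch that launches each process via `srun` might look like the following. The script name `train.py`, the partition name, and the resource counts are placeholders for illustration; the key point is that `--ntasks-per-node` matches the GPU count and that the Python command is prefixed with `srun`:

```shell
#!/bin/bash
#SBATCH --nodes=2                 # number of nodes (placeholder value)
#SBATCH --ntasks-per-node=4       # one task per GPU, so DDP spawns one rank per GPU
#SBATCH --gres=gpu:4              # GPUs per node (placeholder value)
#SBATCH --time=02:00:00

# srun launches one process per task; without it, only a single process
# starts and the distributed init waits forever for the missing ranks.
srun python train.py
```

Running `python train.py` directly inside the batch script (without `srun`) is exactly the mistake that triggers the hang described above.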