Repeated augmentation sampler for Lightning (multi-GPU DDP case)

Is there a multi-GPU (DDP) repeated augmentation sampler for Lightning? I tried using deit/samplers.py at main, but it becomes a bottleneck and is therefore not distributable. Thank you for your help!!

Hi, I don’t see why the one from DeiT shouldn’t be distributable, as they explicitly mention this as a use case. We don’t have anything domain- or data-specific, since Lightning tries to be domain-agnostic. We use the general torch.utils.data.DistributedSampler, which the DeiT sampler is based on as well, but that one does not consider repeated augmentations.
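
In case it helps, here is a minimal sketch of the repeated-augmentation idea as a distributed sampler. It is modeled on DeiT's RASampler but is not a verbatim copy; the class name `RepeatedAugmentationSampler` and its parameters are my own, and DeiT's extra per-epoch truncation is only noted in a comment. To keep a custom sampler like this under DDP, build the DataLoader yourself and tell the Trainer not to replace it (`replace_sampler_ddp=False`; in Lightning 2.x the flag is `use_distributed_sampler=False`).

```python
import math

import torch
import torch.distributed as dist
from torch.utils.data import Sampler


class RepeatedAugmentationSampler(Sampler):
    """Distributed sampler that yields each dataset index `num_repeats` times
    per epoch, so copies of the same sample (with different augmentations)
    land on different GPUs. Sketch based on DeiT's RASampler, not a verbatim copy."""

    def __init__(self, dataset, num_replicas=None, rank=None, shuffle=True, num_repeats=3):
        if num_replicas is None:
            num_replicas = dist.get_world_size() if dist.is_initialized() else 1
        if rank is None:
            rank = dist.get_rank() if dist.is_initialized() else 0
        self.dataset = dataset
        self.num_replicas = num_replicas
        self.rank = rank
        self.shuffle = shuffle
        self.num_repeats = num_repeats
        self.epoch = 0
        # per-rank count after repetition, rounded up so every rank yields the same length
        self.num_samples = math.ceil(len(dataset) * num_repeats / num_replicas)
        self.total_size = self.num_samples * num_replicas

    def __iter__(self):
        g = torch.Generator()
        g.manual_seed(self.epoch)  # identical permutation on every rank
        if self.shuffle:
            indices = torch.randperm(len(self.dataset), generator=g).tolist()
        else:
            indices = list(range(len(self.dataset)))
        # repeat each index; neighbouring copies will be split across ranks below
        indices = [i for i in indices for _ in range(self.num_repeats)]
        # pad so the list divides evenly across replicas
        indices += indices[: self.total_size - len(indices)]
        # strided slice per rank: the repeats of one image go to different GPUs
        # (DeiT's version additionally truncates the per-rank list so an epoch
        # still covers roughly one pass over the dataset; omitted here for brevity)
        return iter(indices[self.rank : self.total_size : self.num_replicas])

    def __len__(self):
        return self.num_samples

    def set_epoch(self, epoch):
        self.epoch = epoch  # called by Lightning/DDP each epoch to reshuffle
```

And the wiring, assuming a `train_set` dataset (hypothetical name; the Trainer flag depends on your Lightning version as noted above):

```python
from torch.utils.data import DataLoader

sampler = RepeatedAugmentationSampler(train_set, num_repeats=3)
loader = DataLoader(train_set, batch_size=64, sampler=sampler, num_workers=8)
trainer = Trainer(replace_sampler_ddp=False)  # use_distributed_sampler=False in 2.x
trainer.fit(model, loader)
```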