I have a PyTorch Lightning DataModule instance that defines train_dataloader, val_dataloader, and test_dataloader.
Currently I'm using a custom callback to reload the train_dataloader so that the data gets resampled each epoch.
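For context, here is roughly the resampling logic my callback rebuilds the dataloader with (plain Python just to illustrate; the function name and parameters are made up, not Lightning API):

```python
import random

def resample_indices(dataset_size, sample_size, seed=None):
    """Pick a fresh random subset of dataset indices.

    My callback calls something like this at the start of each epoch
    and constructs a new train DataLoader from the sampled subset.
    """
    rng = random.Random(seed)
    return rng.sample(range(dataset_size), sample_size)

# Each epoch gets a (potentially) different subset of the data.
epoch_1 = resample_indices(dataset_size=100, sample_size=10, seed=1)
epoch_2 = resample_indices(dataset_size=100, sample_size=10, seed=2)
print(epoch_1)
print(epoch_2)
```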
I saw that there is a Trainer flag called reload_dataloaders_every_epoch, soon to be replaced by reload_dataloaders_every_n_epochs.
Do these flags reload just the train_dataloader, or all three?