Loop over epochs instead of batches

Hello,

I would like to use the Lightning trainer in a way that it loops over epochs instead of over batches. So instead of:

def training_step(self, train_batch, batch_idx):

I want something like

def training_step(self, data_loader, epoch):

Or some way of looping over the epochs manually.

How can I achieve this?

This is not possible with LightningModule — training_step always receives a single batch.

You can instead try LightningLite,
or you can write a custom loop using Lightning's loop customization.
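For illustration, writing your own loop means taking ownership of the epoch iteration yourself. Here is a minimal plain-Python sketch of that control flow (no Lightning involved; training_step here is a stand-in for your per-epoch logic, and the "batches" are just lists of numbers):

```python
def training_step(data_loader, epoch):
    # Receives the whole data loader for this epoch, as the question asks.
    total = 0.0
    for batch in data_loader:
        total += sum(batch)  # stand-in for a forward/backward pass
    return total

def fit(data_loader, max_epochs):
    # The outer loop is over epochs, not batches.
    history = []
    for epoch in range(max_epochs):
        history.append(training_step(data_loader, epoch))
    return history

loader = [[1, 2], [3, 4]]  # two toy "batches"
print(fit(loader, max_epochs=3))
```

With LightningLite you would wrap the same structure in a `run()` method and use its `setup`/`backward` helpers, keeping full control of the epoch loop while Lightning handles device placement and precision.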

Also, we have moved the discussions to GitHub Discussions. You might want to post there instead to get a quicker response; the forums will be made read-only after some time.

Thank you