FitLoop

class pytorch_lightning.loops.FitLoop(min_epochs=0, max_epochs=None)[source]

Bases: pytorch_lightning.loops.loop.Loop[None]

This Loop iterates over the epochs to run the training.

Parameters:
  • min_epochs (Optional[int]) – The minimum number of epochs to run

  • max_epochs (Optional[int]) – The maximum number of epochs; can be set to -1 to turn this limit off (see the sketch below)
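
A minimal sketch of how these limits are typically set in practice, assuming the usual Trainer constructor arguments and the trainer.fit_loop attribute:

    from pytorch_lightning import Trainer

    # The Trainer builds a FitLoop internally; min_epochs and max_epochs are
    # usually controlled through the Trainer constructor rather than by
    # instantiating FitLoop directly.
    trainer = Trainer(min_epochs=1, max_epochs=10)
    print(trainer.fit_loop)  # the FitLoop instance driving training

    # max_epochs=-1 removes the upper limit, so training runs until another
    # stopping condition (e.g. max_steps or early stopping) ends it.
    unbounded_trainer = Trainer(max_epochs=-1)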

advance()[source]

Runs one whole epoch.

Return type:

None

connect(epoch_loop)[source]

Connects a training epoch loop to this fit loop.

Return type:

None
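
A minimal sketch of connecting an epoch loop, assuming TrainingEpochLoop can be imported from pytorch_lightning.loops with min_steps/max_steps arguments, and that the trainer accepts a replacement loop via trainer.fit_loop:

    from pytorch_lightning import Trainer
    from pytorch_lightning.loops import FitLoop, TrainingEpochLoop

    # Build a fit loop and attach the child loop that advance() will run
    # once per epoch.
    fit_loop = FitLoop(min_epochs=0, max_epochs=5)
    epoch_loop = TrainingEpochLoop(min_steps=None, max_steps=-1)
    fit_loop.connect(epoch_loop)

    # Hand the assembled loop to a Trainer in place of its default fit loop.
    trainer = Trainer(max_epochs=5)
    trainer.fit_loop = fit_loop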

on_advance_end()[source]

Hook to be called each time after advance is called.

Return type:

None

on_advance_start()[source]

Prepares the dataloader for training and calls the on_train_epoch_start hook.

Return type:

None

on_run_end()[source]

Calls the on_train_end hook.

Return type:

None

on_run_start()[source]

Calls the on_train_start hook.

Return type:

None

reset()[source]

Resets the internal state of this loop.

Return type:

None

teardown()[source]

Used to release memory and free other resources.

Return type:

None

property batch_idx: int

Returns the current batch index (within this epoch).

property done: bool

Evaluates when to leave the loop.
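
As a sketch of how this property can participate in loop customization, a hypothetical subclass could combine the base condition with its own stopping criterion (the subclass name, the extra parameter, and the exact hook usage here are assumptions, not part of the library):

    import time

    from pytorch_lightning.loops import FitLoop

    class TimeBudgetFitLoop(FitLoop):
        """Hypothetical subclass: also leave the loop after a wall-clock budget."""

        def __init__(self, min_epochs=0, max_epochs=None, budget_seconds=3600.0):
            super().__init__(min_epochs=min_epochs, max_epochs=max_epochs)
            self.budget_seconds = budget_seconds
            self._start_time = None

        def on_run_start(self):
            # Record the start time, then let the parent call the on_train_start hook.
            self._start_time = time.monotonic()
            return super().on_run_start()

        @property
        def done(self):
            out_of_time = (
                self._start_time is not None
                and time.monotonic() - self._start_time > self.budget_seconds
            )
            # Note: this ignores min_epochs once the budget is exhausted.
            return out_of_time or super().done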

property max_steps: int

Returns the maximum number of steps to run.

property min_steps: Optional[int]

Returns the minimum number of steps to run.
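
A minimal sketch, assuming the step limits are passed through the standard Trainer arguments and exposed here unchanged:

    from pytorch_lightning import Trainer

    trainer = Trainer(min_steps=100, max_steps=1000)

    # The fit loop reports the limits it was configured with (assumed to be
    # forwarded from its training epoch loop).
    print(trainer.fit_loop.min_steps)  # expected: 100
    print(trainer.fit_loop.max_steps)  # expected: 1000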

property restarting: bool

Whether the state of this loop was reloaded and it needs to restart.

property running_loss: pytorch_lightning.trainer.supporters.TensorRunningAccum

Returns the running loss.

property skip: bool

Whether we should skip the training and immediately return from the call to run().

property split_idx: int

Returns the index of the current batch split (within the current batch) for truncated backpropagation through time (BPTT).

property total_batch_idx: int

Returns the current batch index (across epochs).
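
A minimal sketch of reading these counters at runtime from a callback, assuming the trainer.fit_loop attribute and the standard Callback hook signature (the callback class itself is hypothetical):

    from pytorch_lightning import Callback, Trainer

    class LoopStateLogger(Callback):
        """Hypothetical callback that prints the fit loop's progress counters."""

        def on_train_epoch_end(self, trainer, pl_module):
            fit_loop = trainer.fit_loop
            # batch_idx counts within the current epoch, total_batch_idx across epochs.
            print(
                f"epoch done: batch_idx={fit_loop.batch_idx}, "
                f"total_batch_idx={fit_loop.total_batch_idx}"
            )

    trainer = Trainer(max_epochs=2, callbacks=[LoopStateLogger()])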