FitLoop
class pytorch_lightning.loops.FitLoop(min_epochs=1, max_epochs=1000) [source]
Bases: abc.ABC, Generic[pytorch_lightning.loops.base.T]
This Loop iterates over the epochs to run the training.
Parameters
min_epochs (Optional[int]) – The minimum number of epochs.
max_epochs (int) – The maximum number of epochs; can be set to -1 to turn this limit off.
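The interplay between these limits and an early-stop request can be sketched in plain Python. This is a hypothetical, simplified condition illustrating the documented behaviour; the function and parameter names are illustrative, not the actual Lightning implementation:

```python
# Hypothetical sketch of how min_epochs / max_epochs bound training
# (illustrative only; not the actual FitLoop code).
def should_continue(current_epoch, min_epochs, max_epochs, should_stop):
    # max_epochs == -1 disables the upper limit entirely
    if max_epochs != -1 and current_epoch >= max_epochs:
        return False
    # an early-stop request is only honoured once min_epochs is satisfied
    if should_stop and (min_epochs is None or current_epoch >= min_epochs):
        return False
    return True
```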
advance() [source]
Runs one whole epoch.
Return type
None
connect(epoch_loop) [source]
Connects a training epoch loop to this fit loop.
on_advance_end() [source]
Hook to be called each time after advance is called.
Return type
None
on_advance_start() [source]
Prepares the dataloader for training and calls the on_epoch_start and on_train_epoch_start hooks.
Return type
None
on_run_end() [source]
Calls the on_train_end hook.
Return type
None
on_run_start() [source]
Calls the on_train_start hook.
Return type
None
reset() [source]
Resets the internal state of this loop.
Return type
None
teardown() [source]
Used to release memory and other resources.
Return type
None
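Taken together, these methods follow the generic run contract of a Lightning loop: reset, then the start hooks, then repeated advances until done, then the end hooks. The self-contained class below is an illustrative sketch of that call order only, not the real FitLoop implementation:

```python
# Hypothetical sketch of the hook call order of a fit loop
# (illustrative only; mirrors the generic Loop.run() contract).
class SketchFitLoop:
    def __init__(self, max_epochs=3):
        self.max_epochs = max_epochs
        self.epoch = 0
        self.calls = []          # records the order hooks fire in

    @property
    def done(self):
        return self.epoch >= self.max_epochs

    def reset(self):
        self.calls.append("reset")

    def on_run_start(self):
        self.calls.append("on_train_start")

    def on_advance_start(self):
        self.calls.append("on_train_epoch_start")

    def advance(self):
        # one whole epoch would run here
        self.calls.append(f"epoch_{self.epoch}")
        self.epoch += 1

    def on_advance_end(self):
        self.calls.append("on_advance_end")

    def on_run_end(self):
        self.calls.append("on_train_end")

    def run(self):
        self.reset()
        self.on_run_start()
        while not self.done:
            self.on_advance_start()
            self.advance()
            self.on_advance_end()
        self.on_run_end()
        return self.calls
```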
property batch_idx: int
Returns the current batch index (within this epoch).
property current_epoch: int
Return the current epoch.
property done: bool
Evaluates when to leave the loop.
Returns True if trainer.should_stop was set (e.g. by early stopping) or if the maximum number of steps or epochs is reached.
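The condition described above can be expressed as a short boolean sketch. This is a simplified, hypothetical formulation of the documented behaviour; the function and parameter names are illustrative:

```python
# Hypothetical sketch of the `done` condition (simplified; -1 disables a limit).
def fit_loop_done(should_stop, global_step, max_steps, current_epoch, max_epochs):
    stop_steps = max_steps != -1 and global_step >= max_steps
    stop_epochs = max_epochs != -1 and current_epoch >= max_epochs
    return should_stop or stop_steps or stop_epochs
```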
property global_step: int
Returns the global step.
property max_steps: int
Returns the maximum number of steps to run.
property min_steps: int
Returns the minimum number of steps to run.
property running_loss: pytorch_lightning.trainer.supporters.TensorRunningAccum
Returns the running loss.
property skip: bool
Whether we should skip the training and immediately return from the call to run().
property split_idx: int
Returns the index of the current batch split (within the current batch) for bptt.
property total_batch_idx: int
Returns the current batch index (across epochs).