FitLoop
class pytorch_lightning.loops.FitLoop(min_epochs=0, max_epochs=1000)
Bases: pytorch_lightning.loops.base.Loop[None]
This Loop iterates over the epochs to run the training.
Parameters
min_epochs (int) – The minimum number of epochs.
max_epochs (int) – The maximum number of epochs; can be set to -1 to turn this limit off.
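In practice the FitLoop is created and driven by the Trainer rather than instantiated directly; a minimal sketch of how these limits are typically configured, assuming the Trainer arguments of the same names:

```python
import pytorch_lightning as pl

# The Trainer builds the FitLoop internally; its epoch limits are normally
# set through the Trainer arguments of the same names.
trainer = pl.Trainer(min_epochs=1, max_epochs=10)

# max_epochs=-1 removes the upper limit, so training runs until another
# stopping condition (e.g. max_steps or early stopping) applies.
unbounded = pl.Trainer(max_epochs=-1, max_steps=10_000)

# The loop instance is reachable as trainer.fit_loop.
print(trainer.fit_loop.max_epochs)  # 10
```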
advance()
Runs one whole epoch.
Return type
None
connect(epoch_loop)
Connects a training epoch loop to this fit loop.
Return type
None
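As an illustration of how connect() might be used when customizing training, the following rough sketch swaps in a subclassed epoch loop before fitting. The TrainingEpochLoop import path and its min_steps/max_steps constructor arguments are assumptions that vary between Lightning versions; CustomEpochLoop is purely hypothetical.

```python
import pytorch_lightning as pl
from pytorch_lightning.loops import TrainingEpochLoop  # assumed export; varies by version

class CustomEpochLoop(TrainingEpochLoop):
    """Hypothetical epoch loop that simply delegates to the stock behaviour."""

    def advance(self, *args, **kwargs):
        # A real customization would add logic around the epoch here.
        return super().advance(*args, **kwargs)

trainer = pl.Trainer(max_epochs=5)
# Attach the custom epoch loop to the fit loop before calling trainer.fit().
trainer.fit_loop.connect(epoch_loop=CustomEpochLoop(min_steps=None, max_steps=-1))
```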
on_advance_end()
Hook to be called each time after advance is called.
Return type
None
on_advance_start()
Prepares the dataloader for training and calls the hooks on_epoch_start and on_train_epoch_start.
Return type
None
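The on_train_epoch_start hook invoked here is the standard LightningModule/Callback hook; a minimal sketch of reacting to it from a callback (the EpochStartLogger class is illustrative, the hook signature is the standard Callback one):

```python
import pytorch_lightning as pl

class EpochStartLogger(pl.Callback):
    def on_train_epoch_start(self, trainer, pl_module):
        # Reached via FitLoop.on_advance_start() at the start of every training epoch.
        print(f"Starting training epoch {trainer.current_epoch}")
```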
on_run_end()
Calls the on_train_end hook.
Return type
None
on_run_start()
Calls the on_train_start hook.
Return type
None
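on_run_start and on_run_end therefore bracket the whole fit: the on_train_start and on_train_end hooks they trigger can be observed from a callback. A small illustrative sketch (the FitTimer class is hypothetical; the hook signatures are the standard Callback ones):

```python
import time
import pytorch_lightning as pl

class FitTimer(pl.Callback):
    """Hypothetical callback timing the whole fit run."""

    def on_train_start(self, trainer, pl_module):
        # Triggered by FitLoop.on_run_start().
        self._start = time.monotonic()

    def on_train_end(self, trainer, pl_module):
        # Triggered by FitLoop.on_run_end().
        print(f"Training took {time.monotonic() - self._start:.1f}s")
```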
reset()
Resets the internal state of this loop.
Return type
None
teardown()
Used to release memory and other resources.
Return type
None
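Taken together, these methods are driven by the Loop base class. The following is a simplified, version-agnostic sketch of that control flow, not the actual implementation; it only illustrates the order in which the methods documented above are called:

```python
# Simplified sketch of how the Loop base class drives a loop such as FitLoop.
def run(loop, *args, **kwargs):
    if loop.skip:                                   # see the `skip` property below
        return None                                 # the real code calls an on_skip() hook
    loop.reset()
    loop.on_run_start(*args, **kwargs)              # FitLoop: calls the on_train_start hook
    while not loop.done:                            # see the `done` property below
        try:
            loop.on_advance_start(*args, **kwargs)  # dataloader prep, epoch-start hooks
            loop.advance(*args, **kwargs)           # FitLoop: runs one whole epoch
            loop.on_advance_end()
        except StopIteration:
            break
    output = loop.on_run_end()                      # FitLoop: calls the on_train_end hook
    # teardown() is not part of run(); the Trainer calls it separately at shutdown.
    return output
```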
property batch_idx: int
Returns the current batch index (within this epoch).
Return type
int
property done: bool
Evaluates when to leave the loop.
Return type
bool
property max_steps: int
Returns the maximum number of steps to run.
Return type
int
property min_steps: Optional[int]
Returns the minimum number of steps to run.
Return type
Optional[int]
property restarting: bool
Whether the state of this loop was reloaded and it needs to restart.
Return type
bool
property running_loss: pytorch_lightning.trainer.supporters.TensorRunningAccum
Returns the running loss.
Return type
TensorRunningAccum
property skip: bool
Whether we should skip the training and immediately return from the call to run().
Return type
bool
property split_idx: int
Returns the index of the current batch split (within the current batch) for truncated backpropagation through time (BPTT).
Return type
int
property total_batch_idx: int
Returns the current batch index (across epochs).
Return type
int
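These read-only properties can be inspected at runtime through trainer.fit_loop. A small illustrative example from a callback (the LoopStateLogger class is hypothetical; the attribute names are exactly the ones documented above):

```python
import pytorch_lightning as pl

class LoopStateLogger(pl.Callback):
    """Hypothetical callback printing FitLoop state at the end of each epoch."""

    def on_train_epoch_end(self, trainer, pl_module):
        fit_loop = trainer.fit_loop
        print(
            f"batch_idx={fit_loop.batch_idx} "
            f"total_batch_idx={fit_loop.total_batch_idx} "
            f"max_steps={fit_loop.max_steps} "
            f"done={fit_loop.done}"
        )
```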