OptimizerLoop
- class pytorch_lightning.loops.optimization.OptimizerLoop[source]
  Bases: pytorch_lightning.loops.base.Loop[Dict[int, Dict[str, Any]]]
  Runs over a sequence of optimizers.
  This loop implements what is known in Lightning as Automatic Optimization.
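  For orientation, with automatic optimization (the default), a LightningModule only returns a loss from training_step and this loop takes care of the backward pass and the optimizer step for each optimizer returned by configure_optimizers. A minimal sketch, assuming a standard LightningModule (the model below is illustrative and not part of this class's API):

```python
import torch
from torch import nn
import pytorch_lightning as pl


class LitClassifier(pl.LightningModule):
    """Illustrative module; with automatic_optimization left at its default
    (True), OptimizerLoop drives the optimization for it."""

    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.cross_entropy(self.layer(x), y)
        # Returning the loss is enough: the loop wraps training_step, backward
        # and zero_grad in a closure and calls optimizer.step() for each
        # optimizer from configure_optimizers.
        return loss

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)
```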
- output_result_cls
  alias of pytorch_lightning.loops.optimization.optimizer_loop.ClosureResult
- advance(batch, *args, **kwargs)[source]
  Performs a single step.
  Accepts all arguments passed to run.
  Example:

      def advance(self, iterator):
          batch = next(iterator)
          loss = self.trainer.lightning_module.training_step(batch, batch_idx)
          ...
  Return type: None
- connect(**kwargs)[source]
  Optionally connect one or multiple loops to this one.
  Linked loops should form a tree.
  Return type: None
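  As a hedged sketch of how connect is used when customizing loops: the attribute path trainer.fit_loop.epoch_loop.batch_loop and the optimizer_loop keyword below reflect the assumed default loop hierarchy and are not documented on this page; the subclass is purely illustrative.

```python
import pytorch_lightning as pl
from pytorch_lightning.loops.optimization import OptimizerLoop


class MyOptimizerLoop(OptimizerLoop):
    # Illustrative subclass; hooks such as advance() could be overridden here.
    pass


trainer = pl.Trainer()
# Assumed default loop hierarchy: the training batch loop owns the optimizer
# loop and exposes connect(optimizer_loop=...) to swap it out.
trainer.fit_loop.epoch_loop.batch_loop.connect(optimizer_loop=MyOptimizerLoop())
```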
- on_run_end()[source]
  Hook to be called at the end of the run.
  Its return argument is returned from run.
- on_run_start(batch, optimizers, batch_idx)[source]
  Hook to be called as the first thing after entering run (except the state reset).
  Accepts all arguments passed to run.
  Return type: None
- reset()[source]
  Resets the internal state of the loop at the beginning of each call to run.
  Example:

      def reset(self):
          # reset your internal state or add custom logic
          # if you expect run() to be called multiple times
          self.current_iteration = 0
          self.outputs = []
  Return type: None
- property done: bool
  Returns True when the last optimizer in the sequence has run.
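  To tie these hooks together, the sketch below is a self-contained, abridged illustration of the control flow the base Loop.run follows when driving this loop: reset, on_run_start, advance until done, then on_run_end. It is a toy stand-in, not Lightning source; the real run also handles restarts and StopIteration.

```python
from typing import Any


class SimplifiedLoop:
    """Toy stand-in illustrating the Loop.run pattern described above."""

    def __init__(self) -> None:
        self.current_iteration = 0

    @property
    def done(self) -> bool:
        # In OptimizerLoop, `done` turns True once the last optimizer has run.
        return self.current_iteration >= 3

    def reset(self) -> None:
        self.current_iteration = 0

    def on_run_start(self, *args: Any, **kwargs: Any) -> None:
        pass

    def advance(self, *args: Any, **kwargs: Any) -> None:
        self.current_iteration += 1

    def on_run_end(self) -> int:
        return self.current_iteration

    def run(self, *args: Any, **kwargs: Any) -> int:
        # Reset state, call the start hook, advance until `done`,
        # and return whatever on_run_end produces.
        self.reset()
        self.on_run_start(*args, **kwargs)
        while not self.done:
            self.advance(*args, **kwargs)
        return self.on_run_end()


assert SimplifiedLoop().run() == 3
```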