Tuner

class pytorch_lightning.tuner.tuning.Tuner(trainer)

Bases: object

Tuner class that runs tuning routines on your model before training: a learning-rate range test (lr_find) and a batch-size search (scale_batch_size).

lr_find(model, train_dataloaders=None, val_dataloaders=None, datamodule=None, min_lr=1e-08, max_lr=1, num_training=100, mode='exponential', early_stop_threshold=4.0, update_attr=False)

Runs a range test over candidate initial learning rates, reducing the guesswork in picking a good starting learning rate.

Parameters:

model – Model to tune.

train_dataloaders – A collection of torch.utils.data.DataLoader or a LightningDataModule specifying training samples.

val_dataloaders – A torch.utils.data.DataLoader or a sequence of them specifying validation samples.

datamodule – An instance of LightningDataModule.

min_lr – Minimum learning rate to investigate.

max_lr – Maximum learning rate to investigate.

num_training – Number of learning rates to test.

mode – Search strategy to update the learning rate after each batch: 'exponential' (default) increases the learning rate exponentially; 'linear' increases it linearly.

early_stop_threshold – Threshold for stopping the search. If the loss at any point is larger than early_stop_threshold * best_loss, the search is stopped. To disable, set to None.

update_attr – Whether to write the suggested learning rate back to the model's learning-rate attribute.

Raises:

MisconfigurationException – If learning rate/lr in model or model.hparams isn’t overridden when auto_lr_find=True, or if you are using more than one optimizer.

Return type:

Optional[_LRFinder]
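
A minimal usage sketch follows. LitModel and its random data are hypothetical placeholders; only Tuner and lr_find come from this page, and the returned _LRFinder exposes a suggestion() helper:

    import torch
    from torch.utils.data import DataLoader, TensorDataset
    import pytorch_lightning as pl
    from pytorch_lightning.tuner.tuning import Tuner

    class LitModel(pl.LightningModule):
        def __init__(self, lr=1e-3):
            super().__init__()
            self.lr = lr  # the learning-rate attribute lr_find looks for
            self.layer = torch.nn.Linear(32, 1)

        def training_step(self, batch, batch_idx):
            x, y = batch
            return torch.nn.functional.mse_loss(self.layer(x), y)

        def configure_optimizers(self):
            return torch.optim.Adam(self.parameters(), lr=self.lr)

    train_loader = DataLoader(
        TensorDataset(torch.randn(256, 32), torch.randn(256, 1)), batch_size=16
    )

    model = LitModel()
    trainer = pl.Trainer(max_epochs=1)
    tuner = Tuner(trainer)

    # Range test: trains for num_training batches while the learning rate
    # grows from min_lr toward max_lr (exponentially by default).
    lr_finder = tuner.lr_find(model, train_dataloaders=train_loader,
                              min_lr=1e-6, max_lr=1.0)

    # update_attr defaults to False, so apply the suggestion manually;
    # the return type is Optional[_LRFinder], hence the None check.
    if lr_finder is not None:
        model.lr = lr_finder.suggestion()

Passing update_attr=True instead would write the suggestion back to the model's learning-rate attribute automatically.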

scale_batch_size(model, train_dataloaders=None, val_dataloaders=None, datamodule=None, mode='power', steps_per_trial=3, init_val=2, max_trials=25, batch_arg_name='batch_size')

Iteratively tries to find the largest batch size for a given model that does not raise an out-of-memory (OOM) error.

Parameters:

model – Model to tune.

train_dataloaders – A collection of torch.utils.data.DataLoader or a LightningDataModule specifying training samples.

val_dataloaders – A torch.utils.data.DataLoader or a sequence of them specifying validation samples.

datamodule – An instance of LightningDataModule.

mode – Search strategy to update the batch size: 'power' (default) keeps multiplying the batch size by 2 until an OOM error is encountered; 'binsearch' also doubles the batch size until OOM, then continues with a binary search between the last successful size and the first failing one.

steps_per_trial – Number of steps to run with a given batch size. Ideally one step is enough to detect an OOM error, but in practice a few are needed.

init_val – Initial batch size to start the search with.

max_trials – Maximum number of batch-size increases before the search is terminated.

batch_arg_name – Name of the attribute that stores the batch size. The attribute is expected to exist in model, model.hparams, the datamodule, or its hparams.

Return type:

Optional[int]
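
A minimal sketch of the search follows. LitModelWithData is a hypothetical placeholder; it defines its own train_dataloader so that each trial picks up the current value of its batch_size attribute (the default batch_arg_name):

    import torch
    from torch.utils.data import DataLoader, TensorDataset
    import pytorch_lightning as pl
    from pytorch_lightning.tuner.tuning import Tuner

    class LitModelWithData(pl.LightningModule):
        def __init__(self, batch_size=2):
            super().__init__()
            self.batch_size = batch_size  # must match batch_arg_name
            self.layer = torch.nn.Linear(32, 1)

        def train_dataloader(self):
            # Rebuilt for each trial, so every trial sees the updated batch size.
            data = TensorDataset(torch.randn(1024, 32), torch.randn(1024, 1))
            return DataLoader(data, batch_size=self.batch_size)

        def training_step(self, batch, batch_idx):
            x, y = batch
            return torch.nn.functional.mse_loss(self.layer(x), y)

        def configure_optimizers(self):
            return torch.optim.Adam(self.parameters())

    trainer = pl.Trainer(max_epochs=1)
    tuner = Tuner(trainer)

    # 'power' mode doubles the batch size each trial until an OOM error occurs;
    # the largest size that fit is written back to the model and returned.
    new_size = tuner.scale_batch_size(LitModelWithData(), mode='power', max_trials=25)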