cli¶

Functions

- Instantiates a class with the given args and init.

Classes

- LightningArgumentParser — Extension of jsonargparse's ArgumentParser for pytorch-lightning.
- LightningCLI — Implementation of a configurable command line tool for pytorch-lightning.
- SaveConfigCallback — Saves a LightningCLI config to the log_dir when training starts.
- class pytorch_lightning.utilities.cli.LightningArgumentParser(*args, parse_as_dict=True, **kwargs)[source]¶
  Bases: jsonargparse.ArgumentParser
  Extension of jsonargparse's ArgumentParser for pytorch-lightning.
  Initializes an argument parser that supports configuration file input. For full details of accepted arguments see ArgumentParser.__init__.
- add_lightning_class_args(lightning_class, nested_key, subclass_mode=False)[source]¶
  Adds arguments from a lightning class to a nested key of the parser.
  - Parameters
    - lightning_class (Union[Callable[..., Union[Trainer, LightningModule, LightningDataModule, Callback]], Type[Trainer], Type[LightningModule], Type[LightningDataModule], Type[Callback]]) – A callable or any subclass of {Trainer, LightningModule, LightningDataModule, Callback}.
    - nested_key (str) – Name of the nested namespace to store arguments.
    - subclass_mode (bool) – Whether to allow any subclass of the given class.
  - Return type
    None
- add_lr_scheduler_args(lr_scheduler_class, nested_key='lr_scheduler', link_to='AUTOMATIC')[source]¶
  Adds arguments from a learning rate scheduler class to a nested key of the parser.
  - Parameters
    - lr_scheduler_class (Union[Type[_LRScheduler], Type[ReduceLROnPlateau], Tuple[Union[Type[_LRScheduler], Type[ReduceLROnPlateau]], ...]]) – Any subclass of torch.optim.lr_scheduler.{_LRScheduler, ReduceLROnPlateau}.
    - nested_key (str) – Name of the nested namespace to store arguments.
    - link_to (str) – Dot notation of a parser key to set arguments or AUTOMATIC.
  - Return type
    None
- class pytorch_lightning.utilities.cli.LightningCLI(model_class, datamodule_class=None, save_config_callback=<class 'pytorch_lightning.utilities.cli.SaveConfigCallback'>, save_config_filename='config.yaml', save_config_overwrite=False, trainer_class=<class 'pytorch_lightning.trainer.trainer.Trainer'>, trainer_defaults=None, seed_everything_default=None, description='pytorch-lightning trainer command line tool', env_prefix='PL', env_parse=False, parser_kwargs=None, subclass_mode_model=False, subclass_mode_data=False)[source]¶
  Bases: object
  Implementation of a configurable command line tool for pytorch-lightning.
  Receives as input pytorch-lightning classes (or callables which return pytorch-lightning classes), which are called / instantiated using a parsed configuration file and / or command line args, and then runs trainer.fit. Parsing of configuration from environment variables can be enabled by setting env_parse=True. A full configuration yaml would be parsed from PL_CONFIG if set. Individual settings are also parsed from variables named, for example, PL_TRAINER__MAX_EPOCHS.

  Example: first implement the trainer.py tool as:

  ```python
  from mymodels import MyModel
  from pytorch_lightning.utilities.cli import LightningCLI

  LightningCLI(MyModel)
  ```

  Then in a shell, run the tool with the desired configuration:

  ```bash
  $ python trainer.py --print_config > config.yaml
  $ nano config.yaml  # modify the config as desired
  $ python trainer.py --cfg config.yaml
  ```
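The double underscore in variable names like PL_TRAINER__MAX_EPOCHS separates nesting levels of the config. A hypothetical sketch of that naming scheme (the real parsing is handled by jsonargparse; `env_var_to_config_key` is an illustrative helper, not part of the API):

```python
# Hypothetical helper illustrating the PL_ environment-variable naming
# scheme; the actual parsing is done by jsonargparse, not a function like this.
def env_var_to_config_key(name: str, prefix: str = "PL") -> str:
    """Map e.g. 'PL_TRAINER__MAX_EPOCHS' to the config key 'trainer.max_epochs'."""
    assert name.startswith(prefix + "_"), f"expected {prefix}_ prefix in {name}"
    body = name[len(prefix) + 1:]           # drop the 'PL_' prefix
    return body.lower().replace("__", ".")  # '__' separates nesting levels

key = env_var_to_config_key("PL_TRAINER__MAX_EPOCHS")
```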
  Warning
  LightningCLI is in beta and subject to change.

  - Parameters
    - model_class (Union[Type[LightningModule], Callable[..., LightningModule]]) – LightningModule class to train on or a callable which returns a LightningModule instance when called.
    - datamodule_class (Union[Type[LightningDataModule], Callable[..., LightningDataModule], None]) – An optional LightningDataModule class or a callable which returns a LightningDataModule instance when called.
    - save_config_callback (Optional[Type[SaveConfigCallback]]) – A callback class to save the training config.
    - save_config_overwrite (bool) – Whether to overwrite an existing config file.
    - trainer_class (Union[Type[Trainer], Callable[..., Trainer]]) – An optional subclass of the Trainer class or a callable which returns a Trainer instance when called.
    - trainer_defaults (Optional[Dict[str, Any]]) – Set to override Trainer defaults or add persistent callbacks.
    - seed_everything_default (Optional[int]) – Default value for the seed_everything() seed argument.
    - description (str) – Description of the tool shown when running --help.
    - env_parse (bool) – Whether environment variable parsing is enabled.
    - parser_kwargs (Optional[Dict[str, Any]]) – Additional arguments to instantiate LightningArgumentParser.
    - subclass_mode_model (bool) – Whether model can be any subclass of the given class.
    - subclass_mode_data (bool) – Whether datamodule can be any subclass of the given class.
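The config file that `--print_config` emits and that `save_config_callback` later writes is plain YAML mirroring these nested argument groups. A minimal illustrative sketch of the layout (all keys beyond the group names, and all values, are hypothetical, not defaults taken from these docs):

```yaml
# Illustrative layout of a LightningCLI config file; the top-level keys
# mirror the argument groups, the values shown here are hypothetical.
seed_everything: 42
trainer:
  max_epochs: 100
model:
  learning_rate: 0.001  # hypothetical hyperparameter of the user's model
```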
- add_arguments_to_parser(parser)[source]¶
  Implement to add extra arguments to parser or link arguments.
  - Parameters
    - parser (LightningArgumentParser) – The argument parser object to which arguments can be added.
  - Return type
    None
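This hook follows a template-method pattern: the CLI builds the parser and then calls this method so a subclass can extend it. A hypothetical pure-argparse analogue of that pattern (`BaseCLI` and `MyCLI` are illustrative, not part of pytorch-lightning):

```python
import argparse

# Hypothetical analogue of the LightningCLI hook pattern: the base class
# builds the parser, then calls a hook that subclasses may override.
class BaseCLI:
    def __init__(self):
        self.parser = argparse.ArgumentParser()
        self.parser.add_argument("--max_epochs", type=int, default=1)
        self.add_arguments_to_parser(self.parser)  # subclass extension hook

    def add_arguments_to_parser(self, parser):
        pass  # default: no extra arguments

class MyCLI(BaseCLI):
    def add_arguments_to_parser(self, parser):
        parser.add_argument("--notes", type=str, default="")

args = MyCLI().parser.parse_args(["--notes", "baseline run"])
```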
- add_configure_optimizers_method_to_model()[source]¶
  Adds to the model an automatically generated configure_optimizers method.
  If a single optimizer, and optionally a scheduler, argument groups are added to the parser as 'AUTOMATIC', then a configure_optimizers method is automatically implemented in the model class.
  - Return type
    None
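Mechanically, this amounts to building a configure_optimizers function from the parsed init arguments and attaching it to the model class. A hypothetical sketch of that idea (`Model` and `make_configure_optimizers` are illustrative stand-ins, not the library's internals):

```python
# Hypothetical sketch: generate a configure_optimizers method from an
# optimizer-building callable and attach it to a model class.
class Model:
    learning_rate = 0.1

def make_configure_optimizers(optimizer_init):
    # optimizer_init: callable that builds the optimizer from the model
    def configure_optimizers(self):
        return optimizer_init(self)
    return configure_optimizers

# Attach the generated method; the real code builds torch optimizers instead.
Model.configure_optimizers = make_configure_optimizers(
    lambda model: {"optimizer": "SGD", "lr": model.learning_rate}
)

result = Model().configure_optimizers()
```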
- add_core_arguments_to_parser()[source]¶
  Adds arguments from the core classes to the parser.
  - Return type
    None
- before_instantiate_classes()[source]¶
  Implement to run some code before instantiating the classes.
  - Return type
    None
- fit()[source]¶
  Runs fit of the instantiated trainer class and prepared fit keyword arguments.
  - Return type
    None
- instantiate_trainer()[source]¶
  Instantiates the trainer using self.config_init['trainer'].
  - Return type
    None
- link_optimizers_and_lr_schedulers()[source]¶
  Creates argument links for optimizers and lr_schedulers that specified a link_to.
  - Return type
    None
- class pytorch_lightning.utilities.cli.SaveConfigCallback(parser, config, config_filename, overwrite=False)[source]¶
  Bases: pytorch_lightning.callbacks.base.Callback
  Saves a LightningCLI config to the log_dir when training starts.
  - Raises
    RuntimeError – If the config file already exists in the directory, to avoid overwriting a previous run.