SaveConfigCallback
- class lightning.pytorch.cli.SaveConfigCallback(parser, config, config_filename='config.yaml', overwrite=False, multifile=False, save_to_log_dir=True)
Bases: Callback
Saves a LightningCLI config to the log_dir when training starts.
- Parameters:
  - parser (LightningArgumentParser) – The parser object used to parse the configuration.
  - config (Namespace) – The parsed configuration that will be saved.
  - config_filename (str) – Filename for the config file.
  - overwrite (bool) – Whether to overwrite an existing config file.
  - multifile (bool) – When the input is multiple config files, the saved config preserves this structure.
  - save_to_log_dir (bool) – Whether to save the config to the log_dir.
- Raises:
  RuntimeError – If the config file already exists in the directory, to avoid overwriting a previous run.
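The overwrite guard can be sketched in plain Python. This is a minimal mimic for illustration, not Lightning's actual implementation; the function name and the `log_dir`-plus-filename layout are assumptions:

```python
from pathlib import Path


def check_config_path(log_dir, config_filename="config.yaml", overwrite=False):
    # Hypothetical mimic of the guard applied when training starts: refuse to
    # clobber a config left behind by a previous run unless overwrite=True.
    path = Path(log_dir) / config_filename
    if path.exists() and not overwrite:
        raise RuntimeError(
            f"{path} already exists. Aborting to avoid overwriting results of a previous run."
        )
    return path
```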
- save_config(trainer, pl_module, stage)
Override to save the config in some place other than, or in addition to, the standard log_dir.
- Return type:
  None

Example
```python
def save_config(self, trainer, pl_module, stage):
    if isinstance(trainer.logger, Logger):
        config = self.parser.dump(self.config, skip_none=False)  # Required for proper reproducibility
        trainer.logger.log_hyperparams({"config": config})
```
Note
This method is only called on rank zero. This makes it possible to implement a custom save config without having to worry about ranks or race conditions. Since it only runs on rank zero, any collective call would make the process hang, waiting for a broadcast. If you need to make collective calls, implement the setup method instead.
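The rank-zero constraint described in the note can be illustrated with a small standalone sketch. Assumptions: the global rank is read from the `RANK` environment variable, as with common `torch.distributed` launchers, and `write_fn` is a hypothetical save routine:

```python
import os


def save_on_rank_zero(write_fn, *args, **kwargs):
    # Only the global-rank-0 process performs the write; every other rank
    # skips it entirely. Because the other ranks never enter this branch,
    # a collective call placed inside write_fn would leave rank zero blocked
    # forever waiting for peers -- hence the advice to move collective work
    # into setup(), which runs on every rank.
    if int(os.environ.get("RANK", "0")) == 0:
        return write_fn(*args, **kwargs)
    return None
```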