When training a model, I save the hyperparameters with self.save_hyperparameters() in my LightningModule's __init__(). This works as intended.
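For context, the setup is essentially the following (the class name and hyperparameters are placeholders):

```python
import pytorch_lightning as pl
import torch

class MyModel(pl.LightningModule):  # hypothetical example module
    def __init__(self, hidden_dim: int = 64, lr: float = 1e-3):
        super().__init__()
        # Stores hidden_dim and lr in self.hparams and embeds them in every
        # checkpoint; the logger also writes them out as hparams.yaml.
        self.save_hyperparameters()
        self.layer = torch.nn.Linear(hidden_dim, 1)
```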
But when I use the resulting checkpoint for predictions through a Trainer, Lightning keeps generating a useless lightning_logs directory, with one version_N subdirectory per prediction run, each containing only an hparams.yaml file. Is there really no way to disable this behaviour?
I tried the following, without success:
```python
prediction_trainer = pl.Trainer(inference_mode=True, logger=None, default_root_dir=None)
prediction_trainer.predict(self.model, dataloaders=prediction_dataLoader, ckpt_path=self.full_ckpt_path)
```
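In case the distinction matters: my reading of the Trainer docs is that logger=False, not logger=None, is what disables logging (None falls back to the default TensorBoardLogger/CSVLogger, which is exactly the component that writes hparams.yaml). So a sketch of the variant I would expect to suppress the directory, which I have not verified, is:

```python
import pytorch_lightning as pl

# logger=False disables the default logger entirely, so no
# lightning_logs/version_N directory (and no hparams.yaml) should be created.
# enable_checkpointing=False is probably redundant for predict(), but makes
# explicit that no ModelCheckpoint callback is configured either.
prediction_trainer = pl.Trainer(
    inference_mode=True,
    logger=False,
    enable_checkpointing=False,
)
predictions = prediction_trainer.predict(
    self.model,
    dataloaders=prediction_dataLoader,
    ckpt_path=self.full_ckpt_path,
)
```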
I also don’t really understand the purpose of the ckpt_path parameter: I already have to instantiate the model with .load_from_checkpoint() anyway, otherwise predict() won’t accept it.
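To make the apparent redundancy concrete (MyModel is again a placeholder for my module class), I end up writing something like this, where the same weights are seemingly loaded twice, once by load_from_checkpoint() and once via ckpt_path:

```python
# Rebuild the model from the checkpoint (restores hparams and weights)...
model = MyModel.load_from_checkpoint(self.full_ckpt_path)

# ...yet predict() still needs the instance, and ckpt_path then appears to
# reload the very same weights into it before inference starts.
predictions = prediction_trainer.predict(
    model,
    dataloaders=prediction_dataLoader,
    ckpt_path=self.full_ckpt_path,
)
```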