(pytorch-lightning 1.8.1) load_from_checkpoint: checkpoint['module_arguments'] KeyError

Hi,

I would like to obtain the hparams from a checkpoint, but 'module_arguments' is not in the checkpoint, even though I explicitly call self.save_hyperparameters(logger=False) in __init__:

from typing import List

import pytorch_lightning as pl


class BetaVAE(pl.LightningModule):
    def __init__(self,
                 in_channels: int,
                 latent_dim: int,
                 hidden_dims: List = None,
                 beta: int = 4,
                 gamma: float = 1000.,
                 max_capacity: int = 25,
                 Capacity_max_iter: int = 1e5,
                 **kwargs) -> None:
        super().__init__()
        # Explicitly save the constructor arguments as hyperparameters.
        self.save_hyperparameters(logger=False)
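
For reference, this is how I expected to get the hyperparameters back after training (a minimal sketch; the checkpoint path is just an example):

    # What I expected to work: load the module from a checkpoint
    # and read back the saved hyperparameters.
    model = BetaVAE.load_from_checkpoint("checkpoints/last.ckpt")
    print(model.hparams)  # expected: in_channels, latent_dim, beta, gamma, ...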

My checkpoint has dict_keys(['epoch', 'global_step', 'pytorch-lightning_version', 'state_dict', 'loops', 'callbacks', 'optimizer_states', 'lr_schedulers']) but nothing related to hparams.
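
I checked the keys by loading the file directly with torch (the path below is illustrative):

    import torch

    # Inspect the raw checkpoint dict to see which keys were actually saved.
    ckpt = torch.load("checkpoints/last.ckpt", map_location="cpu")
    print(ckpt.keys())  # no hparams-related entry shows up here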

After reading through all the posts online about loading hparams from a ckpt, I still could not figure out where I went wrong in my code.

Here are the versions of my packages:
pytorch 1.13.0
lightning-utilities 0.4.2
pytorch-lightning 1.8.1

Any suggestions or thoughts are highly appreciated!

Addressed via another post; this can be closed.