Checkpointing and Restoring

I have a question. I’m using a LightningModule “A” for pretraining, in which I use a model “A” and some losses that are specific to the pretraining.
To save the weights I am using the WandbLogger together with a ModelCheckpoint callback, and the stored results look like this:

/save_dir/project_name/run_name/checkpoints/file.ckpt/checkpoint/{model.pt, optimize.pt}

Later on I will use a second LightningModule “B” that inherits from class “A” and uses the same model “A”, with an additional loss and a different training structure.
I would like to load the weights for model “A” only: how can I do that? And how can I save only the model’s weights, without all the other unnecessary weights, since they don’t require grad?
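To make the question concrete, here is a minimal sketch of what I have in mind: filtering the checkpoint’s state dict down to the entries of the shared model and loading only those into “B”. The class and attribute names (`ModelA`, `self.model`, `pretrain_head`) are illustrative placeholders, not my real code, and I use plain `nn.Module` stand-ins instead of full LightningModules:

```python
import torch
import torch.nn as nn

class ModelA(nn.Module):
    """Stand-in for the shared model 'A'."""
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 2)

class LitA(nn.Module):
    """Stand-in for the pretraining LightningModule 'A'."""
    def __init__(self):
        super().__init__()
        self.model = ModelA()
        self.pretrain_head = nn.Linear(2, 2)  # pretraining-only weights

class LitB(nn.Module):
    """Stand-in for the downstream LightningModule 'B'."""
    def __init__(self):
        super().__init__()
        self.model = ModelA()

# Take the full (pretraining) state dict and keep only the "model." entries,
# stripping the prefix so the keys match b.model's own state dict.
full_state = LitA().state_dict()
model_only = {k.removeprefix("model."): v
              for k, v in full_state.items() if k.startswith("model.")}

b = LitB()
b.model.load_state_dict(model_only)  # loads only model "A"'s weights
```

For the second part of the question, I am aware that `ModelCheckpoint` has a `save_weights_only=True` option, but I am not sure whether that also skips the weights that don’t require grad, or only the optimizer/trainer state.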

Secondly, why is my “last.ckpt” a folder rather than a single file?