How to get the checkpoint without saving it?

When I train a LightningModule using a Trainer, how do I get the checkpoint object (which is presumably a python dict) without saving it to disk?

There is currently no built-in mechanism to obtain the full checkpoint dict without saving it to disk. However, you can access the full dict right before it is written by overriding the `on_save_checkpoint` hook on the LightningModule (or on a Callback):

class MyModel(LightningModule):
    def on_save_checkpoint(self, checkpoint):
        # `checkpoint` is the complete dict that would be written to disk
        self.last_checkpoint = checkpoint

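To illustrate the idea without a full training run, here is a minimal, self-contained sketch. `DummyBase` is a stand-in for `LightningModule` (an assumption, so the snippet runs without Lightning installed), and the checkpoint contents are made up; in real training the Trainer invokes the hook for you and passes the actual checkpoint dict:

```python
import copy


class DummyBase:
    """Stand-in for LightningModule so this sketch runs without Lightning."""
    pass


class MyModel(DummyBase):
    def on_save_checkpoint(self, checkpoint):
        # Called just before the dict is written to disk; keep a deep
        # copy so later mutations of the dict don't affect our snapshot.
        self.last_checkpoint = copy.deepcopy(checkpoint)


model = MyModel()
# Simulate what the Trainer does right before saving (hypothetical contents):
model.on_save_checkpoint({"epoch": 3, "state_dict": {"w": [1.0, 2.0]}})
# The full checkpoint dict is now available in memory as model.last_checkpoint.
```

Storing a deep copy is the safer choice here, since Lightning may reuse or modify the dict after the hook returns.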
An alternative: if you don’t care about the full Trainer state included in the checkpoint and only need the model weights, you can access `model.state_dict()`, which contains all parameters. This works for most strategies, with the exception of DeepSpeed and FSDP, where the parameters are sharded across processes.
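A minimal sketch of the weights-only route, assuming a plain PyTorch model (a LightningModule is an `nn.Module`, so `nn.Linear` stands in for your model here):

```python
import io

import torch
from torch import nn

# nn.Linear stands in for a LightningModule; both are nn.Modules.
model = nn.Linear(4, 2)

# state_dict() maps parameter names to tensors, entirely in memory.
state = model.state_dict()

# If you additionally want a serialized checkpoint without touching disk,
# torch.save accepts any file-like object, e.g. an in-memory buffer.
buf = io.BytesIO()
torch.save(state, buf)
```

From here you can copy, inspect, or transmit the weights however you like, with no file ever created.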

To give a better answer, we would probably need a bit more information about your use case.