Multiple metrics in the TensorBoard HParams tab at checkpoint time

Hello,
I want to be able to see other metrics in the HParams tab of TensorBoard. I know that I can get the best score of the monitored metric by using checkpoint_callback.best_model_score after training, but how do I get the other metrics at the best epoch?

As a suggestion, it could be useful to connect the checkpoint callback to the logger somehow, so that when the checkpoint is saved, the checkpoint callback adds previously specified metrics to the logger, for example via the logger.log_hyperparams function.

You can write a custom callback that writes the metric (or metrics) you want with logger.log_hyperparams at the end of each epoch, whenever the monitored metric is better than it was on previous epochs.
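Something like this, for example. It is only a minimal sketch assuming a TensorBoardLogger; the callback name (BestMetricsToHParams) and the metric keys (val_loss, val_acc) are placeholders for whatever you actually log:

import pytorch_lightning as pl


class BestMetricsToHParams(pl.Callback):
    # Re-logs hparams together with the chosen metrics every time the monitored
    # metric improves, so the HParams tab shows the values from the best epoch.
    def __init__(self, monitor="val_loss", extra_metrics=("val_acc",), mode="min"):
        self.monitor = monitor
        self.extra_metrics = extra_metrics
        self.mode = mode
        self.best = None

    def on_validation_epoch_end(self, trainer, pl_module):
        logged = trainer.callback_metrics
        current = logged.get(self.monitor)
        if current is None:
            return
        improved = (
            self.best is None
            or (self.mode == "min" and current < self.best)
            or (self.mode == "max" and current > self.best)
        )
        if improved:
            self.best = current
            metrics = {
                k: float(logged[k])
                for k in (self.monitor, *self.extra_metrics)
                if k in logged
            }
            # TensorBoardLogger.log_hyperparams accepts an optional metrics dict
            trainer.logger.log_hyperparams(pl_module.hparams, metrics)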

You can override on_save_checkpoint in your LightningModule and attach the logged metrics to every checkpoint that is saved.

def on_save_checkpoint(self, checkpoint):
    # store the currently logged metrics inside the checkpoint dict
    checkpoint['ckpt_metrics'] = self.trainer.logged_metrics

After training, you can access them with torch.load(checkpoint_callback.best_model_path)['ckpt_metrics'].
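For example, a quick sketch (assuming checkpoint_callback is the ModelCheckpoint instance you passed to the Trainer):

import torch

# load the best checkpoint on CPU and read the metrics stored by on_save_checkpoint
ckpt = torch.load(checkpoint_callback.best_model_path, map_location="cpu")
print(ckpt["ckpt_metrics"])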
