Hello,
Is there a simple way to have a checkpoint callback monitor a specified metric without any file being saved? When I set save_top_k to zero, it didn't keep the best metric in the checkpoint_callback. In summary, I want to be able to read checkpoint_callback.best_model_score after training, but without any .ckpt file being saved.
Hi,
I don't recommend using ModelCheckpoint for this purpose; it is meant for saving files. Simply add two lines of code to your LightningModule to track your best metric:

    if value < self.best_value:
        self.best_value = value

Or, if this logic is not specific to your LightningModule, you could implement a Callback that tracks your best results. Then set checkpoint_callback=False in the Trainer to prevent it from saving checkpoints by default.
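The Callback idea above can be sketched in plain Python (the class name, `monitor` argument, and `update` method are illustrative, not part of the Lightning API; in a real Lightning Callback this logic would live in a hook such as `on_validation_end`, reading the metric from `trainer.callback_metrics`):

```python
import math


class BestMetricTracker:
    """Minimal sketch: tracks the best (lowest) value of a monitored metric.

    In a real Lightning Callback, `update` would be replaced by the
    `on_validation_end(self, trainer, pl_module)` hook.
    """

    def __init__(self, monitor="val_loss"):
        self.monitor = monitor
        self.best_value = math.inf  # lower is better for this sketch

    def update(self, metrics):
        # Ignore steps where the monitored metric was not logged.
        value = metrics.get(self.monitor)
        if value is not None and value < self.best_value:
            self.best_value = value
```

After training you would read `tracker.best_value` instead of `checkpoint_callback.best_model_score`, and no file is ever written.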
Agreed with @awaelchli, but if you still want it, just subclass ModelCheckpoint and override these two methods; that should work:
    from pytorch_lightning import Trainer
    from pytorch_lightning.utilities import rank_zero_only
    from pytorch_lightning.callbacks import ModelCheckpoint

    class MyModelCheckpoint(ModelCheckpoint):
        """Monitors the metric as usual but never writes or deletes .ckpt files."""

        @rank_zero_only
        def _del_model(self, *_):
            pass  # skip deleting old checkpoints

        def _save_model(self, *_):
            pass  # skip writing checkpoint files

    trainer = Trainer(callbacks=[MyModelCheckpoint(monitor='some_metric', ...), ...])