Hey guys,
I have implemented a model that uses the Adam optimizer. I want to set up a scheduler that reduces the learning rate when two metrics haven't decreased for five epochs. I tried passing a list of two strings to the monitor parameter, but it didn't work. This is the method I have implemented:
def configure_optimizers(self):
    optimizer = torch.optim.Adam(
        self.model.parameters(),
        lr=self.learning_rate, betas=(0.5, 0.999))
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
        optimizer, mode="min", factor=0.9, patience=5, verbose=True)
    return {
        "optimizer": optimizer,
        "lr_scheduler": {
            "scheduler": scheduler,
            "monitor": "val_loss",
        },
    }
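
For reference, this is roughly what my failed attempt looked like (same method as above, only the monitor value changed to a list; I'm reconstructing it from memory here):

def configure_optimizers(self):
    optimizer = torch.optim.Adam(
        self.model.parameters(),
        lr=self.learning_rate, betas=(0.5, 0.999))
    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
        optimizer, mode="min", factor=0.9, patience=5, verbose=True)
    return {
        "optimizer": optimizer,
        "lr_scheduler": {
            "scheduler": scheduler,
            # passing a list of metric names here is what didn't work for me
            "monitor": ["val_loss", "val_l2_loss"],
        },
    }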
I want to monitor both val_loss and val_l2_loss. Is it possible to monitor multiple metrics to change the learning rate of one optimizer?