LR Scheduler monitoring multiple metrics

Hey guys,

I have implemented a model that uses the Adam optimizer. I want to set up a scheduler that reduces the learning rate when two metrics haven't decreased for five consecutive epochs. I tried passing a list of two strings to the monitor parameter, but it didn't work. This is the method I have implemented:

def configure_optimizers(self):
    optimizer = torch.optim.Adam(
        self.model.parameters(),
        lr=self.learning_rate, betas=(0.5, 0.999))

    scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
        optimizer, mode="min", factor=0.9, patience=5, verbose=True)

    return {
        "optimizer": optimizer,
        "lr_scheduler" : {
            "scheduler": scheduler,
             "monitor": "val_loss"
        }
    }

I want to monitor both val_loss and val_l2_loss. Is it possible to monitor multiple metrics to change the learning rate of one optimizer?

@fabricionarcizo It is not possible to monitor multiple metrics directly like that, but you should be able to achieve the same effect by defining a third metric as a function of the other two and monitoring that instead. For example:

    return {
        "optimizer": optimizer,
        "lr_scheduler": {
            "scheduler": scheduler,
            "monitor": "val_and_val_l2_loss"
        }
    }

and in your validation code you log the combined metric:

self.log("val_and_val_l2_loss", val_loss + val_l2_loss)
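For instance, a minimal sketch of what the validation step could look like (the loss names and the helper methods that compute them are just placeholders for whatever you already have in your model):

def validation_step(self, batch, batch_idx):
    # Hypothetical losses -- replace with however you actually compute them.
    val_loss = self._reconstruction_loss(batch)
    val_l2_loss = self._l2_loss(batch)

    # Log each metric on its own so you can still inspect them separately.
    self.log("val_loss", val_loss)
    self.log("val_l2_loss", val_l2_loss)

    # The combined metric is the one ReduceLROnPlateau monitors.
    self.log("val_and_val_l2_loss", val_loss + val_l2_loss)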

Do you think that would work for you?

I appreciate your help. Unfortunately, this approach doesn't work for training my model because I need to know precisely which of the two metrics has stopped decreasing. Something along the lines of the sketch below is what I'm after.
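Roughly, the per-metric tracking I have in mind looks like this (just a plain-Python sketch with placeholder names, not code from my model, and not tied to Lightning's scheduler config):

class PerMetricPlateau:
    """Track each metric separately and report which ones have plateaued."""

    def __init__(self, metric_names, patience=5):
        self.patience = patience
        self.best = {name: float("inf") for name in metric_names}
        self.bad_epochs = {name: 0 for name in metric_names}

    def update(self, metrics):
        """Return the names of metrics that haven't improved for `patience` epochs."""
        plateaued = []
        for name, value in metrics.items():
            if value < self.best[name]:
                self.best[name] = value
                self.bad_epochs[name] = 0
            else:
                self.bad_epochs[name] += 1
                if self.bad_epochs[name] >= self.patience:
                    plateaued.append(name)
                    self.bad_epochs[name] = 0
        return plateaued


# Hypothetical usage at the end of each validation epoch:
# plateaued = tracker.update({"val_loss": val_loss, "val_l2_loss": val_l2_loss})
# if "val_loss" in plateaued:
#     for group in optimizer.param_groups:
#         group["lr"] *= 0.9  # same factor as the ReduceLROnPlateau above

This way I know exactly which metric triggered the reduction and can react differently to each one.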