How to switch optimizers during training

Original answer from @goku :rocket:

I think this logic is now better placed in `configure_optimizers` itself (which also covers cases with complex schedulers or scheduler dicts), re-triggering it from a hook when the condition is met:

```python
from torch.optim import Adam, LBFGS

def on_epoch_start(self):
    # When the condition flips, ask the trainer to rebuild the
    # optimizers, which calls configure_optimizers again.
    if condition:
        self.trainer.accelerator_backend.setup_optimizers(self)

def configure_optimizers(self):
    # Return whichever optimizer the current condition selects.
    if condition:
        return Adam(...)
    else:
        return LBFGS(...)
```
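To make the pattern above concrete, here is a minimal runnable sketch of the selection logic using plain `torch` (no trainer loop). The `SWITCH_EPOCH` threshold, the module shape, and the learning rates are illustrative assumptions, not part of the original answer; inside Lightning, `self.current_epoch` is maintained by the trainer for you.

```python
import torch
from torch import nn
from torch.optim import Adam, LBFGS

SWITCH_EPOCH = 10  # assumed epoch at which to hand over to LBFGS

class SwitchingModule(nn.Module):
    """Mimics the hook pattern above with plain torch so it can run
    standalone: warm up with Adam, then switch to LBFGS."""

    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(4, 1)
        self.current_epoch = 0  # Lightning tracks this attribute for you

    def configure_optimizers(self):
        # The "condition" here is simply an epoch threshold.
        if self.current_epoch < SWITCH_EPOCH:
            return Adam(self.parameters(), lr=1e-3)
        return LBFGS(self.parameters(), lr=0.1)

module = SwitchingModule()
print(type(module.configure_optimizers()).__name__)  # Adam before the switch
module.current_epoch = SWITCH_EPOCH
print(type(module.configure_optimizers()).__name__)  # LBFGS afterwards
```

In a real `LightningModule`, only `configure_optimizers` and the hook are needed; the trainer invokes them at the right times.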