How to switch optimizers during training

`accelerator_backend` has been deprecated in favor of `accelerator`. See the issue here.

If you’re coming here post 1.4.0, you’ll need to use `self.trainer.accelerator.setup()` instead of `self.trainer.accelerator_backend.setup_optimizers(self)`.
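
For anyone landing here, a minimal sketch of how a mid-training optimizer switch can be wired up. This assumes a Lightning 1.4.x-era API where the trainer exposes `accelerator.setup_optimizers(trainer)`; the module name `SwitchableModule`, the `switch_epoch` parameter, the layer sizes, and the learning rates are all made up for illustration, and the exact re-setup call will differ across versions:

```python
import torch
import pytorch_lightning as pl


class SwitchableModule(pl.LightningModule):
    """Toy module that swaps SGD for Adam once `switch_epoch` is reached."""

    def __init__(self, switch_epoch: int = 10):
        super().__init__()
        self.switch_epoch = switch_epoch
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        # Re-invoked when optimizer setup is re-run below, so the branch
        # taken depends on the current epoch.
        if self.current_epoch < self.switch_epoch:
            return torch.optim.SGD(self.parameters(), lr=1e-2)
        return torch.optim.Adam(self.parameters(), lr=1e-3)

    def on_train_epoch_start(self):
        if self.current_epoch == self.switch_epoch:
            # Re-run optimizer setup so the trainer picks up the new optimizer.
            # Pre-1.4: self.trainer.accelerator_backend.setup_optimizers(self)
            # 1.4+:    go through trainer.accelerator, as noted above.
            self.trainer.accelerator.setup_optimizers(self.trainer)
```

Note that rebuilding the optimizer this way discards its internal state (e.g. momentum buffers), which is usually what you want when switching optimizer types anyway.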
