How to switch optimizers during training

I am using the following script to switch between optimizers during training, but I get this error:

    self.trainer.accelerator_backend.setup_optimizers(self)
AttributeError: 'CPUBackend' object has no attribute 'setup_optimizers'

def configure_optimizers(self):
    # Alternate between Adam and SGD every other epoch
    if self.current_epoch % 2 == 0:
        return optim.Adam(self.parameters(), lr=self.hparams.lr)
    else:
        return optim.SGD(self.parameters(), lr=self.hparams.lr)

def on_epoch_start(self):
    # Intended to force the trainer to re-run configure_optimizers each epoch;
    # this is the line that raises the AttributeError
    self.trainer.accelerator_backend.setup_optimizers(self)
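
For context, here is a minimal, self-contained sketch of the kind of LightningModule these two methods live in. The LitModel name, the tiny linear layer, and the training_step are placeholders for illustration, not my actual code; only configure_optimizers and on_epoch_start are the relevant parts:

from torch import nn, optim
import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    def __init__(self, lr=1e-3):
        super().__init__()
        self.save_hyperparameters()      # exposes self.hparams.lr
        self.layer = nn.Linear(32, 2)    # placeholder model

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.cross_entropy(self(x), y)

    def configure_optimizers(self):
        # Alternate between Adam and SGD every other epoch
        if self.current_epoch % 2 == 0:
            return optim.Adam(self.parameters(), lr=self.hparams.lr)
        return optim.SGD(self.parameters(), lr=self.hparams.lr)

    def on_epoch_start(self):
        # Intended to make the trainer rebuild the optimizer at each epoch start;
        # this is the call that fails with the AttributeError above
        self.trainer.accelerator_backend.setup_optimizers(self)

# Run as usual (train_dataloader is assumed to exist):
# trainer = pl.Trainer(max_epochs=4)
# trainer.fit(LitModel(), train_dataloader)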