Error in `lr_scheduler_step()` function

Here are the definitions of configure_optimizers() and lr_scheduler_step() from my custom PyTorch Lightning LightningModule.

def configure_optimizers(self):
    opt_args = self.config.optimizer_args
    ## hardcoded for now; these should all be read directly from the config file
    optimizer = torch.optim.Adam(self.parameters(),
                                 lr=opt_args.base_lr,
                                 weight_decay=opt_args.weight_decay)
    scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=opt_args.step, gamma=0.2)

    return {'optimizer': optimizer, 'lr_scheduler': scheduler}

def lr_scheduler_step(self, scheduler):
    scheduler.step()

On running the program, I get the following error:

File "", line 42, in <module>
    datamodule=datamodule)
TypeError: lr_scheduler_step() takes 2 positional arguments but 4 were given

What am I doing wrong?

You didn’t specify all of the arguments in that hook’s signature: the trainer calls it with extra positional arguments beyond the scheduler. Since you don’t need them, you can simply accept and ignore the extras:

def lr_scheduler_step(self, scheduler, *args, **kwargs):
    scheduler.step()
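For context, here is a plain-Python sketch (no Lightning required) of why the signature matters. The exact extra arguments the trainer passes (e.g. an optimizer index and a monitored metric in older Lightning versions) are an assumption here; the point is that `*args, **kwargs` absorbs whatever extras arrive:

```python
class DummyScheduler:
    """Stand-in for a torch LR scheduler; counts step() calls."""
    def __init__(self):
        self.steps = 0
    def step(self):
        self.steps += 1

class StrictModule:
    # Signature accepting only (self, scheduler) -- as in the question.
    def lr_scheduler_step(self, scheduler):
        scheduler.step()

class FlexibleModule:
    # Extra positionals/keywords are absorbed by *args / **kwargs.
    def lr_scheduler_step(self, scheduler, *args, **kwargs):
        scheduler.step()

sched = DummyScheduler()

# Simulate the trainer's call: hook(scheduler, <optimizer_idx>, <metric>)
try:
    StrictModule().lr_scheduler_step(sched, 0, None)
except TypeError as e:
    print("strict:", e)  # e.g. "takes 2 positional arguments but 4 were given"

FlexibleModule().lr_scheduler_step(sched, 0, None)
print("flexible steps:", sched.steps)
```

The strict version reproduces the reported TypeError, while the flexible version runs and steps the scheduler once.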