Implement a custom LR scheduler and optimizer in PyTorch Lightning

Hi all, I am trying to implement the optimizer from this code. I have already specified some variables:

import math
import numpy as np
import torch
from apex.parallel.LARC import LARC  # LARC wrapper (NVIDIA apex)

base_lr = 4.8
final_lr = 0
warmup_epochs = 10
start_warmup = 0
epochs = 100
weight_decay = 1e-6

# model and train_loader are defined earlier in my script
params = model.parameters()
optimizer = torch.optim.SGD(
    params,
    lr=base_lr,
    momentum=0.9,
    weight_decay=weight_decay,
)
optimizer = LARC(optimizer=optimizer, trust_coefficient=0.001, clip=False)

# linear warmup for the first warmup_epochs, then cosine decay towards final_lr,
# with one entry per training iteration
warmup_lr_schedule = np.linspace(start_warmup, base_lr, len(train_loader) * warmup_epochs)
iters = np.arange(len(train_loader) * (epochs - warmup_epochs))
cosine_lr_schedule = np.array([
    final_lr + 0.5 * (base_lr - final_lr)
    * (1 + math.cos(math.pi * t / (len(train_loader) * (epochs - warmup_epochs))))
    for t in iters
])
lr_schedule = np.concatenate((warmup_lr_schedule, cosine_lr_schedule))
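
Just as a sanity check on my side, the concatenated schedule should have exactly one entry per training iteration (the numbers in the comments assume a hypothetical train_loader with 500 batches, purely for illustration):

# e.g. if len(train_loader) were 500: 500 * 10 = 5,000 warmup entries
# followed by 500 * 90 = 45,000 cosine entries
assert len(lr_schedule) == len(train_loader) * epochs
print(lr_schedule[0])                                      # start_warmup = 0.0
print(lr_schedule[len(train_loader) * warmup_epochs - 1])  # peak at base_lr = 4.8
print(lr_schedule[-1])                                      # close to final_lr = 0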

When training begins, the learning rate is updated based on the epoch and iteration numbers:

for epoch in range(epochs):  # loop over the dataset multiple times
    total_loss = 0.0
    for it, data in enumerate(train_loader):
        # get the inputs; data is a list of [inputs, labels]
        # global iteration index into the precomputed lr_schedule
        iteration = epoch * len(train_loader) + it
        for param_group in optimizer.param_groups:
            param_group["lr"] = lr_schedule[iteration]
        # ... forward pass, loss, backward and optimizer.step() follow here ...

I wonder whether we can implement this optimizer with the configure_optimizers function in PyTorch Lightning, like this:

def configure_optimizers(self):
    return torch.optim.Adam(self.parameters(), lr=1e-3)

If possible, could you please give me some pseudo-code for this implementation? I have read many documents about this function but still can't understand it.
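
For what it is worth, here is my rough guess at how it might look, based on my code above. This is only a sketch, not something I have verified: I am not sure whether Lightning accepts the LARC-wrapped optimizer returned from configure_optimizers, or whether on_train_batch_start and self.trainer.global_step are the right way to reproduce my per-iteration update (base_lr, weight_decay, lr_schedule and LARC are the same objects as in my snippet above):

import pytorch_lightning as pl
import torch

class MyModule(pl.LightningModule):
    # ... __init__, forward and training_step are defined elsewhere ...

    def configure_optimizers(self):
        optimizer = torch.optim.SGD(
            self.parameters(),
            lr=base_lr,
            momentum=0.9,
            weight_decay=weight_decay,
        )
        # same LARC wrapper as in my plain-PyTorch code above
        return LARC(optimizer=optimizer, trust_coefficient=0.001, clip=False)

    def on_train_batch_start(self, batch, batch_idx):
        # my guess: index the precomputed lr_schedule with the global step,
        # mirroring the manual update in my training loop above
        for optimizer in self.trainer.optimizers:
            for param_group in optimizer.param_groups:
                param_group["lr"] = lr_schedule[self.trainer.global_step]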

Thanks for reading.