hi @Seungyoung_Park, for accumulated gradients you could also use the `accumulate_grad_batches` flag in the PyTorch Lightning `Trainer`. You can check it in the docs here.
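For example, passing `Trainer(accumulate_grad_batches=4)` makes the `Trainer` accumulate gradients from 4 consecutive batches and call `optimizer.step()` only once per group. As a rough sketch of that schedule (plain Python, no Lightning dependency; the function name and counters are just for illustration):

```python
# Sketch of the accumulate_grad_batches schedule: gradients from N
# consecutive batches are summed before a single optimizer step.
def count_optimizer_steps(num_batches, accumulate_grad_batches):
    optimizer_steps = 0
    accumulated = 0
    for batch_idx in range(num_batches):
        accumulated += 1  # loss.backward() would add into .grad here
        if accumulated == accumulate_grad_batches:
            optimizer_steps += 1  # optimizer.step(); optimizer.zero_grad()
            accumulated = 0
    return optimizer_steps

print(count_optimizer_steps(8, 4))  # 8 batches, step every 4 -> 2 steps
```

This gives you an effective batch size of `accumulate_grad_batches * batch_size` without the extra memory cost of a larger batch.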
Also, we have migrated to GitHub Discussions. To get a quicker response, please post your questions there.
Thanks