How to step the optimizer twice inside one training loop?

Hi, I’m trying to implement adversarial training using PyTorch Lightning.

The problem is that my adversarial training code in plain PyTorch looks like this:

for i, (data, target) in enumerate(train_dataloader):
    ...
    # first step: clean loss
    optimizer.zero_grad()
    loss = loss_fn(model(data), target)
    loss.backward()
    optimizer.step()
    ...
    # second step: adversarial loss on the perturbed batch
    adv_data = perturb(data)
    optimizer.zero_grad()
    adv_loss = loss_fn(model(adv_data), target)
    adv_loss.backward()
    optimizer.step()

That is, the optimizer steps twice with two different losses within a single loop iteration. I could use some help on how to implement this with PyTorch Lightning.

Thank you!

Hey,

It looks like you need our manual optimization mode, which lets you do exactly this!
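
Here is a rough sketch of how that loop could look inside a LightningModule with manual optimization enabled. The wrapped model, the perturb attack, and the hyperparameters below are placeholders, so treat this as a starting point rather than a drop-in implementation:

import torch
from torch import nn
import pytorch_lightning as pl

class AdversarialModule(pl.LightningModule):
    def __init__(self, model, epsilon=0.01):
        super().__init__()
        self.model = model
        self.loss_fn = nn.CrossEntropyLoss()
        self.epsilon = epsilon
        # disable automatic optimization so we call backward() and step() ourselves
        self.automatic_optimization = False

    def perturb(self, data):
        # placeholder attack; swap in your FGSM/PGD/etc. here
        return data + self.epsilon * torch.randn_like(data)

    def training_step(self, batch, batch_idx):
        data, target = batch
        opt = self.optimizers()

        # first step: clean loss
        opt.zero_grad()
        loss = self.loss_fn(self.model(data), target)
        self.manual_backward(loss)
        opt.step()

        # second step: adversarial loss on the perturbed batch
        adv_data = self.perturb(data)
        opt.zero_grad()
        adv_loss = self.loss_fn(self.model(adv_data), target)
        self.manual_backward(adv_loss)
        opt.step()

        self.log_dict({"loss": loss, "adv_loss": adv_loss})

    def configure_optimizers(self):
        return torch.optim.SGD(self.model.parameters(), lr=1e-3)

You then train it the usual way, e.g. pl.Trainer(max_epochs=10).fit(AdversarialModule(model), train_dataloader); Lightning calls training_step for every batch, and both of your manual step() calls run as written.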

Let me know if you still have any questions.

Cheers,
Justus