`.detach()` cannot stop backprop in `training_step`

The reason is that when `training_step` is called with `optimizer_idx=0`, the parameters that belong to the other optimizers (in this case `optimizer_idx=1`) are turned off by Lightning by default.

See the docs: https://pytorch-lightning.readthedocs.io/en/latest/api/pytorch_lightning.core.lightning.html#pytorch_lightning.core.lightning.LightningModule.toggle_optimizer

So when you do `self.generated_imgs = self(z)` there, the generator parameters already have their gradients turned off.
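You can verify this inside `training_step` with a quick check (a rough sketch; it assumes the generator is stored under a hypothetical attribute `self.generator`):

```python
if optimizer_idx == 0:
    # With the default toggling, none of the generator's parameters should
    # require grad in this branch, so backprop cannot reach them anyway.
    print(any(p.requires_grad for p in self.generator.parameters()))  # expected: False
    self.generated_imgs = self(z)
```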

Either do `self.disc(self(z))` by storing `z` from the `optimizer_idx=0` step, or change the default behaviour by overriding `toggle_optimizer` (see the sketch below).
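A minimal sketch of the second option, assuming a Lightning version where `toggle_optimizer(optimizer, optimizer_idx)` is called automatically before each `training_step` when you use multiple optimizers:

```python
import pytorch_lightning as pl


class GAN(pl.LightningModule):
    def toggle_optimizer(self, optimizer, optimizer_idx):
        # Default behaviour sets requires_grad=False on every parameter that does
        # not belong to `optimizer`. Making this a no-op keeps all parameters
        # trainable, so you control backprop yourself (e.g. with .detach()).
        pass
```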
