The reason is that when `training_step` is called with `optimizer_idx=0`, Lightning by default turns off gradients for the parameters owned by the other optimizers (in this case, the one at `optimizer_idx=1`). So when you run `self.generated_imgs = self(z)` there, the generator's parameter gradients are turned off. You can either store `z` and call `self.disc(self(z))` in the step with `optimizer_idx=0`, so the forward pass through the generator is recomputed, or change the default behaviour by overriding `toggle_optimizer`.
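Under the hood this is just `requires_grad` toggling. Here is a minimal pure-PyTorch sketch of what goes wrong and why recomputing the forward pass fixes it; the `gen`/`disc` modules are hypothetical stand-ins, not your actual model:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in networks, just to illustrate the toggling.
gen = nn.Linear(4, 4)    # "generator"
disc = nn.Linear(4, 1)   # "discriminator"

z = torch.randn(2, 4)

# What Lightning's default toggle_optimizer does: parameters owned by
# the *other* optimizer get requires_grad=False during this step.
for p in gen.parameters():
    p.requires_grad = False

# Forward pass while the generator is toggled off: no graph is built
# through gen, so backward() leaves its gradients empty.
loss = disc(gen(z)).mean()
loss.backward()
grad_while_toggled_off = gen.weight.grad   # stays None

# Fix: keep z around and redo the forward pass once the generator's
# parameters are active again -- i.e. self.disc(self(z)).
for p in gen.parameters():
    p.requires_grad = True
loss = disc(gen(z)).mean()
loss.backward()
grad_after_recompute = gen.weight.grad     # now populated
```

Storing `z` instead of `self.generated_imgs` is the cheaper fix, since the stored images carry no autograd graph through the generator anyway.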