Skip loss backward and optimizer step if loss undefined for one batch

I have an edge case where I train an RNN over several batches, but occasionally a batch has no ground truth available (the frames are not annotated). I still want to run the forward pass on those batches so that the hidden state gets updated.

Is it possible to skip both the backward pass and the optimizer step when the loss is None or otherwise undefined?
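In plain PyTorch this can be sketched roughly as follows: run the forward pass for every batch, but only call `backward()` and `optimizer.step()` when targets exist. The model, sizes, and batch data below are hypothetical placeholders, not from the original question. (In PyTorch Lightning specifically, returning `None` from `training_step` is documented to skip optimization for that batch, which may also fit this use case.)

```python
import torch
import torch.nn as nn

# Hypothetical minimal setup: a small RNN with a classification head,
# and batches where some have no annotations (targets is None).
rnn = nn.RNN(input_size=4, hidden_size=8, batch_first=True)
head = nn.Linear(8, 2)
optimizer = torch.optim.SGD(
    list(rnn.parameters()) + list(head.parameters()), lr=0.01
)

hidden = None  # hidden state carried across batches

batches = [
    (torch.randn(1, 5, 4), torch.randint(0, 2, (1, 5))),  # annotated batch
    (torch.randn(1, 5, 4), None),                         # unannotated batch
]

for inputs, targets in batches:
    out, hidden = rnn(inputs, hidden)
    hidden = hidden.detach()  # truncate backprop across batch boundaries

    if targets is None:
        # No ground truth: the forward pass above already updated the
        # hidden state, so skip backward and the optimizer step entirely.
        continue

    loss = nn.functional.cross_entropy(
        head(out).reshape(-1, 2), targets.reshape(-1)
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The key point is that skipping `backward()` and `step()` is safe: nothing in PyTorch requires a backward pass per forward pass, and detaching the carried hidden state keeps the unused graph from the skipped batch from growing across iterations.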

Hello, my apologies for the late reply. We are slowly converging on deprecating this forum in favor of the built-in GitHub version… Could we kindly ask you to recreate your question there: Lightning Discussions