Do I need to detach when using self.logger.experiment.add_scalars?

I am aware that when we use self.log("train_loss", loss), for instance, the loss tensor is automatically detached to avoid a memory leak (otherwise the logger would keep the computation graph alive).

However, if I am logging something else through self.logger.experiment.add_scalars() or self.logger.experiment.add_image(), do I need to manually detach what is being logged?

It depends on the logger you are using. As far as I know, TensorBoard will do it for you: the SummaryWriter converts tensors to NumPy before writing, which detaches them.
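That said, detaching explicitly is cheap and keeps you safe regardless of the logger backend. Below is a minimal sketch of what that looks like inside a training_step; it assumes the TensorBoardLogger (so self.logger.experiment is a SummaryWriter) and a hypothetical self.layer forward pass, neither of which comes from the original question.

```python
import torch
import torch.nn.functional as F
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)  # hypothetical model

    def training_step(self, batch, batch_idx):
        x, y = batch
        y_hat = self.layer(x)
        loss = F.mse_loss(y_hat, y)

        # self.log detaches the tensor for you.
        self.log("train_loss", loss)

        # When calling the raw SummaryWriter directly, detach explicitly
        # so nothing you log can hold on to the computation graph.
        self.logger.experiment.add_scalars(
            "losses",
            {"train": loss.detach()},
            global_step=self.global_step,
        )
        return loss
```

For images, the same idea applies: pass something like img.detach().cpu() to add_image() so the logged tensor is divorced from both the graph and the GPU.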