Module not able to find parameters requiring a gradient

I have a model that runs fine outside of Lightning, but fails inside the Lightning framework. I get the following error, which seems intended for the case where someone is doing forward propagation without training:

RuntimeError: DistributedDataParallel is not needed when a module doesn't have any parameter that requires a gradient.

I am training, and I do have train dataloaders, so I can't understand why I'm hitting this issue. Any thoughts?
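For anyone hitting the same error: DDP raises it when no parameter in the module requires a gradient, which usually means every parameter was frozen (e.g. via `requires_grad_(False)`) before the strategy wrapped the model. A minimal sketch to check this before training, using a hypothetical stand-in model:

```python
import torch.nn as nn

# Hypothetical model for illustration; substitute your own LightningModule.
model = nn.Linear(4, 2)

# DDP needs at least one parameter with requires_grad=True, otherwise it
# raises the RuntimeError quoted above when the strategy wraps the model.
trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(f"trainable parameters: {trainable}")
assert trainable, "no trainable parameters - DDP will refuse to wrap this model"
```

If the list comes back empty, unfreeze at least one parameter (or switch to a strategy that doesn't wrap the model in DistributedDataParallel).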

Sorry for not seeing this earlier. I found your issue on GitHub and have answered there: DDP does not work if model has no parameters · Issue #17556 · Lightning-AI/lightning · GitHub