LR_Finder for FSDP

Hi, I am running into problems when using the LR finder together with FSDP.
Following the documentation, I use

    def configure_optimizers(self):
        # Build the optimizer over the FSDP-wrapped module's parameters
        return optim.AdamW(self.trainer.model.parameters(),
                           lr=self.learning_rate, weight_decay=self.weight_decay)

to configure my optimizer. In my main.py, I use

    tuner = Tuner(cli.trainer)
    tuner.lr_find(cli.model, datamodule=cli.datamodule)

to search for the optimal learning rate, and then call

    cli.trainer.fit(cli.model, datamodule=cli.datamodule)

to fit the model. However, fit fails in this case. I traced through the internals and found that when fit calls the configure_optimizers hook, self.trainer.model.parameters() is empty. So I am wondering how I should use the LR finder when training with FSDP. Thanks.
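
For context, here is a minimal, self-contained sketch of the pattern I am using (the module, data, and Trainer arguments below are placeholders rather than my real training code, and the import paths assume Lightning 2.x):

    import torch
    from torch import nn, optim
    from torch.utils.data import DataLoader
    import lightning.pytorch as pl
    from lightning.pytorch.tuner import Tuner

    class DemoModule(pl.LightningModule):
        def __init__(self, learning_rate=1e-3, weight_decay=1e-2):
            super().__init__()
            self.learning_rate = learning_rate
            self.weight_decay = weight_decay
            self.layer = nn.Linear(32, 2)

        def training_step(self, batch, batch_idx):
            loss = self.layer(batch).sum()
            self.log("train_loss", loss)
            return loss

        def configure_optimizers(self):
            # Build the optimizer over the FSDP-wrapped module's parameters
            return optim.AdamW(self.trainer.model.parameters(),
                               lr=self.learning_rate,
                               weight_decay=self.weight_decay)

    if __name__ == "__main__":
        model = DemoModule()
        # DataLoader over a plain tensor: each batch is an (8, 32) float tensor
        train_loader = DataLoader(torch.randn(64, 32), batch_size=8)
        # FSDP needs GPUs; devices/max_epochs here are just for the sketch
        trainer = pl.Trainer(strategy="fsdp", accelerator="gpu", devices=2,
                             max_epochs=1)
        tuner = Tuner(trainer)
        tuner.lr_find(model, train_dataloaders=train_loader)  # sets model.learning_rate
        trainer.fit(model, train_dataloaders=train_loader)    # this is where it fails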

Hi!
FSDP support is still in an experimental phase, so I am not surprised that this comes up. You are probably one of the first to try the LR finder with FSDP :))
I think the best thing to do is to open a bug report issue on the Lightning GitHub so we can make sure this gets supported in the future.