Hi, I'm running into a problem when using LRFinder together with FSDP.
As the documentation suggests, I configure my optimizer with
def configure_optimizers(self):
    return optim.AdamW(self.trainer.model.parameters(), lr=self.learning_rate, weight_decay=self.weight_decay)
In my main.py, I use
tuner = Tuner(cli.trainer)
tuner.lr_find(cli.model, datamodule=cli.datamodule)
to search for the optimal learning rate, and then call
cli.trainer.fit(cli.model, datamodule=cli.datamodule)
to fit the model. However, fit fails in this setup. I traced the error and found that when the configure_optimizers hook is called inside fit, self.trainer.model.parameters() returns an empty iterator. So I'd like to know how LRFinder should be used together with FSDP. Thanks.
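For reference, here is a minimal sketch of my full setup. It assumes Lightning 2.x with LightningCLI(run=False); the model and datamodule classes are simplified placeholders, and the import paths are what I believe the current API uses:

```python
# Minimal repro sketch (assumptions: Lightning 2.x, single-node FSDP).
# MyModel and MyDataModule are illustrative stand-ins for my real classes.
import torch
import torch.optim as optim
import lightning.pytorch as pl
from lightning.pytorch.cli import LightningCLI
from lightning.pytorch.tuner import Tuner


class MyModel(pl.LightningModule):
    def __init__(self, learning_rate: float = 1e-3, weight_decay: float = 1e-2):
        super().__init__()
        self.learning_rate = learning_rate
        self.weight_decay = weight_decay
        self.layer = torch.nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        # Per the FSDP docs, build the optimizer from the wrapped
        # module's parameters instead of self.parameters().
        return optim.AdamW(
            self.trainer.model.parameters(),
            lr=self.learning_rate,
            weight_decay=self.weight_decay,
        )


if __name__ == "__main__":
    cli = LightningCLI(MyModel, MyDataModule, run=False)  # trainer configured with strategy="fsdp"

    tuner = Tuner(cli.trainer)
    tuner.lr_find(cli.model, datamodule=cli.datamodule)   # LR search runs

    # This call fails: inside fit, configure_optimizers sees an
    # empty self.trainer.model.parameters().
    cli.trainer.fit(cli.model, datamodule=cli.datamodule)
```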