How to customize the trainer to restrict the parameter range during training?

hey @Unseo

You can override the `optimizer_step` hook in your `LightningModule` to achieve this.

class LitModel(LightningModule):
    def optimizer_step(self, *args, **kwargs):
        super().optimizer_step(*args, **kwargs)  # <- parameters are updated here
        with torch.no_grad():
            self.layer.weight.clamp_(...)  # <- now clamp the updated parameters in place

Also, we have moved discussions to GitHub Discussions. You might want to post there instead for a quicker response, as these forums will be marked read-only soon.

Thank you