Specifically, I would like to manually define training and validation loops with for statements using Lightning Fabric, while still benefiting from the convenient callbacks mentioned above.
If this is not possible, are there any convenient packages that provide callback-like functionality for plain PyTorch training loops, or do I have to create my own?
You can pass arbitrary callback objects to Fabric and call their callback methods from your custom training loop, so you would be able to implement your own checkpointing or early stopping with this tool.
An out-of-the-box early-stopping or checkpointing callback could be considered in the future.
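For illustration, here is a minimal sketch of that pattern, assuming Fabric's `callbacks=` argument and `fabric.call()` dispatch described above. The `EarlyStopping` class and its `on_validation_end` hook name are hypothetical, hand-rolled for this example, not something shipped with Fabric:

```python
import torch
from lightning.fabric import Fabric


class EarlyStopping:
    """Hypothetical minimal callback: flags when validation loss stops improving."""

    def __init__(self, patience: int = 3):
        self.patience = patience
        self.best_loss = float("inf")
        self.counter = 0
        self.should_stop = False

    # The hook name is arbitrary; it just has to match the fabric.call() below.
    def on_validation_end(self, val_loss: float) -> None:
        if val_loss < self.best_loss:
            self.best_loss = val_loss
            self.counter = 0
        else:
            self.counter += 1
            self.should_stop = self.counter >= self.patience


early_stopping = EarlyStopping(patience=3)
fabric = Fabric(accelerator="cpu", callbacks=[early_stopping])
fabric.launch()

model = torch.nn.Linear(8, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
model, optimizer = fabric.setup(model, optimizer)

# Dummy data just to make the sketch runnable.
x, y = torch.randn(64, 8), torch.randn(64, 1)
x_val, y_val = torch.randn(16, 8), torch.randn(16, 1)

for epoch in range(100):
    # Manual training loop.
    model.train()
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    fabric.backward(loss)
    optimizer.step()

    # Manual validation loop.
    model.eval()
    with torch.no_grad():
        val_loss = torch.nn.functional.mse_loss(model(x_val), y_val).item()

    # Dispatches to every registered callback that defines `on_validation_end`.
    fabric.call("on_validation_end", val_loss=val_loss)

    if early_stopping.should_stop:
        fabric.print(f"Early stopping at epoch {epoch}")
        break
```

A checkpointing callback would follow the same shape, e.g. calling `fabric.save(...)` from its hook whenever the monitored metric improves.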
I will try using callbacks in Fabric, and I hope that the callbacks already implemented for PyTorch Lightning will become easily available in Fabric as well, making Fabric even more useful. Thank you!