I'm using PyTorch Lightning for LoRA fine-tuning of a large model. In addition to the usual (input, label) pair, I want to pass an extra parameter p with each batch so that the LoRA Linear layers can use it for some task-specific logic during the forward pass. How should I proceed?
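For concreteness, here is a minimal sketch of the setup I have in mind (plain PyTorch, with a `training_step` written the way a `LightningModule` would have it; `LoRALinear`, the multiplicative use of `p`, and all shapes are placeholders, not my actual task):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LoRALinear(nn.Module):
    """A frozen base Linear plus a low-rank update; forward accepts an extra tensor p."""
    def __init__(self, in_features, out_features, r=4, alpha=8):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        self.base.weight.requires_grad_(False)   # freeze the pretrained weight
        self.base.bias.requires_grad_(False)
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r

    def forward(self, x, p=None):
        out = self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling
        if p is not None:
            out = out * p  # placeholder for the task-specific use of p
        return out

class LitModel(nn.Module):  # stand-in for pl.LightningModule
    def __init__(self):
        super().__init__()
        self.layer = LoRALinear(8, 8)

    def forward(self, x, p=None):
        return self.layer(x, p=p)

    def training_step(self, batch, batch_idx):
        x, y, p = batch          # the Dataset yields 3-tuples instead of (input, label)
        preds = self(x, p=p)     # p is threaded through to the LoRA layer
        return F.mse_loss(preds, y)

model = LitModel()
batch = (torch.randn(2, 8), torch.randn(2, 8), torch.ones(2, 1))
loss = model.training_step(batch, 0)
print(loss.item())
```

The part I'm unsure about is whether having the `Dataset` return a 3-tuple and unpacking it in `training_step` like this is the idiomatic Lightning way, or whether there is a better mechanism for routing `p` down to specific submodules.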