Hi,
I use a binary mask to mark the parameters whose gradients should be updated. In plain PyTorch, I can write something like:
```python
for name, param in model.named_parameters():
    if param.grad is not None:
        param.grad *= mask[name]
optimizer.step()
```
But I don’t know how to achieve this with the Lightning Trainer, since it calls `optimizer.step()` for me.
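One possible approach (a sketch, not a confirmed answer): factor the masking into a small helper, then call it from a hook that Lightning runs between backward and the optimizer step — `on_before_optimizer_step` in Lightning 2.x. The helper below is plain PyTorch; the Lightning wiring is shown in comments since the hook signature has changed across versions.

```python
import torch

def apply_grad_mask(model: torch.nn.Module, mask: dict) -> None:
    """Multiply each parameter's gradient by its binary mask entry,
    so that masked-out (zero) entries receive no update."""
    for name, param in model.named_parameters():
        if param.grad is not None and name in mask:
            param.grad *= mask[name]

# In Lightning (assumed 2.x API), the masking could live in the
# `on_before_optimizer_step` hook of the LightningModule, which runs
# after backward and before `optimizer.step()`:
#
# class MaskedModule(pl.LightningModule):
#     def on_before_optimizer_step(self, optimizer):
#         apply_grad_mask(self, self.grad_mask)

# Plain-PyTorch demonstration of the helper:
model = torch.nn.Linear(2, 2, bias=False)
model(torch.ones(1, 2)).sum().backward()
mask = {"weight": torch.zeros_like(model.weight)}  # mask out everything
apply_grad_mask(model, mask)
```

With an all-zeros mask as above, `model.weight.grad` ends up all zeros, so a subsequent `optimizer.step()` leaves those parameters unchanged.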