Question about LightningAdamW

Hi,
I ran into a NaN problem and needed to skip the learning-rate scheduler step whenever the AMP grad scaler skips an optimizer step, so I wrote the following code, but it raises an error:
scale = scaler.get_scale()
scaler.update()
skip_lr_sched = (scale != scaler.get_scale())
if not skip_lr_sched:
    lr_sched.step()
AttributeError: 'LightningAdamW' object has no attribute 'get_scale'
I'm wondering how I can solve this problem? It looks like my `scaler` variable is actually the Lightning-wrapped optimizer rather than a GradScaler.
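For clarity, this is the control flow I am trying to achieve. The sketch below uses a stand-in scaler class instead of a real torch.cuda.amp.GradScaler so it runs anywhere; FakeScaler, FakeSched, maybe_step, and found_inf are all made-up names for illustration, not part of any library:

```python
class FakeScaler:
    """Stand-in mimicking GradScaler's bookkeeping: update() backs off
    the scale when the last step saw inf/NaN gradients."""
    def __init__(self, scale=65536.0):
        self._scale = scale
        self.found_inf = False  # hypothetical flag; a real GradScaler tracks this internally

    def get_scale(self):
        return self._scale

    def update(self):
        if self.found_inf:
            self._scale /= 2.0  # step was skipped: reduce the loss scale
            self.found_inf = False


class FakeSched:
    """Stand-in LR scheduler that just counts its steps."""
    def __init__(self):
        self.steps = 0

    def step(self):
        self.steps += 1


def maybe_step(scaler, lr_sched):
    """Advance the LR scheduler only if the scale did not change,
    i.e. the optimizer step was not skipped due to inf/NaN grads."""
    scale_before = scaler.get_scale()
    scaler.update()
    if scale_before == scaler.get_scale():
        lr_sched.step()


scaler, sched = FakeScaler(), FakeSched()
maybe_step(scaler, sched)   # normal step -> scheduler advances
scaler.found_inf = True
maybe_step(scaler, sched)   # overflow -> scheduler step is skipped
print(sched.steps)          # -> 1
```

With a real GradScaler the comparison works the same way, because update() only changes the scale when the preceding step overflowed; the question is how to get hold of the actual GradScaler object inside Lightning.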
Best regards!