Non-trivial case of optimizers: learning rate schedulers for only a subset of them

Is it possible to have multiple optimizers with learning rate schedulers for only a subset of them? If so, what should configure_optimizers return in this case, given that it returns lists rather than dicts?

def configure_optimizers(self):
    optimizer1 = Adam(...)
    optimizer2 = SGD(...)
    scheduler1 = SomeScheduler(optimizer1, ...)
    # only optimizer1 is paired with a scheduler
    return (
        {'optimizer': optimizer1, 'lr_scheduler': scheduler1},
        {'optimizer': optimizer2},
    )
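For context, here is a minimal sketch of the first (dict-based) form written out as a full LightningModule. The module name, layer sizes, learning rates, and the StepLR choice are illustrative rather than part of the question, and depending on the Lightning version, training with more than one optimizer may also require manual optimization:

from torch import nn
from torch.optim import Adam, SGD
from torch.optim.lr_scheduler import StepLR
import pytorch_lightning as pl

class TwoOptimizerModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(32, 16)
        self.decoder = nn.Linear(16, 32)

    def configure_optimizers(self):
        optimizer1 = Adam(self.encoder.parameters(), lr=1e-3)
        optimizer2 = SGD(self.decoder.parameters(), lr=1e-2)
        scheduler1 = StepLR(optimizer1, step_size=10, gamma=0.5)
        # only optimizer1 gets a scheduler; optimizer2 keeps a fixed lr
        return (
            {'optimizer': optimizer1, 'lr_scheduler': scheduler1},
            {'optimizer': optimizer2},
        )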

or, using the two-list form:

def configure_optimizers(self):
    optimizer1 = Adam(...)
    optimizer2 = SGD(...)
    scheduler1 = SomeScheduler(optimizer1, ...)
    # two lists: all optimizers, then only the schedulers that exist
    return [optimizer1, optimizer2], [scheduler1]

Both forms should work, since each scheduler is constructed with a reference to the optimizer whose learning rate it updates.
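That binding can be checked with plain PyTorch, outside Lightning: a scheduler stores the optimizer it was constructed with, and stepping it only changes that optimizer's learning rate. A small stand-alone sketch (the toy parameters and StepLR settings are illustrative):

import torch
from torch.optim import Adam, SGD
from torch.optim.lr_scheduler import StepLR

params1 = [torch.nn.Parameter(torch.zeros(3))]
params2 = [torch.nn.Parameter(torch.zeros(3))]

optimizer1 = Adam(params1, lr=1e-3)
optimizer2 = SGD(params2, lr=1e-2)
scheduler1 = StepLR(optimizer1, step_size=1, gamma=0.1)

assert scheduler1.optimizer is optimizer1  # the scheduler holds a reference to optimizer1

scheduler1.step()  # in real training, optimizer.step() would run first
print(optimizer1.param_groups[0]['lr'])  # 1e-4: decayed by gamma
print(optimizer2.param_groups[0]['lr'])  # 1e-2: unchanged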