Backward twice in one training_step

I have two losses for my model, and I need the gradients of the first loss to compute the second one.
The PyTorch pseudocode looks like this:

optimizer.zero_grad()
hidden_value = model.part1(input)
hidden_value.retain_grad()  # hidden_value is a non-leaf tensor, so .grad is not stored without this
output = model.part2(hidden_value)
loss1 = criterion(output, label)
loss1.backward(create_graph=True)  # create_graph (not just retain_graph) so hidden_value.grad is itself differentiable
loss2 = criterion2(hidden_value.grad, label2)
loss2.backward()  # accumulates d(loss2)/d(params) on top of the loss1 gradients
optimizer.step()
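
Here is a minimal self-contained version of that pattern which I believe runs in plain PyTorch (the Linear layers, MSE criteria, shapes, and random data are placeholders I chose for illustration):

import torch
import torch.nn as nn

part1 = nn.Linear(4, 8)   # placeholder for model.part1
part2 = nn.Linear(8, 2)   # placeholder for model.part2
optimizer = torch.optim.SGD(list(part1.parameters()) + list(part2.parameters()), lr=0.01)

x = torch.randn(3, 4)       # placeholder input
label = torch.randn(3, 2)
label2 = torch.randn(3, 8)  # must match hidden_value's shape

optimizer.zero_grad()
hidden_value = part1(x)
hidden_value.retain_grad()              # keep .grad on a non-leaf tensor
output = part2(hidden_value)
loss1 = nn.functional.mse_loss(output, label)
loss1.backward(create_graph=True)       # makes hidden_value.grad differentiable
loss2 = nn.functional.mse_loss(hidden_value.grad, label2)
loss2.backward()                        # second backward, through the backward graph
optimizer.step()                        # uses gradients accumulated from both losses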

I found an API named manual_backward() that may fit my problem.
However, I built this model in a project based on pytorch_lightning 0.6.0, which doesn't have this API.
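
For reference, this is roughly how I understand the manual-optimization pattern looks in newer Lightning releases (self.automatic_optimization = False together with self.manual_backward; the submodules, losses, and the assumption that extra kwargs are forwarded to Tensor.backward are placeholders on my part):

import torch
import pytorch_lightning as pl

class TwoLossModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.automatic_optimization = False  # take manual control of backward/step
        self.part1 = torch.nn.Linear(4, 8)   # placeholder submodules
        self.part2 = torch.nn.Linear(8, 2)

    def training_step(self, batch, batch_idx):
        x, label, label2 = batch
        opt = self.optimizers()
        opt.zero_grad()
        hidden_value = self.part1(x)
        hidden_value.retain_grad()           # keep .grad on a non-leaf tensor
        output = self.part2(hidden_value)
        loss1 = torch.nn.functional.mse_loss(output, label)
        self.manual_backward(loss1, create_graph=True)  # assuming kwargs reach Tensor.backward
        loss2 = torch.nn.functional.mse_loss(hidden_value.grad, label2)
        self.manual_backward(loss2)
        opt.step()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.01)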
So, my questions are:
1. How can I implement this operation using pytorch_lightning 0.6.0?
2. If I can't implement it in pytorch_lightning 0.6.0, which Lightning version should I choose? (Please recommend a nearby version that will cause the fewest errors after I upgrade.)