lazy_load multiple LoRAs?

I have fine-tuned two LoRA checkpoints, lit_model_lora_finetuned_A.pth and lit_model_lora_finetuned_B.pth…

Here is my code for merging a single LoRA checkpoint into the Llama model:


import sys
import time

# These imports assume the lit-gpt repository layout.
from lit_gpt.lora import merge_lora_weights
from lit_gpt.tokenizer import Tokenizer
from lit_gpt.utils import lazy_load

t0 = time.perf_counter()
checkpoint = lazy_load(checkpoint_path)  # base Llama weights
lora_checkpoint = lazy_load(lora_path)   # LoRA adapter weights
# Overlay the adapter tensors on top of the base checkpoint.
checkpoint.update(lora_checkpoint.get("model", lora_checkpoint))
model.load_state_dict(checkpoint)
fabric.print(
    f"Time to load the model weights: {time.perf_counter() - t0:.02f} seconds.",
    file=sys.stderr,
)

model.eval()
merge_lora_weights(model)  # fold the LoRA deltas into the base weights
model = fabric.setup(model)
tokenizer = Tokenizer(checkpoint_dir)

Is there a way to merge my two LoRA models into one? And what would the impact be on the final results?
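For instance, I could imagine naively overlaying both adapter state dicts onto the base checkpoint before loading, as in the rough sketch below (lora_finetuned_a_path and lora_finetuned_b_path are just placeholder names for the two .pth files above). But I suspect that wherever the two adapters train the same layers, the second update() would simply overwrite the first, which is probably not a real merge:

checkpoint = lazy_load(checkpoint_path)
# Naive idea: overlay both adapters before loading. If both adapters
# touch the same layers, B's tensors overwrite A's here rather than
# being combined.
for lora_path in (lora_finetuned_a_path, lora_finetuned_b_path):
    lora_checkpoint = lazy_load(lora_path)
    checkpoint.update(lora_checkpoint.get("model", lora_checkpoint))
model.load_state_dict(checkpoint)
model.eval()
merge_lora_weights(model)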
Regards,