Save/load model for inference

Ahh, thank you!

And that would then get loaded with `model.load_from_checkpoint()`, with no need to save a separate config file to disk?