PTL’s checkpoints are not exactly the same as what plain PyTorch’s torch.save produces! Therefore, a simple torch.load won’t work out of the box.
You need a few extra steps, and the right approach depends on how the model is defined. If you are passing the model as a constructor argument, you might want to look at the following post.
Or post a link to your Colab notebook and I’ll be happy to help you out, since the information provided isn’t enough to say more.
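For reference, here is a minimal sketch of one common trick. A Lightning `.ckpt` file is a pickled dict whose weights live under the `"state_dict"` key, with each key prefixed by the attribute name you used inside the `LightningModule` (e.g. `self.model = ...` gives keys like `"model.fc.weight"`). The `checkpoint` dict below is a hand-built stand-in for what `torch.load("path/to.ckpt", map_location="cpu")` would return, and the `"model."` prefix and layer names are assumptions for illustration:

```python
# Stand-in for the dict that torch.load() returns for a Lightning
# checkpoint (contents are illustrative, not from a real file):
checkpoint = {
    "epoch": 9,
    "state_dict": {
        "model.fc.weight": [[0.1, 0.2]],  # keys carry the attribute
        "model.fc.bias": [0.0],           # prefix used in the LightningModule
    },
}

def extract_state_dict(checkpoint, prefix="model."):
    """Return the inner state_dict with `prefix` stripped from each key,
    so it can be loaded into the plain nn.Module via load_state_dict()."""
    state = checkpoint["state_dict"]
    return {
        key[len(prefix):]: value
        for key, value in state.items()
        if key.startswith(prefix)
    }

weights = extract_state_dict(checkpoint)
print(sorted(weights))  # ['fc.bias', 'fc.weight']
```

With real files you would then call `plain_model.load_state_dict(extract_state_dict(ckpt))`. Alternatively, if you still have the `LightningModule` class (and saved hyperparameters via `self.save_hyperparameters()`), `MyLitModule.load_from_checkpoint("path/to.ckpt")` rebuilds the model directly.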