Using the PyTorch API, how to save and load models? #130
Replies: 1 comment
Hi @zhuyuhang4 ,

There are multiple options. If you want to implement the saving/loading logic yourself, `torch.save` and `torch.load` (or pretty much any other serialization approach, e.g. pickle, joblib, ...) will do. You should save the model, optimizer, relevant arguments, etc.
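A minimal sketch of that manual approach, assuming a toy model and optimizer (the file name and hyperparameters are placeholders):

```python
import torch
import torch.nn as nn

# Placeholder model/optimizer -- swap in your own.
model = nn.Linear(10, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Save: bundle the state_dicts plus any arguments needed to rebuild the objects.
checkpoint = {
    "model_state_dict": model.state_dict(),
    "optimizer_state_dict": optimizer.state_dict(),
    "lr": 1e-3,  # example of a "relevant argument" worth keeping
}
torch.save(checkpoint, "checkpoint.pt")

# Load: re-create the objects first, then restore their states.
model = nn.Linear(10, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
checkpoint = torch.load("checkpoint.pt")
model.load_state_dict(checkpoint["model_state_dict"])
optimizer.load_state_dict(checkpoint["optimizer_state_dict"])
```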
To make this a bit more convenient, we also implemented loading and saving directly in the `Solver` class. The functions essentially take the `state_dict` of the solver you are running and serialize it.
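Roughly speaking, the pattern looks like the sketch below. The `MySolver` class and the `save_checkpoint`/`load_checkpoint` names are made up for illustration and are not the actual `Solver` API; only the idea of serializing a solver-level `state_dict` comes from the description above.

```python
import torch

# Hypothetical solver-style wrapper; method names are illustrative only.
class MySolver:
    def __init__(self, model, optimizer):
        self.model = model
        self.optimizer = optimizer

    def state_dict(self):
        # Aggregate everything needed to resume training into one dict.
        return {
            "model": self.model.state_dict(),
            "optimizer": self.optimizer.state_dict(),
        }

    def load_state_dict(self, state):
        self.model.load_state_dict(state["model"])
        self.optimizer.load_state_dict(state["optimizer"])

    def save_checkpoint(self, path):
        torch.save(self.state_dict(), path)

    def load_checkpoint(self, path):
        self.load_state_dict(torch.load(path))
```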
If this answers your Q, please mark it as an answer :) Otherwise, happy to help further.