
Using the PyTorch API, how to save and load models? #130

Answered by stes
zhuyuhang4 asked this question in Q&A

Hi @zhuyuhang4,

There are multiple options.

If you want to implement the saving/loading logic yourself, torch.save and torch.load, or pretty much any other serialization approach (e.g. pickle, joblib, ...), will do. You should save the model, the optimizer, relevant arguments, etc.
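For instance, a minimal sketch of a manual checkpoint along these lines (the model, optimizer, arguments, and file name below are placeholders, not tied to any particular project):

```python
import torch
import torch.nn as nn

# Placeholder model and optimizer; substitute your own objects.
model = nn.Linear(10, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Bundle everything needed to restore training into a single dict.
checkpoint = {
    "model_state_dict": model.state_dict(),
    "optimizer_state_dict": optimizer.state_dict(),
    "args": {"lr": 1e-3},  # any relevant arguments / hyperparameters
}
torch.save(checkpoint, "checkpoint.pt")

# Later: rebuild the same objects, then restore their state.
restored = torch.load("checkpoint.pt")
model.load_state_dict(restored["model_state_dict"])
optimizer.load_state_dict(restored["optimizer_state_dict"])
```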

To make this a bit more convenient, we also implemented loading and saving directly in the Solver class.

The functions essentially take the state_dict of the solver you are running and serialize it.
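As a rough sketch of that approach, assuming the solver object exposes state_dict() / load_state_dict() as described above (check the Solver class for the exact method names):

```python
import torch

# `solver` is assumed to be an already-constructed solver instance.
torch.save(solver.state_dict(), "solver_checkpoint.pt")

# Later, with a freshly constructed solver using the same configuration:
solver.load_state_dict(torch.load("solver_checkpoint.pt"))
```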

If this answers your question, please mark it as the answer :) Otherwise, happy to help further.
