On Mon, Jun 27, 2022 at 6:29 AM Petru-Daniel Tudosiu < ***@***.***> wrote:
The correct way of saving DP/DDP checkpoints is to access the `module` attribute of the wrapper class.
Please do that instead of saving the whole DP/DDP class's state dict and then trimming the names.
Thanks for your message. We followed the official PyTorch ImageNet training code, which saves the DP/DDP class's state dict directly. The prefix-trimming approach is commonly adopted in other repos. We will add an annotation to this part. If you have further concerns, please open a pull request.
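For reference, a minimal sketch of the prefix-trimming approach under discussion (the helper name and the example keys are hypothetical; the alternative the issue recommends is simply saving `model.module.state_dict()` so the prefix never appears):

```python
def trim_module_prefix(state_dict):
    """Strip the 'module.' prefix that DataParallel/DDP prepend
    to every parameter name when the wrapper's state dict is saved."""
    prefix = "module."
    return {k[len(prefix):] if k.startswith(prefix) else k: v
            for k, v in state_dict.items()}

# Hypothetical keys as a DP/DDP-wrapped model would emit them:
wrapped = {"module.fc.weight": 0, "module.fc.bias": 1}
print(trim_module_prefix(wrapped))  # {'fc.weight': 0, 'fc.bias': 1}
```

Saving via the wrapped `module` attribute avoids the need for this helper entirely, since the checkpoint keys then match the bare model's parameter names.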