RuntimeError: Trying to backward through the graph a second time when using dataloader #466
Replies: 2 comments
-
So, I fixed it, but now I am even more confused. I changed the loss computation from `loss += loss_fn(y_logits, target)` to:

```python
loss = loss_fn(y_logits, target)
train_loss += loss
```

Can I get a clarification of how this solved the problem? I see no difference: in my code I am adding directly onto `loss`, and in the solution the loss is stored in a variable before being added. How could this be the problem?
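For context, here is a minimal sketch of the two patterns side by side (the model, data, and loop structure are my assumptions for illustration, not the exact code from the notebook):

```python
import torch
from torch import nn

model = nn.Linear(2, 1)
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

X = torch.randn(8, 2)
y = torch.randn(8, 1)

# Broken pattern (sketch): `loss` keeps accumulating onto the same tensor,
# so it stays wired to every previous batch's graph, and the second
# `.backward()` tries to walk a graph whose buffers were already freed.
#
# loss = 0
# for xb, yb in zip(X.split(4), y.split(4)):
#     loss += loss_fn(model(xb), yb)  # graph grows across iterations
#     loss.backward()                 # RuntimeError on iteration 2

# Working pattern: a fresh `loss` (and a fresh graph) every batch,
# accumulating only a plain Python number for logging.
train_loss = 0.0
for xb, yb in zip(X.split(4), y.split(4)):
    loss = loss_fn(model(xb), yb)  # new tensor, new graph, each iteration
    train_loss += loss.item()      # .item() returns a float, detached from the graph
    optimizer.zero_grad()
    loss.backward()                # backward through this batch's graph only
    optimizer.step()
```

(The solution in the notebook accumulates the tensor itself with `train_loss += loss`, which also works since `train_loss` is never backwarded through; using `.item()` additionally avoids holding old graphs in memory.)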
-
Hi @vence-andersen, you've found a great gotcha in PyTorch! While I'm not sure I understand it 100% correctly, here's how I think of it. The key is in the error message:

> Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad().

So what happens here is, when you calculate the loss with:

```python
loss += loss_fn(y_logits, target)
```

And then run:

```python
loss.backward()
```

PyTorch behind the scenes only keeps the intermediate values of the computation graph around for a single backward pass; calling `.backward()` frees them. Because `loss += ...` keeps adding new results onto the same tensor, `loss` stays connected to the graphs of all previous iterations. So when you try to compute gradients again on the next iteration, autograd has to walk back through a graph whose saved values were already freed. Essentially, calling `.backward()` a second time through the same graph raises the error.

This is why your first version errors but your second version passes (even though they seem similar in regular Python): `loss = loss_fn(y_logits, target)` creates a fresh loss tensor, with a fresh graph, every iteration, so each `.backward()` only touches that iteration's graph. For a more technical explanation of this and how it works, I'd refer to the following:
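The failure mode above can be reproduced in a few standalone lines (hypothetical toy tensors, not the notebook's code):

```python
import torch

w = torch.tensor(1.0, requires_grad=True)

loss = w * w          # graph 1: autograd saves intermediates for backward
loss.backward()       # computes w.grad (tensor(2.)) and frees graph 1

# Accumulating onto the same tensor keeps it wired to the freed graph 1.
loss = loss + w * 3

err = None
try:
    loss.backward()   # autograd walks back into graph 1, which was freed
except RuntimeError as e:
    err = e           # "Trying to backward through the graph a second time..."

print(type(err).__name__)
```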
-
I am working on the Chapter 3 exercise. I wrote a model.
To train it:
This threw the error message:
RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.
Then I ran the model in the solutions notebook, and it ran without any errors. Can someone please help me understand what in my code causes this error? Thanks in advance.
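As a side note, the `retain_graph=True` mentioned in the error message does make a second backward pass legal, though in a training loop the usual fix is to build a fresh loss each iteration rather than retaining graphs. A small sketch (toy values, my own example):

```python
import torch

x = torch.tensor(2.0, requires_grad=True)
y = x ** 2

# retain_graph=True keeps the saved intermediates alive,
# so a second backward through the same graph is allowed.
y.backward(retain_graph=True)
first = x.grad.clone()   # d(x^2)/dx at x=2 -> tensor(4.)

x.grad.zero_()
y.backward()             # second pass works; this one frees the graph
```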