1 parent 5517240 · commit 557a588
README.md
@@ -213,7 +213,7 @@ for t in range(500):

     # Use autograd to compute the backward pass. This call will compute the
     # gradient of loss with respect to all Variables with requires_grad=True.
-    # After this call w1.data and w2.data will be Variables holding the gradient
+    # After this call w1.grad and w2.grad will be Variables holding the gradient
     # of the loss with respect to w1 and w2 respectively.
     loss.backward()
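For context, a minimal sketch of the behavior the corrected comment describes: after `loss.backward()`, the gradients land in `w1.grad` and `w2.grad`, not in `w1.data` and `w2.data`. This is not the README's full example; the tensor shapes and the use of plain tensors with `requires_grad=True` (instead of the older `Variable` wrapper) are assumptions for illustration.

```python
import torch

# Illustrative shapes (assumed, not taken from the README's example).
x = torch.randn(64, 1000)
y = torch.randn(64, 10)
w1 = torch.randn(1000, 100, requires_grad=True)
w2 = torch.randn(100, 10, requires_grad=True)

# Forward pass: linear layer, ReLU, linear layer, then a scalar squared-error loss.
y_pred = x.mm(w1).clamp(min=0).mm(w2)
loss = (y_pred - y).pow(2).sum()

# backward() computes d(loss)/dw for every tensor with requires_grad=True.
loss.backward()

# The gradients are stored in .grad, not .data, which is what the commit corrects.
print(w1.grad.shape)  # torch.Size([1000, 100])
print(w2.grad.shape)  # torch.Size([100, 10])
```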