commit c4b1d3d (1 parent: cf17aa8)
autograd/two_layer_net_autograd.py
@@ -57,7 +57,7 @@
  # Use autograd to compute the backward pass. This call will compute the
  # gradient of loss with respect to all Variables with requires_grad=True.
- # After this call w1.data and w2.data will be Variables holding the gradient
+ # After this call w1.grad and w2.grad will be Variables holding the gradient
  # of the loss with respect to w1 and w2 respectively.
  loss.backward()
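
For context, a minimal sketch of the behavior the corrected comment describes (a hypothetical toy example, not code from this repo; it assumes a recent PyTorch, where plain tensors with requires_grad=True play the role the old Variable API did): after loss.backward(), the gradient lands in .grad, while .data still holds the parameter values.

import torch

# One toy parameter tensor tracked by autograd.
w1 = torch.randn(4, 3, requires_grad=True)
loss = (w1 ** 2).sum()

# Compute gradients of loss w.r.t. every tensor with requires_grad=True.
loss.backward()

print(w1.grad)  # gradient of loss w.r.t. w1 (here 2 * w1)
print(w1.data)  # the parameter values themselves, untouched by backward()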