Commit 555db9a

Merge of 2 parents: 0d8adca + 659a73c

3 files changed, 23 insertions(+), 2 deletions(-)

LICENSE

Lines changed: 21 additions & 0 deletions

@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2017 Justin Johnson
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.

README.md

Lines changed: 1 addition & 1 deletion

@@ -209,7 +209,7 @@ for t in range(500):

     # Use autograd to compute the backward pass. This call will compute the
     # gradient of loss with respect to all Variables with requires_grad=True.
-    # After this call w1.data and w2.data will be Variables holding the gradient
+    # After this call w1.grad and w2.grad will be Variables holding the gradient
     # of the loss with respect to w1 and w2 respectively.
     loss.backward()

autograd/two_layer_net_autograd.py

Lines changed: 1 addition & 1 deletion

@@ -53,7 +53,7 @@

     # Use autograd to compute the backward pass. This call will compute the
     # gradient of loss with respect to all Variables with requires_grad=True.
-    # After this call w1.data and w2.data will be Variables holding the gradient
+    # After this call w1.grad and w2.grad will be Variables holding the gradient
     # of the loss with respect to w1 and w2 respectively.
     loss.backward()

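For context, the corrected comment in both files describes the behavior sketched below: calling loss.backward() populates the .grad attributes of w1 and w2, not their .data attributes. This is a minimal sketch with illustrative dimensions mirroring the tutorial's two-layer net; it uses the modern tensor API, where Variable has been merged into Tensor, so plain tensors with requires_grad=True behave as the Variables in the original code.

import torch

# Toy two-layer setup: random data, weights that track gradients.
x = torch.randn(64, 1000)
y = torch.randn(64, 10)
w1 = torch.randn(1000, 100, requires_grad=True)
w2 = torch.randn(100, 10, requires_grad=True)

# Forward pass through the two-layer net, then a scalar loss.
y_pred = x.mm(w1).clamp(min=0).mm(w2)
loss = (y_pred - y).pow(2).sum()

# Before backward(), the gradient fields are unpopulated.
print(w1.grad)  # None

# backward() fills w1.grad and w2.grad (not w1.data / w2.data)
# with d(loss)/d(w1) and d(loss)/d(w2).
loss.backward()
print(w1.grad.shape)  # torch.Size([1000, 100])
print(w2.grad.shape)  # torch.Size([100, 10])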
