Commit 0003fcd

Merge pull request jcjohnson#35 from Aman-1412/master
Teeny-tiny Typo
2 parents 73a662b + 29e58eb commit 0003fcd

File tree

1 file changed: +1 addition, -1 deletion

README.md

Lines changed: 1 addition & 1 deletion
@@ -536,7 +536,7 @@ loss_fn = torch.nn.MSELoss(reduction='sum')
 
 # Use the optim package to define an Optimizer that will update the weights of
 # the model for us. Here we will use Adam; the optim package contains many other
-# optimization algoriths. The first argument to the Adam constructor tells the
+# optimization algorithms. The first argument to the Adam constructor tells the
 # optimizer which Tensors it should update.
 learning_rate = 1e-4
 optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)
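For context on the comment being corrected: the README excerpt explains that the first argument to the Adam constructor tells the optimizer which Tensors it should update. The sketch below is not part of this commit; it shows how an optimizer built this way is typically driven in a training loop. The model, x, and y tensors are placeholder assumptions, not taken verbatim from the README.

# Minimal sketch, not part of this commit: typical use of an optimizer defined
# as in the README excerpt above. The data and model below are placeholders
# assumed for illustration.
import torch

x = torch.randn(64, 1000)   # placeholder inputs
y = torch.randn(64, 10)     # placeholder targets
model = torch.nn.Sequential(
    torch.nn.Linear(1000, 100),
    torch.nn.ReLU(),
    torch.nn.Linear(100, 10),
)
loss_fn = torch.nn.MSELoss(reduction='sum')

# The first argument to the Adam constructor tells the optimizer which Tensors
# it should update.
learning_rate = 1e-4
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)

for t in range(500):
    y_pred = model(x)          # forward pass
    loss = loss_fn(y_pred, y)  # compute scalar loss
    optimizer.zero_grad()      # clear gradients accumulated from the previous step
    loss.backward()            # backprop: gradients of the loss w.r.t. model parameters
    optimizer.step()           # let Adam update the parameters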

0 commit comments
