Commit 0003fcd (2 parents: 73a662b + 29e58eb)
README.md
@@ -536,7 +536,7 @@ loss_fn = torch.nn.MSELoss(reduction='sum')
 
 # Use the optim package to define an Optimizer that will update the weights of
 # the model for us. Here we will use Adam; the optim package contains many other
-# optimization algoriths. The first argument to the Adam constructor tells the
+# optimization algorithms. The first argument to the Adam constructor tells the
 # optimizer which Tensors it should update.
 learning_rate = 1e-4
 optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)
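
For context, the patched README lines sit inside a training-loop example that relies on a model and loss defined earlier in the file. The sketch below shows how an optimizer created this way is typically driven; the model architecture, tensor shapes, and iteration count here are assumptions for illustration, not taken from this commit.

```python
import torch

# Assumed setup (hypothetical shapes; the commit only touches the comment text).
model = torch.nn.Sequential(
    torch.nn.Linear(1000, 100),
    torch.nn.ReLU(),
    torch.nn.Linear(100, 10),
)
loss_fn = torch.nn.MSELoss(reduction='sum')
x = torch.randn(64, 1000)
y = torch.randn(64, 10)

learning_rate = 1e-4
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)

for t in range(500):
    y_pred = model(x)           # forward pass
    loss = loss_fn(y_pred, y)   # scalar loss
    optimizer.zero_grad()       # clear gradients accumulated in the previous step
    loss.backward()             # backpropagate to populate parameter gradients
    optimizer.step()            # Adam update of the Tensors passed to the constructor
```

The first argument to the Adam constructor (here `model.parameters()`) is what the corrected comment refers to: it tells the optimizer which Tensors to update on each `step()` call.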