
Commit b99a203

Merge pull request #1086 from tberends/main
Additional explanation for the training loop
2 parents 8974543 + 81a7b92

File tree

2 files changed: +2 -2 lines changed


01_pytorch_workflow.ipynb

Lines changed: 1 addition & 1 deletion
@@ -880,7 +880,7 @@
 ">\n",
 "> And on the ordering of things, the above is a good default order but you may see slightly different orders. Some rules of thumb: \n",
 "> * Calculate the loss (`loss = ...`) *before* performing backpropagation on it (`loss.backward()`).\n",
-"> * Zero gradients (`optimizer.zero_grad()`) *before* stepping them (`optimizer.step()`).\n",
+"> * Zero gradients (`optimizer.zero_grad()`) *before* computing the gradients of the loss with respect to every model parameter (`loss.backward()`).\n",
 "> * Step the optimizer (`optimizer.step()`) *after* performing backpropagation on the loss (`loss.backward()`).\n",
 "\n",
 "For resources to help understand what's happening behind the scenes with backpropagation and gradient descent, see the extra-curriculum section.\n"

docs/01_pytorch_workflow.ipynb

Lines changed: 1 addition & 1 deletion
@@ -881,7 +881,7 @@
 ">\n",
 "> And on the ordering of things, the above is a good default order but you may see slightly different orders. Some rules of thumb: \n",
 "> * Calculate the loss (`loss = ...`) *before* performing backpropagation on it (`loss.backward()`).\n",
-"> * Zero gradients (`optimizer.zero_grad()`) *before* stepping them (`optimizer.step()`).\n",
+"> * Zero gradients (`optimizer.zero_grad()`) *before* computing the gradients of the loss with respect to every model parameter (`loss.backward()`).\n",
 "> * Step the optimizer (`optimizer.step()`) *after* performing backpropagation on the loss (`loss.backward()`).\n",
 "\n",
 "For resources to help understand what's happening behind the scenes with backpropagation and gradient descent, see the extra-curriculum section.\n"
