Commit 3c56af4

adj, verb, typos
1 parent 65ce646 commit 3c56af4

File tree

1 file changed (+4, −4 lines)

07_pytorch_experiment_tracking.ipynb

Lines changed: 4 additions & 4 deletions
@@ -276,7 +276,7 @@
 "\n",
 "Let's create a function to \"set the seeds\" called `set_seeds()`.\n",
 "\n",
-"> **Note:** Recall a [random seed](https://en.wikipedia.org/wiki/Random_seed) is a way of flavouring the randomness generated by a computer. They aren't necessary to always set when running machine learning code, however, they help ensure there's an element of reproducibility (the numbers I get with my code are similar to the numbers you get with your code). Outside of an education or experimental setting, random seeds generally aren't required."
+"> **Note:** Recalling a [random seed](https://en.wikipedia.org/wiki/Random_seed) is a way of flavouring the randomness generated by a computer. They aren't necessary to always set when running machine learning code, however, they help ensure there's an element of reproducibility (the numbers I get with my code are similar to the numbers you get with your code). Outside of an educational or experimental setting, random seeds generally aren't required."
 ]
 },
 {
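For context, the `set_seeds()` helper this hunk refers to isn't shown in the diff; based on the surrounding prose it can be sketched roughly as follows (the exact notebook implementation may differ):

```python
import torch

def set_seeds(seed: int = 42):
    """Sets random seeds for torch operations (a sketch of the
    set_seeds() helper described in the notebook text)."""
    # Seed general torch operations
    torch.manual_seed(seed)
    # Seed CUDA operations (safe to call even without a GPU)
    torch.cuda.manual_seed(seed)
```

Calling `set_seeds(42)` before any random operation (weight initialisation, data shuffling) makes two runs of the same code produce the same numbers.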
@@ -313,7 +313,7 @@
 "\n",
 "So how about we run some experiments and try to further improve our results?\n",
 "\n",
-"To do so, we'll use similar code to the previous section to download the [`pizza_steak_sushi.zip`](https://github.com/mrdbourke/pytorch-deep-learning/blob/main/data/pizza_steak_sushi.zip) (if the data doesn't already exist) except this time its been functionised.\n",
+"To do so, we'll use similar code to the previous section to download the [`pizza_steak_sushi.zip`](https://github.com/mrdbourke/pytorch-deep-learning/blob/main/data/pizza_steak_sushi.zip) (if the data doesn't already exist) except this time it's been functionalised.\n",
 "\n",
 "This will allow us to use it again later. "
 ]
@@ -421,7 +421,7 @@
 "\n",
 "And since we'll be using transfer learning and specifically pretrained models from [`torchvision.models`](https://pytorch.org/vision/stable/models.html), we'll create a transform to prepare our images correctly.\n",
 "\n",
-"To transform our images in tensors, we can use:\n",
+"To transform our images into tensors, we can use:\n",
 "1. Manually created transforms using `torchvision.transforms`.\n",
 "2. Automatically created transforms using `torchvision.models.MODEL_NAME.MODEL_WEIGHTS.DEFAULT.transforms()`.\n",
 " * Where `MODEL_NAME` is a specific `torchvision.models` architecture, `MODEL_WEIGHTS` is a specific set of pretrained weights and `DEFAULT` means the \"best available weights\".\n",
@@ -959,7 +959,7 @@
 "source": [
 "> **Note:** You might notice the results here are slightly different to what our model got in 06. PyTorch Transfer Learning. The difference comes from using the `engine.train()` and our modified `train()` function. Can you guess why? The [PyTorch documentation on randomness](https://pytorch.org/docs/stable/notes/randomness.html) may help more.\n",
 "\n",
-"Running the cell above we get similar outputs we got in [06. PyTorch Transfer Learning section 4: Train model](https://www.learnpytorch.io/06_pytorch_transfer_learning/#4-train-model) but the difference is behind the scenes our `writer` instance has created a `runs/` directory storing our model's results.\n",
+"Running the cell above we get similar outputs we got in [06. PyTorch Transfer Learning section 4: Train model](https://www.learnpytorch.io/06_pytorch_transfer_learning/#4-train-model) but the difference is that behind the scenes our `writer` instance has created a `runs/` directory storing our model's results.\n",
 "\n",
 "For example, the save location might look like:\n",
 "\n",
