|
276 | 276 | "\n",
|
277 | 277 | "Let's create a function to \"set the seeds\" called `set_seeds()`.\n",
|
278 | 278 | "\n",
|
279 |
| - "> **Note:** Recall a [random seed](https://en.wikipedia.org/wiki/Random_seed) is a way of flavouring the randomness generated by a computer. They aren't necessary to always set when running machine learning code, however, they help ensure there's an element of reproducibility (the numbers I get with my code are similar to the numbers you get with your code). Outside of an education or experimental setting, random seeds generally aren't required." |
| 279 | + "> **Note:** Recalling a [random seed](https://en.wikipedia.org/wiki/Random_seed) is a way of flavouring the randomness generated by a computer. They aren't necessary to always set when running machine learning code, however, they help ensure there's an element of reproducibility (the numbers I get with my code are similar to the numbers you get with your code). Outside of an educational or experimental setting, random seeds generally aren't required." |
280 | 280 | ]
|
281 | 281 | },
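Here's a minimal sketch of what such a `set_seeds()` function might look like (the exact body may differ from the notebook's; `torch.manual_seed()` and `torch.cuda.manual_seed()` are the standard PyTorch seeding calls):

```python
import torch

def set_seeds(seed: int = 42):
    """Set random seeds for torch operations (a sketch covering CPU and CUDA RNGs)."""
    # Seed general torch operations (CPU)
    torch.manual_seed(seed)
    # Seed CUDA operations (silently ignored if no GPU is present)
    torch.cuda.manual_seed(seed)
```

Calling `set_seeds()` before each experiment helps keep runs comparable.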
|
282 | 282 | {
|
|
313 | 313 | "\n",
|
314 | 314 | "So how about we run some experiments and try to further improve our results?\n",
|
315 | 315 | "\n",
|
316 |
| - "To do so, we'll use similar code to the previous section to download the [`pizza_steak_sushi.zip`](https://github.com/mrdbourke/pytorch-deep-learning/blob/main/data/pizza_steak_sushi.zip) (if the data doesn't already exist) except this time its been functionised.\n", |
| 316 | + "To do so, we'll use similar code to the previous section to download the [`pizza_steak_sushi.zip`](https://github.com/mrdbourke/pytorch-deep-learning/blob/main/data/pizza_steak_sushi.zip) (if the data doesn't already exist) except this time it's been functionalised.\n", |
317 | 317 | "\n",
|
318 | 318 | "This will allow us to use it again later. "
|
319 | 319 | ]
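As a hedged sketch of what the functionalised download could look like (the name `download_data` and its parameters are illustrative assumptions, not necessarily the exact helper defined later):

```python
import zipfile
from pathlib import Path

import requests

def download_data(source: str, destination: str) -> Path:
    """Download and unzip `source` into data/destination if it doesn't already exist (sketch)."""
    data_path = Path("data/")
    image_path = data_path / destination

    if image_path.is_dir():
        print(f"[INFO] {image_path} already exists, skipping download.")
    else:
        image_path.mkdir(parents=True, exist_ok=True)
        target_file = Path(source).name
        # Download the zip file
        with open(data_path / target_file, "wb") as f:
            f.write(requests.get(source).content)
        # Unzip it into the destination folder
        with zipfile.ZipFile(data_path / target_file, "r") as zip_ref:
            zip_ref.extractall(image_path)
    return image_path

# Example usage (the raw GitHub URL is assumed from the repository link above)
image_path = download_data(
    source="https://github.com/mrdbourke/pytorch-deep-learning/raw/main/data/pizza_steak_sushi.zip",
    destination="pizza_steak_sushi")
```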
|
|
421 | 421 | "\n",
|
422 | 422 | "And since we'll be using transfer learning and specifically pretrained models from [`torchvision.models`](https://pytorch.org/vision/stable/models.html), we'll create a transform to prepare our images correctly.\n",
|
423 | 423 | "\n",
|
424 |
| - "To transform our images in tensors, we can use:\n", |
| 424 | + "To transform our images into tensors, we can use:\n", |
425 | 425 | "1. Manually created transforms using `torchvision.transforms`.\n",
|
426 | 426 | "2. Automatically created transforms using `torchvision.models.MODEL_NAME.MODEL_WEIGHTS.DEFAULT.transforms()`.\n",
|
427 | 427 | " * Where `MODEL_NAME` is a specific `torchvision.models` architecture, `MODEL_WEIGHTS` is a specific set of pretrained weights and `DEFAULT` means the \"best available weights\".\n",
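Here's a sketch of both options (the architecture `EfficientNet_B0_Weights` is an illustrative choice, and the manual normalisation values assume ImageNet-pretrained models):

```python
import torchvision
from torchvision import transforms

# 1. Manually created transforms (normalisation values assume ImageNet pretraining)
manual_transforms = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# 2. Automatically created transforms from a set of pretrained weights
weights = torchvision.models.EfficientNet_B0_Weights.DEFAULT  # DEFAULT = best available weights
auto_transforms = weights.transforms()
print(auto_transforms)
```

The automatic option guarantees the preprocessing matches what the model saw during pretraining, at the cost of less flexibility.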
|
|
959 | 959 | "source": [
|
960 | 960 | "> **Note:** You might notice the results here are slightly different to what our model got in 06. PyTorch Transfer Learning. The difference comes from using the `engine.train()` and our modified `train()` function. Can you guess why? The [PyTorch documentation on randomness](https://pytorch.org/docs/stable/notes/randomness.html) may help more.\n",
|
961 | 961 | "\n",
|
962 |
| - "Running the cell above we get similar outputs we got in [06. PyTorch Transfer Learning section 4: Train model](https://www.learnpytorch.io/06_pytorch_transfer_learning/#4-train-model) but the difference is behind the scenes our `writer` instance has created a `runs/` directory storing our model's results.\n", |
| 962 | + "Running the cell above we get similar outputs we got in [06. PyTorch Transfer Learning section 4: Train model](https://www.learnpytorch.io/06_pytorch_transfer_learning/#4-train-model) but the difference is that behind the scenes our `writer` instance has created a `runs/` directory storing our model's results.\n", |
963 | 963 | "\n",
|
964 | 964 | "For example, the save location might look like:\n",
|
965 | 965 | "\n",
|
|