
Commit bc304a1

Merge pull request #1054 from CarterT27/main
Fix typos and tensorboard installation
2 parents e32e544 + 554233b

3 files changed: 17 additions & 5 deletions


07_pytorch_experiment_tracking.ipynb

Lines changed: 8 additions & 2 deletions
@@ -745,7 +745,13 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"from torch.utils.tensorboard import SummaryWriter\n",
+"try:\n",
+" from torch.utils.tensorboard import SummaryWriter\n",
+"except:\n",
+" print(\"[INFO] Couldn't find tensorboard... installing it.\")\n",
+" !pip install -q tensorboard\n",
+" from torch.utils.tensorboard import SummaryWriter\n",
+"\n",
 "\n",
 "# Create a writer with all default settings\n",
 "writer = SummaryWriter()"
@@ -2298,7 +2304,7 @@
 "source": [
 "Running the cell above we should get an output similar to the following.\n",
 "\n",
-"> **Note:** Depending on the random seeds you used/hardware you used there's a chance your numbers aren't exactly the same as what's here. This is okay. It's due to the inheret randomness of deep learning. What matters most is the trend. Where your numbers are heading. If they're off by a large amount, perhaps there's something wrong and best to go back and check the code. But if they're off by a small amount (say a couple of decimal places or so), that's okay. \n",
+"> **Note:** Depending on the random seeds you used/hardware you used there's a chance your numbers aren't exactly the same as what's here. This is okay. It's due to the inherent randomness of deep learning. What matters most is the trend. Where your numbers are heading. If they're off by a large amount, perhaps there's something wrong and best to go back and check the code. But if they're off by a small amount (say a couple of decimal places or so), that's okay. \n",
 "\n",
 "<img src=\"https://raw.githubusercontent.com/mrdbourke/pytorch-deep-learning/main/images/07-tensorboard-lowest-test-loss.png\" alt=\"various modelling experiments visualized on tensorboard with model that has the lowest test loss highlighted\" width=900/>\n",
 "\n",

docs/02_pytorch_classification.ipynb

Lines changed: 1 addition & 1 deletion
@@ -2391,7 +2391,7 @@
 "\n",
 "PyTorch has a bunch of [ready-made non-linear activation functions](https://pytorch.org/docs/stable/nn.html#non-linear-activations-weighted-sum-nonlinearity) that do similiar but different things. \n",
 "\n",
-"One of the most common and best performing is [ReLU](https://en.wikipedia.org/wiki/Rectifier_(neural_networks) (rectified linear-unit, [`torch.nn.ReLU()`](https://pytorch.org/docs/stable/generated/torch.nn.ReLU.html)).\n",
+"One of the most common and best performing is [ReLU](https://en.wikipedia.org/wiki/Rectifier_(neural_networks)) (rectified linear-unit, [`torch.nn.ReLU()`](https://pytorch.org/docs/stable/generated/torch.nn.ReLU.html)).\n",
 "\n",
 "Rather than talk about it, let's put it in our neural network between the hidden layers in the forward pass and see what happens."
 ]
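The change here only closes a broken Markdown link, but as a minimal sketch of what the surrounding notebook text describes, torch.nn.ReLU() slots between linear layers roughly like this (the layer sizes below are illustrative, not taken from the notebook):

import torch
from torch import nn

# A small model with ReLU non-linearities between the hidden linear layers
# (in/out feature sizes here are made up for illustration)
model = nn.Sequential(
    nn.Linear(in_features=2, out_features=10),
    nn.ReLU(),  # applies max(0, x) element-wise
    nn.Linear(in_features=10, out_features=10),
    nn.ReLU(),
    nn.Linear(in_features=10, out_features=1),
)

# Forward pass on a dummy batch of 8 samples with 2 features each
x = torch.randn(8, 2)
print(model(x).shape)  # torch.Size([8, 1])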

docs/07_pytorch_experiment_tracking.ipynb

Lines changed: 8 additions & 2 deletions
@@ -725,7 +725,13 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"from torch.utils.tensorboard import SummaryWriter\n",
+"try:\n",
+" from torch.utils.tensorboard import SummaryWriter\n",
+"except:\n",
+" print(\"[INFO] Couldn't find tensorboard... installing it.\")\n",
+" !pip install -q tensorboard\n",
+" from torch.utils.tensorboard import SummaryWriter\n",
+"\n",
 "\n",
 "# Create a writer with all default settings\n",
 "writer = SummaryWriter()"
@@ -2254,7 +2260,7 @@
 "source": [
 "Running the cell above we should get an output similar to the following.\n",
 "\n",
-"> **Note:** Depending on the random seeds you used/hardware you used there's a chance your numbers aren't exactly the same as what's here. This is okay. It's due to the inheret randomness of deep learning. What matters most is the trend. Where your numbers are heading. If they're off by a large amount, perhaps there's something wrong and best to go back and check the code. But if they're off by a small amount (say a couple of decimal places or so), that's okay. \n",
+"> **Note:** Depending on the random seeds you used/hardware you used there's a chance your numbers aren't exactly the same as what's here. This is okay. It's due to the inherent randomness of deep learning. What matters most is the trend. Where your numbers are heading. If they're off by a large amount, perhaps there's something wrong and best to go back and check the code. But if they're off by a small amount (say a couple of decimal places or so), that's okay. \n",
 "\n",
 "<img src=\"https://raw.githubusercontent.com/mrdbourke/pytorch-deep-learning/main/images/07-tensorboard-lowest-test-loss.png\" alt=\"various modelling experiments visualized on tensorboard with model that has the lowest test loss highlighted\" width=900/>\n",
 "\n",
