Commit 38e6e3e

Merge pull request #1068 from pritesh2000/gram-1/04
04_pytorch_custom_datasets.ipynb
2 parents 91eb2e0 + 545a460 commit 38e6e3e

File tree

1 file changed: +12 -12 lines changed


04_pytorch_custom_datasets.ipynb

Lines changed: 12 additions & 12 deletions
@@ -166,7 +166,7 @@
 "source": [
 "## 1. Get data\n",
 "\n",
-"First thing's first we need some data.\n",
+"First things first we need some data.\n",
 "\n",
 "And like any good cooking show, some data has already been prepared for us.\n",
 "\n",
@@ -270,7 +270,7 @@
 "\n",
 "In our case, we have images of pizza, steak and sushi in standard image classification format.\n",
 "\n",
-"Image classification format contains separate classes of images in seperate directories titled with a particular class name.\n",
+"Image classification format contains separate classes of images in separate directories titled with a particular class name.\n",
 "\n",
 "For example, all images of `pizza` are contained in the `pizza/` directory.\n",
 "\n",
@@ -973,7 +973,7 @@
 "\n",
 "We'll do so using [`torch.utils.data.DataLoader`](https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader).\n",
 "\n",
-"Turning our `Dataset`'s into `DataLoader`'s makes them iterable so a model can go through learn the relationships between samples and targets (features and labels).\n",
+"Turning our `Dataset`'s into `DataLoader`'s makes them iterable so a model can go through and learn the relationships between samples and targets (features and labels).\n",
 "\n",
 "To keep things simple, we'll use a `batch_size=1` and `num_workers=1`.\n",
 "\n",
@@ -1759,7 +1759,7 @@
 "source": [
 "They sure do!\n",
 "\n",
-"Let's now take a lot at some other forms of data transforms."
+"Let's now take a look at some other forms of data transforms."
 ]
 },
 {
@@ -1778,7 +1778,7 @@
 "\n",
 "Or cropping it or randomly erasing a portion or randomly rotating them.\n",
 "\n",
-"Doing this kinds of transforms is often referred to as **data augmentation**.\n",
+"Doing these kinds of transforms is often referred to as **data augmentation**.\n",
 "\n",
 "**Data augmentation** is the process of altering your data in such a way that you *artificially* increase the diversity of your training set.\n",
 "\n",
@@ -2090,7 +2090,7 @@
 " self.classifier = nn.Sequential(\n",
 " nn.Flatten(),\n",
 " # Where did this in_features shape come from? \n",
-" # It's because each layer of our network compresses and changes the shape of our inputs data.\n",
+" # It's because each layer of our network compresses and changes the shape of our input data.\n",
 " nn.Linear(in_features=hidden_units*16*16,\n",
 " out_features=output_shape)\n",
 " )\n",
@@ -2361,7 +2361,7 @@
 " # 5. Optimizer step\n",
 " optimizer.step()\n",
 "\n",
-" # Calculate and accumulate accuracy metric across all batches\n",
+" # Calculate and accumulate accuracy metrics across all batches\n",
 " y_pred_class = torch.argmax(torch.softmax(y_pred, dim=1), dim=1)\n",
 " train_acc += (y_pred_class == y).sum().item()/len(y_pred)\n",
 "\n",
@@ -2522,7 +2522,7 @@
 "\n",
 "To keep our experiments quick, we'll train our model for **5 epochs** (though you could increase this if you want).\n",
 "\n",
-"As for an **optimizer** and **loss function**, we'll use `torch.nn.CrossEntropyLoss()` (since we're working with multi-class classification data) and `torch.optim.Adam()` with a learning rate of `1e-3` respecitvely.\n",
+"As for an **optimizer** and **loss function**, we'll use `torch.nn.CrossEntropyLoss()` (since we're working with multi-class classification data) and `torch.optim.Adam()` with a learning rate of `1e-3` respectively.\n",
 "\n",
 "To see how long things take, we'll import Python's [`timeit.default_timer()`](https://docs.python.org/3/library/timeit.html#timeit.default_timer) method to calculate the training time."
 ]
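The setup this hunk describes, sketched out (model_0 is assumed to be the model built earlier in the notebook):

import torch
from timeit import default_timer as timer

loss_fn = torch.nn.CrossEntropyLoss()                               # multi-class loss
optimizer = torch.optim.Adam(params=model_0.parameters(), lr=1e-3)  # Adam at 1e-3

start_time = timer()
# ... run the training loop for 5 epochs here ...
end_time = timer()
print(f"Total training time: {end_time - start_time:.3f} seconds")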
@@ -2772,7 +2772,7 @@
 "source": [
 "### 8.1 How to deal with overfitting\n",
 "\n",
-"Since the main problem with overfitting is that you're model is fitting the training data *too well*, you'll want to use techniques to \"reign it in\".\n",
+"Since the main problem with overfitting is that your model is fitting the training data *too well*, you'll want to use techniques to \"reign it in\".\n",
 "\n",
 "A common technique of preventing overfitting is known as [**regularization**](https://ml-cheatsheet.readthedocs.io/en/latest/regularization.html).\n",
 "\n",
@@ -2830,7 +2830,7 @@
 "\n",
 "And preventing overfitting and underfitting is possibly the most active area of machine learning research.\n",
 "\n",
-"Since everone wants their models to fit better (less underfitting) but not so good they don't generalize well and perform in the real world (less overfitting).\n",
+"Since everyone wants their models to fit better (less underfitting) but not so good they don't generalize well and perform in the real world (less overfitting).\n",
 "\n",
 "There's a fine line between overfitting and underfitting.\n",
 "\n",
@@ -3180,7 +3180,7 @@
 "\n",
 "Even though our models our performing quite poorly, we can still write code to compare them.\n",
 "\n",
-"Let's first turn our model results in pandas DataFrames."
+"Let's first turn our model results into pandas DataFrames."
 ]
 },
 {
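The DataFrame conversion this hunk refers to is roughly the following (a minimal sketch; model_0_results and model_1_results are assumed to be the per-epoch metric dictionaries returned by the notebook's training function):

import pandas as pd

# Each results dict maps metric names to per-epoch lists, e.g.
# {"train_loss": [...], "train_acc": [...], "test_loss": [...], "test_acc": [...]}.
model_0_df = pd.DataFrame(model_0_results)
model_1_df = pd.DataFrame(model_1_results)
print(model_0_df.head())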
@@ -3358,7 +3358,7 @@
 "source": [
 "## 11. Make a prediction on a custom image\n",
 "\n",
-"If you've trained a model on a certain dataset, chances are you'd like to make a prediction on on your own custom data.\n",
+"If you've trained a model on a certain dataset, chances are you'd like to make a prediction on your own custom data.\n",
 "\n",
 "In our case, since we've trained a model on pizza, steak and sushi images, how could we use our model to make a prediction on one of our own images?\n",
 "\n",
