Commit 45a24e1

intro-to-pytorch: part 3, links fix
Parent: cb04615

2 files changed: +6 −6 lines

intro-to-pytorch/Part 3 - Training Neural Networks (Exercises).ipynb (+3 −3)
@@ -64,7 +64,7 @@
     "\n",
     "Let's start by seeing how we calculate the loss with PyTorch. Through the `nn` module, PyTorch provides losses such as the cross-entropy loss (`nn.CrossEntropyLoss`). You'll usually see the loss assigned to `criterion`. As noted in the last part, with a classification problem such as MNIST, we're using the softmax function to predict class probabilities. With a softmax output, you want to use cross-entropy as the loss. To actually calculate the loss, you first define the criterion then pass in the output of your network and the correct labels.\n",
     "\n",
-    "Something really important to note here. Looking at [the documentation for `nn.CrossEntropyLoss`](https://pytorch.org/docs/stable/nn.html#torch.nn.CrossEntropyLoss),\n",
+    "Something really important to note here. Looking at [the documentation for `nn.CrossEntropyLoss`](https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html#torch.nn.CrossEntropyLoss),\n",
     "\n",
     "> This criterion combines `nn.LogSoftmax()` and `nn.NLLLoss()` in one single class.\n",
     ">\n",
@@ -505,7 +505,7 @@
    ],
    "metadata": {
     "kernelspec": {
-     "display_name": "Python 3",
+     "display_name": "Python 3 (ipykernel)",
      "language": "python",
      "name": "python3"
     },
@@ -519,7 +519,7 @@
     "name": "python",
     "nbconvert_exporter": "python",
     "pygments_lexer": "ipython3",
-    "version": "3.7.1"
+    "version": "3.8.10"
    }
   },
   "nbformat": 4,

intro-to-pytorch/Part 3 - Training Neural Networks (Solution).ipynb (+3 −3)
@@ -64,7 +64,7 @@
     "\n",
     "Let's start by seeing how we calculate the loss with PyTorch. Through the `nn` module, PyTorch provides losses such as the cross-entropy loss (`nn.CrossEntropyLoss`). You'll usually see the loss assigned to `criterion`. As noted in the last part, with a classification problem such as MNIST, we're using the softmax function to predict class probabilities. With a softmax output, you want to use cross-entropy as the loss. To actually calculate the loss, you first define the criterion then pass in the output of your network and the correct labels.\n",
     "\n",
-    "Something really important to note here. Looking at [the documentation for `nn.CrossEntropyLoss`](https://pytorch.org/docs/stable/nn.html#torch.nn.CrossEntropyLoss),\n",
+    "Something really important to note here. Looking at [the documentation for `nn.CrossEntropyLoss`](https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html#torch.nn.CrossEntropyLoss),\n",
     "\n",
     "> This criterion combines `nn.LogSoftmax()` and `nn.NLLLoss()` in one single class.\n",
     ">\n",
@@ -658,7 +658,7 @@
    ],
    "metadata": {
     "kernelspec": {
-     "display_name": "Python 3",
+     "display_name": "Python 3 (ipykernel)",
      "language": "python",
      "name": "python3"
     },
@@ -672,7 +672,7 @@
     "name": "python",
     "nbconvert_exporter": "python",
     "pygments_lexer": "ipython3",
-    "version": "3.7.1"
+    "version": "3.8.10"
    }
   },
   "nbformat": 4,
