Commit 6a585e0: Add appendix D

1 parent 8c1871f

4 files changed: +1236, -16 lines changed

4 files changed

+1236
-16
lines changed

README.md: +15, -16
@@ -9,7 +9,7 @@ This repository contains the code for coding, pretraining, and finetuning a GPT-
 
 <a href="http://mng.bz/orYv"><img src="images/cover.jpg" width="250px"></a>
 
-In [*Build a Large Language Model (from Scratch)*](http://mng.bz/orYv), you'll discover how LLMs work from the inside out. In this book, I'll guide you step by step through creating your own LLM, explaining each stage with clear text, diagrams, and examples.
+In [*Build a Large Language Model (From Scratch)*](http://mng.bz/orYv), you'll discover how LLMs work from the inside out. In this book, I'll guide you step by step through creating your own LLM, explaining each stage with clear text, diagrams, and examples.
 
 The method described in this book for training and developing your own small-but-functional model for educational purposes mirrors the approach used in creating large-scale foundational models such as those behind ChatGPT.
 
@@ -31,21 +31,20 @@ Alternatively, you can view this and other files on GitHub at [https://github.co
 <br>
 <br>
 
-| Chapter Title | Main Code (for quick access) | All Code + Supplementary |
-|------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------|-------------------------------|
-| Ch 1: Understanding Large Language Models | No code | No code |
-| Ch 2: Working with Text Data | - [ch02.ipynb](ch02/01_main-chapter-code/ch02.ipynb)<br/>- [dataloader.ipynb](ch02/01_main-chapter-code/dataloader.ipynb) (summary)<br/>- [exercise-solutions.ipynb](ch02/01_main-chapter-code/exercise-solutions.ipynb) | [./ch02](./ch02) |
-| Ch 3: Coding Attention Mechanisms | - [ch03.ipynb](ch03/01_main-chapter-code/ch03.ipynb)<br/>- [multihead-attention.ipynb](ch03/01_main-chapter-code/multihead-attention.ipynb) (summary) <br/>- [exercise-solutions.ipynb](ch03/01_main-chapter-code/exercise-solutions.ipynb)| [./ch03](./ch03) |
-| Ch 4: Implementing a GPT Model from Scratch | - [ch04.ipynb](ch04/01_main-chapter-code/ch04.ipynb)<br/>- [gpt.py](ch04/01_main-chapter-code/gpt.py) (summary)<br/>- [exercise-solutions.ipynb](ch04/01_main-chapter-code/exercise-solutions.ipynb) | [./ch04](./ch04) |
-| Ch 5: Pretraining on Unlabeled Data | Q1 2024 | ... |
-| Ch 6: Finetuning for Text Classification | Q2 2024 | ... |
-| Ch 7: Finetuning with Human Feedback | Q2 2024 | ... |
-| Ch 8: Using Large Language Models in Practice | Q2/3 2024 | ... |
-| Appendix A: Introduction to PyTorch | - [code-part1.ipynb](appendix-A/03_main-chapter-code/code-part1.ipynb)<br/>- [code-part2.ipynb](appendix-A/03_main-chapter-code/code-part2.ipynb)<br/>- [DDP-script.py](appendix-A/03_main-chapter-code/DDP-script.py)<br/>- [exercise-solutions.ipynb](appendix-A/03_main-chapter-code/exercise-solutions.ipynb) | [./appendix-A](./appendix-A) |
-| Appendix B: References and Further Reading | No code | |
-| Appendix C: Exercises | No code | |
-
+| Chapter Title | Main Code (for quick access) | All Code + Supplementary |
+|------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------|-------------------------------|
+| Ch 1: Understanding Large Language Models | No code | - |
+| Ch 2: Working with Text Data | - [ch02.ipynb](ch02/01_main-chapter-code/ch02.ipynb)<br/>- [dataloader.ipynb](ch02/01_main-chapter-code/dataloader.ipynb) (summary)<br/>- [exercise-solutions.ipynb](ch02/01_main-chapter-code/exercise-solutions.ipynb) | [./ch02](./ch02) |
+| Ch 3: Coding Attention Mechanisms | - [ch03.ipynb](ch03/01_main-chapter-code/ch03.ipynb)<br/>- [multihead-attention.ipynb](ch03/01_main-chapter-code/multihead-attention.ipynb) (summary) <br/>- [exercise-solutions.ipynb](ch03/01_main-chapter-code/exercise-solutions.ipynb)| [./ch03](./ch03) |
+| Ch 4: Implementing a GPT Model from Scratch | - [ch04.ipynb](ch04/01_main-chapter-code/ch04.ipynb)<br/>- [gpt.py](ch04/01_main-chapter-code/gpt.py) (summary)<br/>- [exercise-solutions.ipynb](ch04/01_main-chapter-code/exercise-solutions.ipynb) | [./ch04](./ch04) |
+| Ch 5: Pretraining on Unlabeled Data | Q1 2024 | ... |
+| Ch 6: Finetuning for Text Classification | Q2 2024 | ... |
+| Ch 7: Finetuning with Human Feedback | Q2 2024 | ... |
+| Ch 8: Using Large Language Models in Practice | Q2/3 2024 | ... |
+| Appendix A: Introduction to PyTorch | - [code-part1.ipynb](appendix-A/03_main-chapter-code/code-part1.ipynb)<br/>- [code-part2.ipynb](appendix-A/03_main-chapter-code/code-part2.ipynb)<br/>- [DDP-script.py](appendix-A/03_main-chapter-code/DDP-script.py)<br/>- [exercise-solutions.ipynb](appendix-A/03_main-chapter-code/exercise-solutions.ipynb) | [./appendix-A](./appendix-A) |
+| Appendix B: References and Further Reading | No code | - |
+| Appendix C: Exercises | No code | - |
+| Appendix D: Adding Bells and Whistles to the Training Loop | - [appendix-D.ipynb](appendix-D/01_main-chapter-code/appendix-D.ipynb) | [./appendix-D](./appendix-D) |
 <br>
 
 > [!TIP]
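
The new Appendix D concerns refinements to a model's training loop. As a rough illustration of the kind of "bells and whistles" that phrase commonly refers to, the sketch below implements a linear-warmup-plus-cosine-decay learning-rate schedule and global-norm gradient clipping in plain Python. This is an assumption about the appendix's scope, not code taken from it, and all names (`lr_at_step`, `clip_grad_norm`) and default values are hypothetical:

```python
import math

def lr_at_step(step, max_lr=5e-4, min_lr=1e-5, warmup_steps=20, total_steps=200):
    # Hypothetical schedule: linear warmup from ~0 to max_lr over
    # warmup_steps, then cosine decay from max_lr down to min_lr.
    if step < warmup_steps:
        return max_lr * (step + 1) / warmup_steps
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return min_lr + 0.5 * (max_lr - min_lr) * (1 + math.cos(math.pi * progress))

def clip_grad_norm(grads, max_norm=1.0):
    # Rescale a flat list of gradient values so their global L2 norm
    # does not exceed max_norm (a no-op when already within the limit).
    total = math.sqrt(sum(g * g for g in grads))
    if total > max_norm:
        grads = [g * max_norm / total for g in grads]
    return grads
```

In a real loop these would be applied once per optimizer step: set the learning rate from `lr_at_step(step)`, then clip the gradients before updating the weights. Framework-native equivalents (e.g. a PyTorch scheduler plus built-in norm clipping) would normally replace these hand-rolled helpers.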
