4 changes: 2 additions & 2 deletions neural_nets_as_distribution_transformers.qmd
@@ -68,7 +68,7 @@ Now we wish to train this net to classify between two data distributions. The go
 ![The goal of a neural net classifier is to rearrange the input data distribution to match the target label distribution. (left) Input dataset with two classes in red and blue. (right) Target output (one-hot codes).](figures/neural_nets_as_data_transformations/goal_of_classifier.png){#fig-neural_nets_as_data_transformations-goal_of_classifier width=100%}
 
 
-In this example, the target output places all the red dots at $(0,1)$ and all the blue dots at $(1,0)$. These are the coordinates of one-hot codes for our two classes. Training the network consists of find the series of geometric transformations that rearrange the input distribution to this target output distribution.
+In this example, the target output places all the red dots at $(0,1)$ and all the blue dots at $(1,0)$. These are the coordinates of one-hot codes for our two classes. Training the network consists of finding the series of geometric transformations that rearrange the input distribution to this target output distribution.
 
 We will visualize how the net transforms the training dataset, layer by layer, at four **checkpoints** over the course of training.

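The edited paragraph describes training as finding transformations that carry each class to its one-hot corner. A minimal numpy sketch of that idea, assuming a toy two-blob dataset, an illustrative 2→8→2 ReLU net, and plain gradient descent on squared error (none of which are the chapter's actual setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: "red" points clustered near (-1, -1), "blue" points near (1, 1)
X = np.vstack([rng.normal(-1.0, 0.4, (100, 2)),
               rng.normal( 1.0, 0.4, (100, 2))])
# One-hot targets: red -> (0, 1), blue -> (1, 0)
Y = np.vstack([np.tile([0.0, 1.0], (100, 1)),
               np.tile([1.0, 0.0], (100, 1))])

# A 2 -> 8 -> 2 MLP: each layer is an affine map followed by a ReLU fold
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 2)); b2 = np.zeros(2)

def forward(X):
    H = np.maximum(X @ W1 + b1, 0.0)   # hidden representation
    return H, H @ W2 + b2              # output coordinates

lr = 0.1
for _ in range(2000):                  # gradient descent on mean squared error
    H, out = forward(X)
    G = 2.0 * (out - Y) / len(X)       # dLoss/d(out)
    Gh = (G @ W2.T) * (H > 0)          # backprop through the ReLU
    W2 -= lr * (H.T @ G);  b2 -= lr * G.sum(0)
    W1 -= lr * (X.T @ Gh); b1 -= lr * Gh.sum(0)

_, out = forward(X)
pred = out.argmax(1)                   # which one-hot corner each point landed near
acc = ((pred[:100] == 1).mean() + (pred[100:] == 0).mean()) / 2
```

After training, the output coordinates of the red points sit near $(0,1)$ and the blue points near $(1,0)$: the learned layers have rearranged the input distribution into the target one-hot distribution.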
@@ -96,4 +96,4 @@ Notice that on the first layer, semantic classes are not well separated but by t
 
 ## Concluding Remarks
 
-Layer by layer, deep nets transform data from its raw format to ever more abstracted and useful representations. It can be helpful to think about this process as a set of geometric transformations of a data distribution, or as a kind of disentangling where initially messy data gets reorganized so that different data classes become cleanly separated.
+Layer by layer, deep nets transform data from its raw format to ever more abstracted and useful representations. It can be helpful to think about this process as a set of geometric transformations of a data distribution, or as a kind of disentangling where initially messy data gets reorganized so that different data classes become cleanly separated.