Commit 11ca756

Update README
1 parent 44733ab commit 11ca756

README.md

Lines changed: 108 additions & 12 deletions

Hasktorch Compose is an experimental library built on top of [hasktorch-skeleton](https://github.com/hasktorch/hasktorch-skeleton).

**Features:**
- [x] Sequential composition of layers (using HList)
- [x] Extracting individual layers
- [x] Adding/Dropping/Replacing layers in a sequence
- [x] Inspecting layer output shapes
- [x] Concatenate layer (`:++:`)
- [x] Fanout layer (`://:`)
- [x] Fanin layer (`:+:`)
- [x] Shortcut layer
- [x] Replicate layer
- [ ] Test for each layer
- [ ] Overlay layer

# Examples

## Sequential Composition

Use the `.*.` operator of HList to join layer specifications.
This example shows how to create a simple Multi-Layer Perceptron (MLP) by combining `LinearSpec` layers with `ReluSpec` activations.

```haskell
newtype MLPSpec = MLPSpec (HList [LinearSpec, ReluSpec, LinearSpec, ReluSpec, LinearSpec, LogSoftMaxSpec]) deriving (Generic, Show, Eq)
-- ... (remaining definitions elided in the diff)

mlp :: MLP -> Tensor -> Tensor
mlp = forward
```
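
The definition of `mlpSpec` itself is elided above. For orientation, it might be assembled with the `.*.` operator as in the sketch below; the layer sizes (784→64→32→10, MNIST-style) and the `LinearSpec` argument order are illustrative assumptions, not taken from the library.

```haskell
-- Hypothetical spec value; sizes and field order are illustrative only.
mlpSpec :: MLPSpec
mlpSpec = MLPSpec $
      LinearSpec 784 64  -- assumed (input features, output features)
  .*. ReluSpec
  .*. LinearSpec 64 32
  .*. ReluSpec
  .*. LinearSpec 32 10   -- e.g. 10 output classes
  .*. LogSoftMaxSpec
  .*. HNil
```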

## Extracting Layers

You can extract the first or last layer of a composed model with `getFirstLayer` and `getLastLayer`, respectively. You can also drop the first or last layer with `dropFirstLayer` and `dropLastLayer`.

```haskell
-- Sample an MLP and unwrap its HList of layers.
(MLP model) <- sample mlpSpec
let firstLayer = getFirstLayer model -- the first Linear layer
let lastLayer = getLastLayer model -- the LogSoftMax layer
let modelWithoutLast = dropLastLayer model
```
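
An extracted layer is itself a model, so it can be run on its own. A small sketch, reusing `firstLayer` from above and assuming it accepts a 784-wide input as in the earlier example:

```haskell
-- Run the extracted first Linear layer directly on a dummy batch.
let hidden = forward firstLayer (ones' [2, 784])
```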

## Modifying Layers

Layers can be added to a sequence with `addLastLayer`; combined with `dropLastLayer`, this also lets you replace layers, as sketched after the example.

```haskell
(MLP model) <- sample mlpSpec
let modelWithoutLast = dropLastLayer model
let lastLayer = getLastLayer model
let modifiedModel = addLastLayer modelWithoutLast lastLayer -- add the last layer back
```
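
Replacing a layer follows the same drop-then-add pattern. A sketch, assuming a freshly sampled `LogSoftMaxSpec` stands in for the new layer (any shape-compatible layer would do):

```haskell
-- Swap the final layer for a newly sampled one (illustrative choice of spec).
newLast <- sample LogSoftMaxSpec
let replacedModel = addLastLayer (dropLastLayer model) newLast
```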

## Inspecting Output Shapes

The `toOutputShapes` function returns the shape of each layer's output for a given input. This is useful for debugging and for understanding how data flows through a model.

```haskell
(MLP model) <- sample mlpSpec
let input = ones' [2, 784]
let outputShapes = toOutputShapes model input
-- outputShapes is an HList containing the shape of each layer's output.
```

## Concatenate Layer

The `Concat` type (written infix as `:++:`) combines two models that operate on different inputs: the composed model takes a pair of inputs and returns a pair of outputs.

```haskell
-- Assume 'm0' and 'm1' are models, and 'a0' and 'a1' are their respective inputs.
let concatenatedModel = Concat m0 m1
let (b0, b1) = forward concatenatedModel (a0, a1)
```
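
Concretely, a sketch with two independently sampled `Linear` layers (the sizes are illustrative):

```haskell
-- Two models over inputs of different widths, run side by side.
m0 <- sample (LinearSpec 784 64)
m1 <- sample (LinearSpec 100 64)
let concatModel = Concat m0 m1
let (b0, b1) = forward concatModel (ones' [2, 784], ones' [2, 100])
```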

## Fanout Layer

The `Fanout` type (written infix as `://:`) applies two models to the same input and returns both outputs as a pair.

```haskell
-- Assume 'm0' and 'm1' are models, and 'a' is the shared input.
let fanoutModel = Fanout m0 m1
let (b0, b1) = forward fanoutModel a
```
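
For example, a two-headed model over a shared input (sizes are again illustrative):

```haskell
-- Two heads read the same 784-wide input and produce different widths.
headA <- sample (LinearSpec 784 10)
headB <- sample (LinearSpec 784 1)
let twoHeaded = Fanout headA headB
let (logits, score) = forward twoHeaded (ones' [2, 784])
```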

## Fanin Layer

The `Fanin` type (written infix as `:+:`) runs two models on a pair of inputs and combines their outputs by element-wise addition.

```haskell
-- Assume 'm0' and 'm1' are models, and 'a' and 'b' are their respective inputs.
let faninModel = Fanin m0 m1
let c = forward faninModel (a, b)
```
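
Because the outputs are summed element-wise, both models must produce the same output shape; a sketch with assumed sizes:

```haskell
-- Two branches with different input widths but a common 10-wide output.
branch0 <- sample (LinearSpec 784 10)
branch1 <- sample (LinearSpec 100 10)
let summed = Fanin branch0 branch1
let c = forward summed (ones' [2, 784], ones' [2, 100])
```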

## Shortcut Layer

The `Shortcut` layer implements a residual connection: it adds the input to the output of the wrapped model.

```haskell
-- Assume 'model' is a model whose output has the same shape as its input.
let shortcutModel = Shortcut model
let output = forward shortcutModel input
-- output == forward model input + input
```

## Replicate Layer

The `Replicate` layer replicates a given model `n` times and applies the copies sequentially.

```haskell
-- Assume 'model' is a model whose output type matches its input type.
let replicatedModel = Replicate n model
let output = forward replicatedModel input
-- output == forward model (forward model (... (forward model input) ...))
-- where 'forward model' is applied n times
```
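
Combinators are ordinary values, so they nest. As a sketch (assuming `block` is a model that preserves its input shape, which both `Shortcut` and `Replicate` require here), a stack of residual blocks could be written:

```haskell
-- Three residual blocks applied in sequence.
let residualStack = Replicate 3 (Shortcut block)
let y = forward residualStack x
```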

## Merging Parameters

The `mergeParameters` function combines the parameters of two models with a user-supplied binary operation. In the example below, only the last layers are merged: every tensor in the second model's last layer is first replaced by a ones tensor of the same shape, so merging with `(+)` increments each parameter of the first model's last layer by one.

```haskell
m0 <- sample mlpSpec
m1' <- sample mlpSpec
let layer0 = getLastLayer m0
-- Replace every Tensor in the last layer of m1' with ones of the same shape.
let ones1 = over (types @Tensor) (ones' . shape) $ getLastLayer m1'
let model' = addLastLayer (dropLastLayer m0) (mergeParameters (+) layer0 ones1)
```
