The proposed model only consists of **sparsely connected layers** without any fully connected layers.
## Training

For training, this implementation fixes the random seed to `12321` for reproducibility.

The experimental conditions are the same as in the paper, except for `data augmentation` and the `learning rate`.
The `learning rate` is initialized to `1e-3` and decayed by a factor of 10 **after 26 epochs**.
You can see the details in `src/model/_base.py` and `experiments/config/AConvNet-SOC.json`.
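The step decay described above can be sketched as a simple schedule function (a minimal illustration; `step_lr` and its parameter names are hypothetical, and the actual logic lives in `src/model/_base.py` and the JSON config):

```python
def step_lr(epoch, base_lr=1e-3, decay_at=26, gamma=0.1):
    # Keep base_lr for the first `decay_at` epochs, then multiply by gamma,
    # matching the schedule described above (1e-3 -> 1e-4 after epoch 26).
    return base_lr if epoch <= decay_at else base_lr * gamma

lrs = [step_lr(e) for e in range(1, 31)]  # per-epoch learning rates
```

In PyTorch the same schedule is commonly expressed with `torch.optim.lr_scheduler.StepLR(optimizer, step_size=26, gamma=0.1)`.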
### Data Augmentation

- The author uses random shifting to extract 88 x 88 patches from a 128 x 128 SAR image chip.
  - The number of training images per SAR image chip can be increased up to (128 - 88 + 1) x (128 - 88 + 1) = 1681.
- However, for SOC, this repository does not use random shifting due to an accuracy issue.
  - You can see the details in `src/data/generate_dataset.py` and `src/data/mstar.py`.
  - This implementation failed to achieve higher than 98% accuracy when using random shifting.
  - The implementation details for data augmentation are as follows:
    - Crop the 94 x 94 center of each 128 x 128 SAR image chip.
    - Extract 88 x 88 patches with stride 1 from the 94 x 94 image (49 patches per image chip).
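The crop-and-stride procedure above can be sketched with NumPy (an illustrative snippet, not the repository's actual code; `extract_patches` is a hypothetical name):

```python
import numpy as np

def extract_patches(chip, crop=94, patch=88):
    # Center-crop a `crop` x `crop` region from the chip, then slide a
    # `patch` x `patch` window over it with stride 1.
    h, w = chip.shape
    top, left = (h - crop) // 2, (w - crop) // 2
    center = chip[top:top + crop, left:left + crop]
    n = crop - patch + 1  # 94 - 88 + 1 = 7 window positions per axis
    return np.stack([center[i:i + patch, j:j + patch]
                     for i in range(n) for j in range(n)])

patches = extract_patches(np.zeros((128, 128)))
print(patches.shape)  # (49, 88, 88) -> 7 x 7 = 49 patches per chip
```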
## Experiments

You can download the MSTAR Dataset from [MSTAR Overview](https://www.sdms.afrl.af.mil/index.php?collection=mstar).

### Standard Operating Condition (SOC)

- MSTAR Target Chips (T72 BMP2 BTR70 SLICY), provided as **MSTAR-PublicTargetChips-T72-BMP2-BTR70-SLICY.zip**
- MSTAR / IU Mixed Targets, which consists of **MSTAR-PublicMixedTargets-CD1.zip** and **MSTAR-PublicMixedTargets-CD2.zip**
MSTAR-PublicMixedTargets-CD1/MSTAR_PUBLIC_MIXED_TARGETS_CD1
```

#### Results of SOC

- You can see the details in `notebook/experiments-SOC.ipynb`.

- Visualization of training loss and test accuracy

  ![soc-training-plot](./assets/figure/soc-training-plot.png)

- Confusion Matrix with the best model at **epoch 28**

  ![soc-confusion-matrix](./assets/figure/soc-confusion-matrix.png)
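A confusion matrix like the one above can be tabulated directly from the true and predicted labels (a generic sketch, not the notebook's code):

```python
import numpy as np

def confusion_matrix(y_true, y_pred, num_classes):
    # Rows are true classes, columns are predicted classes;
    # cm[i, j] counts samples of class i that were predicted as class j.
    cm = np.zeros((num_classes, num_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

# Toy example with 3 classes; the diagonal holds the correct predictions.
cm = confusion_matrix([0, 0, 1, 2], [0, 1, 1, 2], num_classes=3)
print(cm.trace() / cm.sum())  # overall accuracy: 3 / 4 = 0.75
```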
### Extended Operating Conditions (EOC)
### Outlier Rejection
}
```

---

## TODO

- [ ] Implementation
  - [ ] Data generation
    - [x] SOC
    - [ ] EOC
    - [ ] Outlier Rejection
    - [ ] End-to-End SAR-ATR
  - [ ] Data Loader
    - [x] SOC
    - [ ] EOC
    - [ ] Outlier Rejection
    - [ ] End-to-End SAR-ATR
  - [ ] Model
    - [x] Network
    - [x] Training
    - [x] Early Stopping
    - [x] Hyper-parameter Optimization
- [ ] Experiments
  - [x] Reproduce the SOC Results
  - [ ] Reproduce the EOC Results
  - [ ] Reproduce the outlier rejection
  - [ ] Reproduce the end-to-end SAR-ATR