
Commit c68dfb9

Update bayesian segnet
1 parent ac256f1 commit c68dfb9

1 file changed: +10 −4 lines changed


paper_notes/bayesian_segnet.md

@@ -2,16 +2,22 @@
 
 _June 2019_
 
-tl;dr: Extension of SegNet and output uncertainty map alongside the segmentation map.
+tl;dr: Estimate the variance of the segmentation with dropout inference samples, and use the mean for prediction. The idea is quite similar to TTA (test-time augmentation).
 
 #### Overall impression
-Describe the overall impression of the paper.
+The paper provides a practical way to estimate the (epistemic) uncertainty, at the cost of extra inference time. Refer to [Bayesian DL](uncertainty_bdl.md) for the integration with aleatoric uncertainty.
 
 #### Key ideas
-- Summaries of the key ideas
+- Sampling with dropout performs better than weight averaging (the normal dropout behavior during eval) with as few as 6 samples; performance saturates at around 40 samples.
+- This comes at an inference-time cost, but the sampling is trivially parallelizable.
+- The results also show that when the model predicts an incorrect label, the model uncertainty is very high.
+- Class boundaries usually display a high level of uncertainty.
+- Objects that are occluded or at a distance from the camera are uncertain.
+- The uncertainty score is inversely related to class frequency and accuracy: the model is more confident about classes that are easier and occur more often.
+- Accuracy improves when a tighter threshold is used to filter out non-confident predictions; uncertainty is an effective proxy for accuracy.
 
 #### Technical details
-- Summary of technical details
+- **No need to use a dropout layer after every layer.** Find the optimal architecture first by testing dropout in different places, then keep dropout active during inference (variational inference).
 
 #### Notes
 - [poster](https://alexgkendall.com/media/presentations/bmvc17_bayesian_segnet_poster.pdf)
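
The MC-dropout recipe described in the note (keep dropout active at test time, use the mean of the sampled outputs as the prediction, and their variance as the uncertainty estimate) can be sketched roughly as below. The toy linear model, names, and sample counts here are illustrative assumptions, not the paper's actual SegNet implementation:

```python
import random
import statistics

def forward(x, weights, p_drop=0.5):
    """One stochastic forward pass of a toy linear scorer with
    (inverted) dropout applied to the input units.
    (Hypothetical toy model, standing in for a SegNet-style network.)"""
    mask = [0.0 if random.random() < p_drop else 1.0 / (1 - p_drop) for _ in x]
    dropped = [xi * mi for xi, mi in zip(x, mask)]
    return [sum(di * wi for di, wi in zip(dropped, row)) for row in weights]

def mc_dropout_predict(x, weights, n_samples=40):
    """Monte Carlo dropout: keep dropout active at test time, take the
    mean of the samples as the prediction and their variance as a
    per-class epistemic-uncertainty estimate."""
    samples = [forward(x, weights) for _ in range(n_samples)]
    mean = [statistics.fmean(col) for col in zip(*samples)]
    var = [statistics.pvariance(col) for col in zip(*samples)]
    return mean, var

random.seed(0)
x = [0.2, 1.0, -0.5]           # one "pixel" feature vector (made up)
weights = [[0.8, -0.1, 0.3],   # toy 2-class linear head (made up)
           [-0.4, 0.9, 0.2]]
mean, var = mc_dropout_predict(x, weights)
pred = max(range(len(mean)), key=mean.__getitem__)  # argmax class
confident = all(v < 1.0 for v in var)  # simple uncertainty threshold
```

In a real network the `forward` call would be a full model with its dropout layers left in training mode, and the per-pixel variance map would be thresholded to filter out non-confident predictions, as the paper's accuracy-vs-confidence results suggest.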
