You're right, they do come before the ReLUs in the conv layers. There are normally no batchnorms in the classifier, but since the paper says they replaced dropout with BN, I must have find-and-replaced all the dropout layers with batchnorm, roughly like the sketch below.
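A minimal sketch of the ordering being discussed, assuming a VGG-style network in PyTorch; the channel sizes and classifier dimensions are illustrative, not the exact ones from this repo:

```python
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # BatchNorm sits between the convolution and the ReLU,
    # i.e. the pre-activation values are normalized before the nonlinearity.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

# The standard VGG classifier uses Dropout here; the variant under discussion
# replaces it with BatchNorm1d instead (sizes are hypothetical).
classifier = nn.Sequential(
    nn.Linear(512 * 7 * 7, 4096),
    nn.BatchNorm1d(4096),
    nn.ReLU(inplace=True),
    nn.Linear(4096, 4096),
    nn.BatchNorm1d(4096),
    nn.ReLU(inplace=True),
    nn.Linear(4096, 1000),
)
```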
I came across this Reddit thread and now I'm not sure where BatchNorm belongs anymore.
https://www.youtube.com/watch?v=tNIpEZLv_eg&t=328s
Watch this video from Andrew Ng. Around the third minute he talks about this and mentions that normalizing the values before applying the activation function is much more common.