Yes, you are correct, the second fully connected layer is not used. I can't recall whether that was done on purpose, but I'd guess we simply overlooked this part.
Do you have the capacity to make a pull request fixing this issue? If not, we could do it but it might not happen very soon.
In theory I'm happy to make a pull request, but although I think I've fixed this part, the model still doesn't work for me. I'll submit a pull request if/when I get it working.
The implementation of the PseTae MultiHeadAttention appears to have a mistake.
The original implementation applies two fully connected layers to the query tensor (see here: https://github.com/VSainteuf/pytorch-psetae/blob/master/models/tae.py#L133).
However, in the eo-flow implementation, while both fully connected layers are defined, the second (defined here: https://github.com/sentinel-hub/eo-flow/blob/master/eoflow/models/pse_tae_layers.py#L50) is not applied where it would be expected: https://github.com/sentinel-hub/eo-flow/blob/master/eoflow/models/pse_tae_layers.py#L66. Indeed, it is not used anywhere in the code.
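For illustration, here is a minimal sketch (not the eo-flow code itself) of what applying both fully connected layers to the query would look like in TensorFlow, mirroring the original pytorch-psetae behaviour. The layer names (fc1_q, fc2_q), dimensions, and the transform_query helper are assumptions made for this example.

```python
import tensorflow as tf

d_k = 8      # per-head key/query dimension (assumed)
n_head = 4   # number of attention heads (assumed)

# Two fully connected layers for the query, as in the original implementation.
fc1_q = tf.keras.layers.Dense(n_head * d_k)
fc2_q = tf.keras.layers.Dense(n_head * d_k)

def transform_query(q):
    """Apply both FC layers to the query tensor."""
    q = fc1_q(q)   # first projection
    q = fc2_q(q)   # second projection -- the step currently skipped in eo-flow
    return q

# Dummy usage: a batch of 2 query vectors with 16 features each.
dummy_q = tf.random.normal((2, 16))
print(transform_query(dummy_q).shape)  # -> (2, 32)
```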