Incorrectly implemented MultiHeadAttention in PseTae #40

@sbgeophd

Description

The implementation of the PseTae MultiHeadAttention appears to have a mistake.

The original implementation applies 2 fully connected layers to the query tensor (see here: https://github.com/VSainteuf/pytorch-psetae/blob/master/models/tae.py#L133).

However, in the eo-flow implementation, while both fully connected layers are defined, the 2nd one (defined here: https://github.com/sentinel-hub/eo-flow/blob/master/eoflow/models/pse_tae_layers.py#L50) is never applied where it would be expected: https://github.com/sentinel-hub/eo-flow/blob/master/eoflow/models/pse_tae_layers.py#L66 (indeed, it is not used anywhere in the code).
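To illustrate the reported difference, here is a minimal NumPy sketch (the layer/weight names are hypothetical, and the dense layers stand in for the Keras fully connected layers in eo-flow — this is a simplification of the actual attention code, not a copy of it):

```python
import numpy as np

def dense(x, w, b):
    # A plain fully connected layer: x @ w + b (stands in for a Keras Dense layer)
    return x @ w + b

rng = np.random.default_rng(0)
d = 4
q = rng.normal(size=(2, d))                    # batch of query vectors
w1, b1 = rng.normal(size=(d, d)), np.zeros(d)  # weights of the 1st FC layer
w2, b2 = rng.normal(size=(d, d)), np.zeros(d)  # weights of the 2nd FC layer

# Behaviour reported for eo-flow: only the first FC layer is applied to the query
q_single = dense(q, w1, b1)

# Behaviour of the original pytorch-psetae: both FC layers applied in sequence
q_double = dense(dense(q, w1, b1), w2, b2)

print(q_single.shape, q_double.shape)  # both (2, 4), but the values differ
```

The shapes match either way, which is presumably why the missing second layer goes unnoticed at runtime: the model still trains, it just has one fewer query transformation than the reference implementation.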
