MultiHeadAttention
- class mmocr.models.common.MultiHeadAttention(n_head=8, d_model=512, d_k=64, d_v=64, dropout=0.1, qkv_bias=False)[source]
Multi-Head Attention module.
- Parameters
n_head (int) – The number of parallel attention heads (default=8).
d_model (int) – The number of expected features in the decoder inputs (default=512).
d_k (int) – The number of features per head in the key; keys are projected to n_head * d_k features in total (default=64).
d_v (int) – The number of features per head in the value; values are projected to n_head * d_v features in total (default=64).
dropout (float) – Dropout probability applied to the attention weights (default=0.1).
qkv_bias (bool) – Whether to add a bias term in the query, key, and value projection layers. Default: False.
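A minimal usage sketch (not part of the original page), assuming the conventional (batch, sequence_length, d_model) input layout; with the defaults, the 8 heads of 64 features each concatenate back to d_model=512:

```python
import torch

from mmocr.models.common import MultiHeadAttention

# Defaults from the signature above; 8 heads * 64 features = 512 = d_model.
attn = MultiHeadAttention(n_head=8, d_model=512, d_k=64, d_v=64,
                          dropout=0.1, qkv_bias=False)

# Assumed (batch, seq_len, d_model) layout; the query sequence length
# may differ from the key/value sequence length.
q = torch.rand(2, 10, 512)
k = torch.rand(2, 20, 512)
v = torch.rand(2, 20, 512)

out = attn(q, k, v)  # expected shape: (2, 10, 512)
```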
- forward(q, k, v, mask=None)[source]
Defines the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
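To make the note concrete, here is a short sketch (the hook is a hypothetical example using the standard torch.nn.Module API): calling the module instance runs any registered hooks, while calling forward() directly silently skips them:

```python
import torch

from mmocr.models.common import MultiHeadAttention

attn = MultiHeadAttention()

def log_shape(module, inputs, output):
    # Standard nn.Module forward hook; fires only when the
    # module instance itself is called.
    print('attention output shape:', tuple(output.shape))

attn.register_forward_hook(log_shape)

x = torch.rand(1, 5, 512)
_ = attn(x, x, x)          # runs log_shape
_ = attn.forward(x, x, x)  # bypasses registered hooks
```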