noether.modeling.modules.attention.anchor_attention.self_anchor¶
Classes¶
| `SelfAnchorAttention` | Anchor attention within branches: each configured branch attends to its own anchors independently. |
Module Contents¶
- class noether.modeling.modules.attention.anchor_attention.self_anchor.SelfAnchorAttention(config)¶
Bases: noether.modeling.modules.attention.anchor_attention.multi_branch.MultiBranchAnchorAttention
Anchor attention within branches: each configured branch attends to its own anchors independently.
For a list of branches (e.g., A, B, C), this creates a pattern where A tokens attend to A_anchors, B tokens attend to B_anchors, and C tokens attend to C_anchors. It requires all configured branches and their anchors to be present in the input.
Example: surface tokens attend to surface_anchors and volume tokens attend to volume_anchors. This is achieved via the following attention patterns:
AttentionPattern(query_tokens=["surface_anchors", "surface_queries"], key_value_tokens=["surface_anchors"])
AttentionPattern(query_tokens=["volume_anchors", "volume_queries"], key_value_tokens=["volume_anchors"])
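As an illustrative sketch (not the library's implementation), the per-branch patterns above can be generated for any list of branches. The `AttentionPattern` dataclass below is a stand-in that only mirrors the `query_tokens`/`key_value_tokens` fields shown in the docstring, and the helper name `self_anchor_patterns` is hypothetical.

```python
# Minimal sketch: build one self-anchor attention pattern per branch.
# AttentionPattern here is a stand-in, not noether's actual class.
from dataclasses import dataclass


@dataclass
class AttentionPattern:
    query_tokens: list[str]
    key_value_tokens: list[str]


def self_anchor_patterns(branches: list[str]) -> list[AttentionPattern]:
    """Each branch's anchors and queries attend only to that branch's own anchors."""
    return [
        AttentionPattern(
            query_tokens=[f"{branch}_anchors", f"{branch}_queries"],
            key_value_tokens=[f"{branch}_anchors"],
        )
        for branch in branches
    ]


# For branches ["surface", "volume"] this reproduces the two patterns listed above.
for pattern in self_anchor_patterns(["surface", "volume"]):
    print(pattern)
```

For the two-branch example above, the sketch yields exactly the surface and volume patterns shown, and it generalizes to any number of configured branches.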
Initialize internal Module state, shared by both nn.Module and ScriptModule.
- Parameters:
config (noether.core.schemas.modules.attention.AttentionConfig)