noether.modeling.modules.attention.anchor_attention.joint
=========================================================

.. py:module:: noether.modeling.modules.attention.anchor_attention.joint


Classes
-------

.. autoapisummary::

   noether.modeling.modules.attention.anchor_attention.joint.JointAnchorAttention


Module Contents
---------------

.. py:class:: JointAnchorAttention(config)

   Bases: :py:obj:`noether.modeling.modules.attention.anchor_attention.multi_branch.MultiBranchAnchorAttention`

   Anchor attention within and across branches: all tokens attend to the anchors
   of all configured branches.

   For a list of branches (e.g., A, B, C), this creates a pattern in which all
   tokens (A_anchors, A_queries, B_anchors, B_queries, C_anchors, C_queries)
   attend to (A_anchors + B_anchors + C_anchors). At least one anchor token must
   be present in the input.

   Example: all tokens attend to (surface_anchors, volume_anchors). This is
   achieved via the following attention pattern::

       AttentionPattern(
           query_tokens=["surface_anchors", "surface_queries", "volume_anchors", "volume_queries"],
           key_value_tokens=["surface_anchors", "volume_anchors"]
       )

   Initialize internal Module state, shared by both nn.Module and ScriptModule.
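
The attention pattern above can be realized as a boolean attention mask in which every query row is allowed to attend only to the columns belonging to anchor groups. The following is a minimal, dependency-free sketch of that idea; the function name ``build_joint_anchor_mask`` and its signature are hypothetical illustrations, not part of the noether API:

.. code-block:: python

   def build_joint_anchor_mask(segment_sizes, anchor_keys):
       """Build a (total x total) boolean mask; True means attention is allowed.

       segment_sizes: mapping of token-group name -> token count, in sequence order
                      (dicts preserve insertion order in Python 3.7+).
       anchor_keys:   names of the groups whose tokens serve as keys/values.

       Every query token (every row) may attend only to anchor columns,
       mirroring the JointAnchorAttention pattern described above.
       """
       total = sum(segment_sizes.values())
       key_allowed = [False] * total
       offset = 0
       for name, size in segment_sizes.items():
           if name in anchor_keys:
               for i in range(offset, offset + size):
                   key_allowed[i] = True
           offset += size
       # Every row (query) sees the same set of allowed key columns.
       return [key_allowed[:] for _ in range(total)]


   # Example mirroring the docstring: two branches, surface and volume.
   segments = {
       "surface_anchors": 2,
       "surface_queries": 3,
       "volume_anchors": 2,
       "volume_queries": 3,
   }
   mask = build_joint_anchor_mask(segments, ["surface_anchors", "volume_anchors"])
   # Columns 0-1 (surface_anchors) and 5-6 (volume_anchors) are True in every
   # row; all query columns are False, so no token attends to query tokens.

In an actual attention layer such a mask would typically be converted to a tensor and passed to the attention computation (e.g. as ``attn_mask`` in ``torch.nn.functional.scaled_dot_product_attention``, where ``True`` marks allowed positions).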