noether.core.schemas.models.ab_upt¶
Classes¶
AnchorBranchedUPTConfig | Internal base class for all registry-based configs.
Module Contents¶
- class noether.core.schemas.models.ab_upt.AnchorBranchedUPTConfig(/, **data)¶
Bases:
noether.core.schemas.models.base.ModelBaseConfig, noether.core.schemas.mixins.InjectSharedFieldFromParentMixin
Internal base class for all registry-based configs.
Provides auto-registration via __init_subclass__. Not meant to be used directly - use specific config base classes instead.
Create a new model by parsing and validating input data from keyword arguments.
Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)
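The auto-registration via `__init_subclass__` mentioned above can be sketched in plain Python. This is an illustrative stand-in, not the actual noether internals; the names `ConfigRegistryBase` and `REGISTRY` are assumptions.

```python
# Hypothetical sketch of registry-based config auto-registration.
# Every subclass registers itself in a shared class-level dict when
# it is defined, via __init_subclass__.

class ConfigRegistryBase:
    REGISTRY: dict = {}

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # Register each subclass under its class name at definition time.
        ConfigRegistryBase.REGISTRY[cls.__name__] = cls


class ModelBaseConfig(ConfigRegistryBase):
    """Specific config base class; this is what user configs subclass."""


class AnchorBranchedUPTConfig(ModelBaseConfig):
    """Concrete config; registered automatically on class creation."""
```

After these definitions, `ConfigRegistryBase.REGISTRY["AnchorBranchedUPTConfig"]` resolves to the class without any explicit registration call, which is why the base class is not meant to be used directly.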
- model_config¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- supernode_pooling_config: Annotated[noether.core.schemas.modules.encoders.SupernodePoolingConfig, noether.core.schemas.mixins.Shared] | None = None¶
- transformer_block_config: Annotated[noether.core.schemas.modules.blocks.TransformerBlockConfig, noether.core.schemas.mixins.Shared]¶
Configuration for the model’s transformer blocks, shared with submodules.
- physics_blocks: list[Literal['self', 'shared', 'cross', 'joint', 'perceiver']]¶
Types of physics blocks to use in the model. Options are “self”, “cross”, “joint”, and “perceiver”:
- “self”: Self-attention within a branch. Attention weights are shared between all domains.
- “cross”: Cross-attention between domains. Each domain attends to all other domains’ anchors.
- “joint”: Joint attention over all domain points, i.e. full self-attention over all points.
- “perceiver”: Perceiver-style cross-attention to the geometry encoding.
Note: “shared” is a deprecated alias for “self” and will be removed in a future release.
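The deprecation migration described in the note above can be sketched as a small helper. The function name `migrate_physics_blocks` is hypothetical; the real config performs this rewrite inside its own validator.

```python
import warnings


def migrate_physics_blocks(blocks):
    """Replace the deprecated 'shared' block type with 'self'.

    Illustrative sketch of the migration behavior; not the actual
    noether validator.
    """
    migrated = []
    for block in blocks:
        if block == "shared":
            # Emit a DeprecationWarning so callers can update their configs.
            warnings.warn(
                "'shared' is a deprecated alias for 'self' and will be "
                "removed in a future release",
                DeprecationWarning,
            )
            block = "self"
        migrated.append(block)
    return migrated
```

Running this on `["shared", "cross"]` yields `["self", "cross"]` while warning once per deprecated entry.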
- num_domain_decoder_blocks: dict[str, int]¶
Number of final domain-specific decoder blocks with self-attention and no weight sharing, e.g. {“surface”: 2, “volume”: 2}.
- init_weights: noether.core.types.InitWeightsMode = None¶
Weight initialization of linear layers. If None, defaults to “truncnormal002”.
- data_specs: noether.core.schemas.dataset.ModelDataSpecs¶
Data specifications for the model.
Migrate deprecated ‘shared’ block type to ‘self’.
- set_condition_dim()¶
Set condition_dim in transformer_block_config based on data_specs.
- rope_frequency_config()¶
- pos_embed_config()¶
- bias_mlp_config()¶
- perceiver_block_config()¶
- domain_decoder_configs()¶
Per-domain decoder projection configs, keyed by domain name.
- Return type:
dict[str, noether.core.schemas.modules.layers.LinearProjectionConfig]
- validate_parameters()¶
Validate parameters across the model and its submodules.
Ensures that hidden_dim is consistent across the parent and all submodules. Note: transformer_block_config validates hidden_dim % num_heads == 0 in its own validator.
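The consistency checks performed by validate_parameters() can be sketched as a standalone helper. The name `validate_hidden_dims` and its signature are assumptions for illustration; the real validation lives inside the pydantic config classes.

```python
def validate_hidden_dims(hidden_dim, submodule_dims, num_heads):
    """Hypothetical sketch of the cross-module hidden_dim checks.

    Mirrors the checks described above: every submodule must agree
    with the parent's hidden_dim, and hidden_dim must be divisible
    by num_heads.
    """
    for name, dim in submodule_dims.items():
        if dim != hidden_dim:
            raise ValueError(
                f"{name}.hidden_dim={dim} does not match model hidden_dim={hidden_dim}"
            )
    # In the real config this divisibility check is done by
    # transformer_block_config's own validator.
    if hidden_dim % num_heads != 0:
        raise ValueError(
            f"hidden_dim={hidden_dim} is not divisible by num_heads={num_heads}"
        )
```

For example, `validate_hidden_dims(192, {"transformer_block_config": 192}, num_heads=3)` passes, while a submodule reporting a different hidden_dim raises a ValueError.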