noether.core.schemas.models.ab_upt¶
Classes¶
Module Contents¶
- class noether.core.schemas.models.ab_upt.AnchorBranchedUPTConfig(/, **data)¶
Bases: noether.core.schemas.models.base.ModelBaseConfig, noether.core.schemas.mixins.InjectSharedFieldFromParentMixin
- Parameters:
data (Any)
- model_config¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- supernode_pooling_config: Annotated[noether.core.schemas.modules.encoders.SupernodePoolingConfig, noether.core.schemas.mixins.Shared]¶
- transformer_block_config: Annotated[noether.core.schemas.modules.blocks.TransformerBlockConfig, noether.core.schemas.mixins.Shared]¶
- hidden_dim: int¶
Hidden dimension of the model.
- physics_blocks: list[Literal['shared', 'cross', 'joint', 'perceiver']]¶
Types of physics blocks to use in the model. Options:
- “shared”: Self-attention within a branch (surface or volume); attention blocks share weights between surface and volume.
- “cross”: Cross-attention between the surface and volume branches; weights are shared between surface and volume.
- “joint”: Joint attention over surface and volume points, i.e. full self-attention over both surface and volume points.
- “perceiver”: Perceiver-style attention blocks.
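The field accepts any ordering of these four literals, so a schedule like alternating shared and cross blocks is expressed directly as a list. A minimal sketch of the kind of validation the Literal annotation implies (the helper name `validate_physics_blocks` is hypothetical, not part of the library):

```python
from typing import Literal, get_args

PhysicsBlock = Literal["shared", "cross", "joint", "perceiver"]
VALID_BLOCKS = set(get_args(PhysicsBlock))

def validate_physics_blocks(blocks: list[str]) -> list[str]:
    # Reject any entry outside the four allowed block types,
    # mirroring what pydantic does for list[Literal[...]].
    bad = [b for b in blocks if b not in VALID_BLOCKS]
    if bad:
        raise ValueError(f"unknown physics block types: {bad}")
    return blocks

# A plausible interleaving of shared and cross blocks, ending with a joint block.
validate_physics_blocks(["shared", "cross", "shared", "cross", "joint"])
```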
- num_surface_blocks: int = None¶
Number of transformer blocks in the surface decoder. Weights are not shared with the volume decoder.
- num_volume_blocks: int = None¶
Number of transformer blocks in the volume decoder. Weights are not shared with the surface decoder.
- init_weights: noether.core.types.InitWeightsMode = None¶
Weight initialization of linear layers. Defaults to “truncnormal002”.
- data_specs: noether.core.schemas.dataset.AeroDataSpecs¶
Data specifications for the model.
- set_condition_dim()¶
Set condition_dim in transformer_block_config based on data_specs.
- Return type:
- rope_frequency_config()¶
- pos_embed_config()¶
- bias_mlp_config()¶
- Return type:
- perceiver_block_config()¶
- surface_decoder_config()¶
- volume_decoder_config()¶
- Return type:
noether.core.schemas.modules.layers.LinearProjectionConfig | None
- validate_parameters()¶
Validate parameters across the model and its submodules.
Ensures that hidden_dim is consistent across the parent and all submodules. Note: transformer_block_config validates hidden_dim % num_heads == 0 in its own validator.
- Return type:
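The divisibility constraint delegated to transformer_block_config can be illustrated in isolation. The helper below is a hypothetical sketch of that check, not the library's validator:

```python
def check_attention_dims(hidden_dim: int, num_heads: int) -> int:
    # The hidden dimension must split evenly across attention heads;
    # this is the hidden_dim % num_heads == 0 constraint mentioned above.
    if hidden_dim % num_heads != 0:
        raise ValueError(
            f"hidden_dim ({hidden_dim}) must be divisible by num_heads ({num_heads})"
        )
    return hidden_dim // num_heads  # per-head dimension

assert check_attention_dims(768, 12) == 64
```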