noether.core.schemas.models¶
Submodules¶
Classes¶
| TransformerConfig | Configuration for a Transformer model. |
| TransolverConfig | Configuration for a Transolver model. |
| TransolverPlusPlusConfig | Configuration for a Transolver++ model. |
| UPTConfig | Configuration for a UPT model. |
Package Contents¶
- class noether.core.schemas.models.AnchorBranchedUPTConfig(/, **data)¶
Bases: noether.core.schemas.models.base.ModelBaseConfig, noether.core.schemas.mixins.InjectSharedFieldFromParentMixin
- Parameters:
data (Any)
- model_config¶
Configuration for the model; should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- supernode_pooling_config: Annotated[noether.core.schemas.modules.encoders.SupernodePoolingConfig, noether.core.schemas.mixins.Shared]¶
- transformer_block_config: Annotated[noether.core.schemas.modules.blocks.TransformerBlockConfig, noether.core.schemas.mixins.Shared]¶
- hidden_dim: int¶
Hidden dimension of the model.
- physics_blocks: list[Literal['shared', 'cross', 'joint', 'perceiver']]¶
Types of physics blocks to use in the model. Options:
- “shared”: self-attention within a branch (surface or volume); attention blocks share weights between surface and volume.
- “cross”: cross-attention between the surface and volume branches; weights are shared between surface and volume.
- “joint”: joint attention over surface and volume points, i.e. full self-attention over both surface and volume points.
- “perceiver”: Perceiver-style attention blocks.
- num_surface_blocks: int = None¶
Number of transformer blocks in the surface decoder. Weights are not shared with the volume decoder.
- num_volume_blocks: int = None¶
Number of transformer blocks in the volume decoder. Weights are not shared with the surface decoder.
- init_weights: noether.core.types.InitWeightsMode = None¶
Weight initialization of linear layers. Defaults to “truncnormal002”.
- data_specs: noether.core.schemas.dataset.AeroDataSpecs¶
Data specifications for the model.
- set_condition_dim()¶
Set condition_dim in transformer_block_config based on data_specs.
- rope_frequency_config()¶
- pos_embed_config()¶
- bias_mlp_config()¶
- perceiver_block_config()¶
- surface_decoder_config()¶
- volume_decoder_config()¶
- Return type:
noether.core.schemas.modules.layers.LinearProjectionConfig | None
- validate_parameters()¶
Validate consistency of parameters across the model and its submodules.
Ensures that hidden_dim is consistent across parent and all submodules. Note: transformer_block_config validates hidden_dim % num_heads == 0 in its own validator.
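The two rules above (the documented `physics_blocks` options and the `hidden_dim` consistency enforced by `validate_parameters`) can be sketched in plain Python. The helper names and dict layout below are illustrative stand-ins, not part of the noether API:

```python
# Allowed kinds, taken from the Literal options documented for physics_blocks.
ALLOWED_PHYSICS_BLOCKS = {"shared", "cross", "joint", "perceiver"}


def check_physics_blocks(blocks):
    """Reject any block kind outside the four documented options."""
    bad = [b for b in blocks if b not in ALLOWED_PHYSICS_BLOCKS]
    if bad:
        raise ValueError(f"unknown physics block kinds: {bad}")
    return blocks


def check_hidden_dim(parent_dim, submodule_dims):
    """Require every submodule to share the parent's hidden_dim,
    mirroring what validate_parameters checks."""
    mismatched = {name: d for name, d in submodule_dims.items() if d != parent_dim}
    if mismatched:
        raise ValueError(
            f"hidden_dim mismatch: parent={parent_dim}, submodules={mismatched}"
        )
    return parent_dim
```

In the real config, `validate_parameters` runs as a pydantic validator, so a mismatch surfaces as a `ValidationError` at construction time rather than as a separate call.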
- class noether.core.schemas.models.ModelBaseConfig(/, **data)¶
Bases: pydantic.BaseModel
- Parameters:
data (Any)
- optimizer_config: noether.core.schemas.optimizers.OptimizerConfig | None = None¶
The optimizer configuration to use for training the model. When a model is used for inference only, this can be left as None.
- initializers: list[Annotated[noether.core.schemas.initializers.AnyInitializer, Field(discriminator='kind')]] | None = None¶
List of initializer configs to use for the model.
- forward_properties: list[str] | None = []¶
List of properties to be used as inputs for the forward pass of the model. Only relevant when the train_step of the BaseTrainer is used; ignored when overridden in a class method.
- model_config¶
Configuration for the model; should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
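The `initializers` field above uses pydantic's discriminated-union pattern (`Field(discriminator='kind')`): each config carries a `kind` key that selects which initializer model validates it. A minimal stand-alone sketch of that dispatch, with plain dicts and hypothetical kind names in place of the actual `AnyInitializer` models:

```python
def select_initializer(config, registry):
    """Dispatch an initializer config on its 'kind' discriminator key."""
    kind = config["kind"]
    if kind not in registry:
        raise KeyError(f"unknown initializer kind: {kind!r}")
    return registry[kind](config)


# Hypothetical initializer kinds; stand-ins for the models behind AnyInitializer.
registry = {
    "truncnormal": lambda cfg: ("truncnormal", cfg.get("std", 0.02)),
    "zeros": lambda cfg: ("zeros", None),
}
```

With pydantic, the discriminator makes validation both faster and more precise: an invalid payload is reported against the one variant its `kind` selects, rather than against every member of the union.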
- class noether.core.schemas.models.TransformerConfig(/, **data)¶
Bases: noether.core.schemas.models.base.ModelBaseConfig, noether.core.schemas.mixins.InjectSharedFieldFromParentMixin
Configuration for a Transformer model.
Create a new model by parsing and validating input data from keyword arguments.
Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)
- model_config¶
Configuration for the model; should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- hidden_dim: int¶
Hidden dimension of the model. Used for all transformer blocks.
- transformer_block_config: Annotated[noether.core.schemas.modules.blocks.TransformerBlockConfig, noether.core.schemas.mixins.Shared]¶
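Fields annotated with `Shared` (via `InjectSharedFieldFromParentMixin`) receive values such as `hidden_dim` from the parent config, so they need not be repeated in every submodule config. A minimal sketch of that injection idea, using plain dicts as stand-ins for the pydantic models:

```python
def inject_shared(parent, child, shared_fields=("hidden_dim",)):
    """Copy shared fields from the parent config into a child config
    that leaves them unset (None). Illustrative only; the real mixin
    works on pydantic models, not dicts."""
    merged = dict(child)
    for field in shared_fields:
        if merged.get(field) is None and field in parent:
            merged[field] = parent[field]
    return merged
```

This keeps a single source of truth for dimensions shared across submodules; `validate_parameters`-style checks can then confirm nothing drifted out of sync.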
- class noether.core.schemas.models.TransolverConfig(/, **data)¶
Bases: noether.core.schemas.models.transformer.TransformerConfig, noether.core.schemas.models.base.ModelBaseConfig
Configuration for a Transolver model.
- Parameters:
data (Any)
- model_config¶
Configuration for the model; should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- attention_constructor: Literal['transolver', 'transolver_plusplus'] = 'transolver'¶
- class noether.core.schemas.models.TransolverPlusPlusConfig(/, **data)¶
Bases: TransolverConfig
Configuration for a Transolver++ model.
- Parameters:
data (Any)
- attention_constructor: Literal['transolver_plusplus'] = 'transolver_plusplus'¶
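Note how `TransolverPlusPlusConfig` narrows the `attention_constructor` Literal of its parent: `TransolverConfig` accepts either value, while the subclass only validates `'transolver_plusplus'`. A plain-Python stand-in for pydantic's Literal validation makes the narrowing concrete:

```python
# Allowed values, taken from the Literal annotations documented above.
TRANSOLVER_CHOICES = ("transolver", "transolver_plusplus")
PLUSPLUS_CHOICES = ("transolver_plusplus",)


def validate_choice(value, allowed):
    """Plain-Python stand-in for pydantic's Literal field validation."""
    if value not in allowed:
        raise ValueError(f"{value!r} is not one of {allowed}")
    return value
```

In pydantic itself the same narrowing happens automatically: constructing `TransolverPlusPlusConfig(attention_constructor='transolver', ...)` would raise a `ValidationError`.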
- class noether.core.schemas.models.UPTConfig(/, **data)¶
Bases: noether.core.schemas.models.base.ModelBaseConfig, noether.core.schemas.mixins.InjectSharedFieldFromParentMixin
Configuration for a UPT model.
- Parameters:
data (Any)
- model_config¶
Configuration for the model; should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- hidden_dim: int¶
Hidden dimension of the model.
- supernode_pooling_config: Annotated[noether.core.schemas.modules.SupernodePoolingConfig, noether.core.schemas.mixins.Shared]¶
- approximator_config: Annotated[noether.core.schemas.modules.blocks.TransformerBlockConfig, noether.core.schemas.mixins.Shared]¶
- decoder_config: Annotated[noether.core.schemas.modules.DeepPerceiverDecoderConfig, noether.core.schemas.mixins.Shared]¶
- data_specs: noether.core.schemas.dataset.AeroDataSpecs¶
- linear_output_projection_config()¶
- rope_frequency_config()¶
- validate_rope_usage()¶
Ensure that if use_rope is True in the main config, it is also True in the approximator_config.
- update_supernode_pooling_config()¶
Inject shared fields into supernode_pooling_config.
- pos_embedding_config()¶
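The `validate_rope_usage` rule above can be sketched in isolation: if rotary position embeddings are enabled on the main UPT config, the nested `approximator_config` must enable them too. The dict layout below is illustrative, not the exact noether schema; only the `use_rope` and `approximator_config` names come from the documentation:

```python
def validate_rope_usage(config):
    """If use_rope is enabled on the model, require it in the
    approximator_config as well (mirrors the documented validator)."""
    if config.get("use_rope") and not config["approximator_config"].get("use_rope"):
        raise ValueError(
            "use_rope=True on the model requires use_rope=True in approximator_config"
        )
    return config
```

As with `validate_parameters` on `AnchorBranchedUPTConfig`, the real check runs inside pydantic validation, so an inconsistent pair of flags fails at config construction rather than at model run time.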