noether.core.schemas.models

Classes

AnchorBranchedUPTConfig

Configuration for an Anchor-Branched UPT model.

ModelBaseConfig

Base configuration shared by all model configurations.

TransformerConfig

Configuration for a Transformer model.

TransolverConfig

Configuration for a Transolver model.

TransolverPlusPlusConfig

Configuration for a Transolver++ model.

UPTConfig

Configuration for a UPT model.

Package Contents

class noether.core.schemas.models.AnchorBranchedUPTConfig(/, **data)

Bases: noether.core.schemas.models.base.ModelBaseConfig, noether.core.schemas.mixins.InjectSharedFieldFromParentMixin

Parameters:

data (Any)

model_config

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

supernode_pooling_config: Annotated[noether.core.schemas.modules.encoders.SupernodePoolingConfig, noether.core.schemas.mixins.Shared]
transformer_block_config: Annotated[noether.core.schemas.modules.blocks.TransformerBlockConfig, noether.core.schemas.mixins.Shared]
geometry_depth: int = None

Number of transformer blocks in the geometry encoder.

hidden_dim: int = None

Hidden dimension of the model.

physics_blocks: list[Literal['shared', 'cross', 'joint', 'perceiver']]

Types of physics blocks to use in the model. Options are “shared”, “cross”, “joint”, and “perceiver”:

- “shared”: Self-attention within a branch (surface or volume); attention blocks share weights between the surface and volume branches.
- “cross”: Cross-attention between the surface and volume branches; weights are shared between the two.
- “joint”: Joint attention over surface and volume points, i.e. full self-attention over both point sets.
- “perceiver”: Perceiver-style attention blocks.

num_surface_blocks: int = None

Number of transformer blocks in the surface decoder. Weights are not shared with the volume decoder.

num_volume_blocks: int = None

Number of transformer blocks in the volume decoder. Weights are not shared with the surface decoder.

init_weights: noether.core.types.InitWeightsMode = None

Weight initialization of linear layers. Defaults to “truncnormal002”.

drop_path_rate: float = None

Drop path rate for stochastic depth. Defaults to 0.0 (no drop path).

data_specs: noether.core.schemas.dataset.AeroDataSpecs

Data specifications for the model.

set_condition_dim()

Set condition_dim in transformer_block_config based on data_specs.

Return type:

AnchorBranchedUPTConfig

rope_frequency_config()
Return type:

noether.core.schemas.modules.layers.RopeFrequencyConfig

pos_embed_config()
Return type:

noether.core.schemas.modules.layers.ContinuousSincosEmbeddingConfig

bias_mlp_config()
Return type:

noether.core.schemas.modules.mlp.MLPConfig

perceiver_block_config()
Return type:

noether.core.schemas.modules.blocks.PerceiverBlockConfig

surface_decoder_config()
Return type:

noether.core.schemas.modules.layers.LinearProjectionConfig

volume_decoder_config()
Return type:

noether.core.schemas.modules.layers.LinearProjectionConfig | None

validate_parameters()

Validate parameters across the model and its submodules.

Ensures that hidden_dim is consistent across parent and all submodules. Note: transformer_block_config validates hidden_dim % num_heads == 0 in its own validator.

Return type:

AnchorBranchedUPTConfig
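The consistency check above can be sketched as a plain function (names and error messages are illustrative, not the actual implementation; the divisibility of hidden_dim by num_heads is handled separately by the transformer block's own validator):

```python
def validate_hidden_dim(parent_hidden_dim: int, submodule_dims: dict[str, int]) -> None:
    """Sketch of a cross-submodule consistency check: every submodule
    that defines a hidden_dim must agree with the parent config."""
    for name, dim in submodule_dims.items():
        if dim != parent_hidden_dim:
            raise ValueError(
                f"hidden_dim mismatch: parent={parent_hidden_dim}, {name}={dim}"
            )

# Consistent dims pass silently; a mismatch raises ValueError.
validate_hidden_dim(192, {"transformer_block_config": 192})
```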

class noether.core.schemas.models.ModelBaseConfig(/, **data)

Bases: pydantic.BaseModel

Parameters:

data (Any)

kind: str

Kind of model to use, i.e. the model's class path.

name: str

Name of the model. Must be unique.

optimizer_config: noether.core.schemas.optimizers.OptimizerConfig | None = None

The optimizer configuration to use for training the model. When a model is used for inference only, this can be left as None.

initializers: list[Annotated[noether.core.schemas.initializers.AnyInitializer, Field(discriminator='kind')]] | None = None

List of initializer configs to use for the model.

is_frozen: bool | None = False

Whether to freeze the model parameters (i.e., not trainable).

forward_properties: list[str] | None = []

List of properties to be used as inputs for the forward pass of the model. Only relevant when the train_step of the BaseTrainer is used. When overridden in a class method, this property is ignored.

model_config

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

property config_kind: str

The fully qualified import path for the configuration class.

Return type:

str
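As a rough illustration of the base fields above, here is a plain-dataclass stand-in (not the actual pydantic model; the kind path and name below are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class ModelBaseSketch:
    kind: str                   # class path of the model to instantiate
    name: str                   # must be unique among configured models
    is_frozen: bool = False     # frozen models are not trainable
    forward_properties: list[str] = field(default_factory=list)

# Inference-only use: optimizer settings can be omitted entirely.
cfg = ModelBaseSketch(kind="my_package.models.Transformer", name="transformer-0")
assert not cfg.is_frozen
```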

class noether.core.schemas.models.TransformerConfig(/, **data)

Bases: noether.core.schemas.models.base.ModelBaseConfig, noether.core.schemas.mixins.InjectSharedFieldFromParentMixin

Configuration for a Transformer model.

Create a new model by parsing and validating input data from keyword arguments.

Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

model_config

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

hidden_dim: int = None

Hidden dimension of the model. Used for all transformer blocks.

depth: int = None

Number of transformer blocks in the model.

transformer_block_config: Annotated[noether.core.schemas.modules.blocks.TransformerBlockConfig, noether.core.schemas.mixins.Shared]

class noether.core.schemas.models.TransolverConfig(/, **data)

Bases: noether.core.schemas.models.transformer.TransformerConfig, noether.core.schemas.models.base.ModelBaseConfig

Configuration for a Transolver model.

Create a new model by parsing and validating input data from keyword arguments.

Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

model_config

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

attention_constructor: Literal['transolver', 'transolver_plusplus'] = 'transolver'
attention_arguments: dict
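For illustration, selecting the attention variant might look like the following config fragment (only attention_constructor and attention_arguments mirror the fields documented above; the slice_num argument is an assumption, not a documented key):

```python
# Hypothetical config payload for a Transolver model; attention_arguments
# holds keyword arguments forwarded to the chosen attention constructor.
transolver_payload = {
    "attention_constructor": "transolver_plusplus",
    "attention_arguments": {"slice_num": 32},  # assumed constructor kwarg
}
assert transolver_payload["attention_constructor"] in ("transolver", "transolver_plusplus")
```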

class noether.core.schemas.models.TransolverPlusPlusConfig(/, **data)

Bases: TransolverConfig

Configuration for a Transolver++ model.

Create a new model by parsing and validating input data from keyword arguments.

Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

attention_constructor: Literal['transolver_plusplus'] = 'transolver_plusplus'

class noether.core.schemas.models.UPTConfig(/, **data)

Bases: noether.core.schemas.models.base.ModelBaseConfig, noether.core.schemas.mixins.InjectSharedFieldFromParentMixin

Configuration for a UPT model.

Create a new model by parsing and validating input data from keyword arguments.

Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

model_config

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

num_heads: int = None

Number of attention heads in the model.

hidden_dim: int = None

Hidden dimension of the model.

mlp_expansion_factor: int = None

Expansion factor for the MLP in the feed-forward layers.

approximator_depth: int = None

Number of approximator layers.

use_rope: bool = None

Whether to use rotary position embeddings (RoPE).

supernode_pooling_config: Annotated[noether.core.schemas.modules.SupernodePoolingConfig, noether.core.schemas.mixins.Shared]
approximator_config: Annotated[noether.core.schemas.modules.blocks.TransformerBlockConfig, noether.core.schemas.mixins.Shared]
decoder_config: Annotated[noether.core.schemas.modules.DeepPerceiverDecoderConfig, noether.core.schemas.mixins.Shared]
bias_layers: bool = None
data_specs: noether.core.schemas.dataset.AeroDataSpecs

Data specifications for the model.

linear_output_projection_config()
Return type:

noether.core.schemas.modules.layers.LinearProjectionConfig

rope_frequency_config()
Return type:

noether.core.schemas.modules.layers.RopeFrequencyConfig

validate_rope_usage()

Ensure that if use_rope is True in the main config, it is also True in the approximator_config.

Return type:

UPTConfig
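A minimal sketch of this kind of propagation check (function and argument names are illustrative):

```python
def validate_rope_usage(use_rope: bool, approximator_use_rope: bool) -> None:
    # If the model enables RoPE, the approximator blocks must enable it too;
    # otherwise positional information would be silently dropped.
    if use_rope and not approximator_use_rope:
        raise ValueError(
            "use_rope=True requires approximator_config.use_rope=True"
        )

validate_rope_usage(use_rope=True, approximator_use_rope=True)  # passes
```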

update_supernode_pooling_config()

Inject shared fields into supernode_pooling_config.

Return type:

UPTConfig

pos_embedding_config()
Return type:

noether.core.schemas.modules.layers.ContinuousSincosEmbeddingConfig

validate_parameters()

Validate parameters across the model and its submodules.

Ensures that:

1. hidden_dim is divisible by num_heads in the parent and in all submodules that define num_heads.
2. hidden_dim is consistent across the parent and all submodules.

Return type:

UPTConfig