noether.core.schemas.models

Submodules

Classes

AnchorBranchedUPTConfig

Configuration for an anchor-branched UPT model.

ModelBaseConfig

Internal base class for all registry-based configs.

TransformerConfig

Configuration for a Transformer model.

TransolverConfig

Configuration for a Transolver model.

TransolverPlusPlusConfig

Configuration for a Transolver++ model.

UPTConfig

Configuration for a UPT model.

Package Contents

class noether.core.schemas.models.AnchorBranchedUPTConfig(/, **data)

Bases: noether.core.schemas.models.base.ModelBaseConfig, noether.core.schemas.mixins.InjectSharedFieldFromParentMixin

Configuration for an anchor-branched UPT model.

Create a new model by parsing and validating input data from keyword arguments.

Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

model_config

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

supernode_pooling_config: Annotated[noether.core.schemas.modules.encoders.SupernodePoolingConfig, noether.core.schemas.mixins.Shared] | None = None
transformer_block_config: Annotated[noether.core.schemas.modules.blocks.TransformerBlockConfig, noether.core.schemas.mixins.Shared]
geometry_depth: int = None

Number of transformer blocks in the geometry encoder.

hidden_dim: int = None

Hidden dimension of the model.

physics_blocks: list[Literal['self', 'shared', 'cross', 'joint', 'perceiver']]

Types of physics blocks to use in the model. Options are “self”, “cross”, “joint”, and “perceiver”:

“self”: Self-attention within a branch; attention weights are shared between all domains.

“cross”: Cross-attention between domains; each domain attends to all other domains’ anchors.

“joint”: Joint attention over all domain points, i.e. full self-attention over all points.

“perceiver”: Perceiver-style cross-attention to the geometry encoding.

Note: “shared” is a deprecated alias for “self” and will be removed in a future release.

num_domain_decoder_blocks: dict[str, int]

Number of final domain-specific decoder blocks with self-attention and no weight sharing, e.g. {“surface”: 2, “volume”: 2}.

init_weights: noether.core.types.InitWeightsMode = None

Weight initialization of linear layers. Defaults to “truncnormal002”.

drop_path_rate: float = None

Drop path rate for stochastic depth. Defaults to 0.0 (no drop path).

data_specs: noether.core.schemas.dataset.ModelDataSpecs

Data specifications for the model.

migrate_shared_to_self()

Migrate deprecated ‘shared’ block type to ‘self’.

Return type:

AnchorBranchedUPTConfig
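The migration performed by this validator amounts to a simple alias rewrite over the physics_blocks list. A minimal standalone sketch (not the actual implementation, which operates on the validated pydantic model):

```python
# Deprecated block-type aliases; "shared" is a deprecated alias for "self".
DEPRECATED_ALIASES = {"shared": "self"}


def migrate_shared_to_self(physics_blocks: list[str]) -> list[str]:
    """Replace deprecated block-type aliases with their canonical names."""
    return [DEPRECATED_ALIASES.get(block, block) for block in physics_blocks]
```

For example, a config declaring ["shared", "cross"] would be normalized to ["self", "cross"] before model construction.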

set_condition_dim()

Set condition_dim in transformer_block_config based on data_specs.

Return type:

AnchorBranchedUPTConfig

rope_frequency_config()
Return type:

noether.core.schemas.modules.layers.RopeFrequencyConfig

pos_embed_config()
Return type:

noether.core.schemas.modules.layers.ContinuousSincosEmbeddingConfig

bias_mlp_config()
Return type:

noether.core.schemas.modules.mlp.MLPConfig

perceiver_block_config()
Return type:

noether.core.schemas.modules.blocks.PerceiverBlockConfig

domain_decoder_configs()

Per-domain decoder projection configs, keyed by domain name.

Return type:

dict[str, noether.core.schemas.modules.layers.LinearProjectionConfig]

validate_parameters()

Validate parameter consistency across the model and its submodules.

Ensures that hidden_dim is consistent across the parent and all submodules. Note: transformer_block_config validates hidden_dim % num_heads == 0 in its own validator.

Return type:

AnchorBranchedUPTConfig

class noether.core.schemas.models.ModelBaseConfig(/, **data)

Bases: noether.core.schemas.lib._RegistryBase

Internal base class for all registry-based configs.

Provides auto-registration via __init_subclass__. Not meant to be used directly - use specific config base classes instead.

Create a new model by parsing and validating input data from keyword arguments.

Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

kind: str | None = None

Kind of model to use, i.e. the class path.

name: str

Name of the model. Must be unique.

optimizer_config: noether.core.schemas.optimizers.AnyOptimizerConfig | None = None

The optimizer configuration to use for training the model. When a model is used for inference only, this can be left as None.

initializers: list[Annotated[noether.core.schemas.initializers.AnyInitializer, Field(discriminator='kind')]] | None = None

List of initializer configs to use for the model.

is_frozen: bool | None = False

Whether to freeze the model parameters (i.e., not trainable).

forward_properties: list[str] | None = []

List of properties to be used as inputs for the forward pass of the model. Only relevant when the train_step of the BaseTrainer is used. When overridden in a class method, this property is ignored.

model_config

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

property config_kind: str

The fully qualified import path for the configuration class.

Return type:

str
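The value returned by config_kind can be computed generically from a class's module and qualified name. A sketch, assuming the standard `__module__`/`__qualname__` attributes:

```python
def fully_qualified_path(cls: type) -> str:
    """Build the dotted import path for a class, e.g. 'collections.OrderedDict'."""
    return f"{cls.__module__}.{cls.__qualname__}"


from collections import OrderedDict  # stdlib class used purely for illustration
```

This is the same dotted-path convention used by the kind field to select a registered config class.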

class noether.core.schemas.models.TransformerConfig(/, **data)

Bases: noether.core.schemas.models.base.ModelBaseConfig, noether.core.schemas.mixins.InjectSharedFieldFromParentMixin

Configuration for a Transformer model.

Create a new model by parsing and validating input data from keyword arguments.

Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

model_config

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

hidden_dim: int = None

Hidden dimension of the model. Used for all transformer blocks.

depth: int = None

Number of transformer blocks in the model.

transformer_block_config: Annotated[noether.core.schemas.modules.blocks.TransformerBlockConfig, noether.core.schemas.mixins.Shared]

class noether.core.schemas.models.TransolverConfig(/, **data)

Bases: noether.core.schemas.models.transformer.TransformerConfig, noether.core.schemas.models.base.ModelBaseConfig

Configuration for a Transolver model.

Create a new model by parsing and validating input data from keyword arguments.

Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

model_config

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

attention_arguments: dict
set_attention_constructor()

Set attention_constructor in transformer_block_config based on data_specs.

Return type:

TransolverConfig

class noether.core.schemas.models.TransolverPlusPlusConfig(/, **data)

Bases: TransolverConfig

Configuration for a Transolver++ model.

Create a new model by parsing and validating input data from keyword arguments.

Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

set_attention_constructor()

Set attention_constructor in transformer_block_config based on data_specs.

Return type:

TransolverPlusPlusConfig

class noether.core.schemas.models.UPTConfig(/, **data)

Bases: noether.core.schemas.models.base.ModelBaseConfig, noether.core.schemas.mixins.InjectSharedFieldFromParentMixin

Configuration for a UPT model.

Create a new model by parsing and validating input data from keyword arguments.

Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

model_config

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

num_heads: int = None

Number of attention heads in the model.

hidden_dim: int = None

Hidden dimension of the model.

mlp_expansion_factor: int = None

Expansion factor for the MLPs in the feed-forward layers.

approximator_depth: int = None

Number of approximator layers.

use_rope: bool = None

Whether to use rotary position embeddings (RoPE).

supernode_pooling_config: Annotated[noether.core.schemas.modules.SupernodePoolingConfig, noether.core.schemas.mixins.Shared]
approximator_config: Annotated[noether.core.schemas.modules.blocks.TransformerBlockConfig, noether.core.schemas.mixins.Shared]
decoder_config: Annotated[noether.core.schemas.modules.DeepPerceiverDecoderConfig, noether.core.schemas.mixins.Shared]
bias_layers: bool = None
data_specs: noether.core.schemas.dataset.ModelDataSpecs
linear_output_projection_config()
Return type:

noether.core.schemas.modules.layers.LinearProjectionConfig

rope_frequency_config()
Return type:

noether.core.schemas.modules.layers.RopeFrequencyConfig

validate_rope_usage()

Ensure that if use_rope is True in the main config, it is also True in the approximator_config.

Return type:

UPTConfig
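The invariant enforced here is a one-way implication: parent-level RoPE requires approximator-level RoPE, but not vice versa. A standalone sketch of the check, with hypothetical plain-bool arguments in place of the nested config objects:

```python
def validate_rope_usage(use_rope: bool, approximator_use_rope: bool) -> None:
    """Raise if RoPE is enabled on the parent config but not in the approximator."""
    if use_rope and not approximator_use_rope:
        raise ValueError(
            "use_rope=True requires approximator_config.use_rope=True"
        )
```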

pos_embedding_config()
Return type:

noether.core.schemas.modules.layers.ContinuousSincosEmbeddingConfig

validate_parameters()

Validate parameter consistency across the model and its submodules.

Ensures that:

1. hidden_dim is divisible by num_heads in the parent and in all submodules that define num_heads.

2. hidden_dim is consistent across the parent and all submodules.

Return type:

UPTConfig
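The two checks above can be sketched as a standalone function over plain values (a minimal illustration with hypothetical arguments; the real validator walks the nested config objects):

```python
def validate_parameters(
    hidden_dim: int,
    num_heads: int,
    submodule_dims: dict[str, int],
) -> None:
    """Check head divisibility and hidden_dim consistency across submodules."""
    # 1. hidden_dim must be evenly divisible by the number of attention heads.
    if hidden_dim % num_heads != 0:
        raise ValueError(
            f"hidden_dim {hidden_dim} is not divisible by num_heads {num_heads}"
        )
    # 2. every submodule must agree with the parent's hidden_dim.
    for name, dim in submodule_dims.items():
        if dim != hidden_dim:
            raise ValueError(
                f"submodule {name!r} has hidden_dim {dim}, expected {hidden_dim}"
            )
```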