noether.core.schemas.models¶
Submodules¶
Classes¶
- AnchorBranchedUPTConfig: Internal base class for all registry-based configs.
- ModelBaseConfig: Internal base class for all registry-based configs.
- TransformerConfig: Configuration for a Transformer model.
- TransolverConfig: Configuration for a Transolver model.
- TransolverPlusPlusConfig: Configuration for a Transolver++ model.
- UPTConfig: Configuration for a UPT model.
Package Contents¶
- class noether.core.schemas.models.AnchorBranchedUPTConfig(/, **data)¶
Bases: noether.core.schemas.models.base.ModelBaseConfig, noether.core.schemas.mixins.InjectSharedFieldFromParentMixin
Internal base class for all registry-based configs.
Provides auto-registration via __init_subclass__. Not meant to be used directly - use specific config base classes instead.
Create a new model by parsing and validating input data from keyword arguments.
Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)
- model_config¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- supernode_pooling_config: Annotated[noether.core.schemas.modules.encoders.SupernodePoolingConfig, noether.core.schemas.mixins.Shared] | None = None¶
- transformer_block_config: Annotated[noether.core.schemas.modules.blocks.TransformerBlockConfig, noether.core.schemas.mixins.Shared]¶
- hidden_dim¶
Hidden dimension of the model.
- physics_blocks: list[Literal['self', 'shared', 'cross', 'joint', 'perceiver']]¶
Types of physics blocks to use in the model. Options are "self", "cross", "joint", and "perceiver":
- "self": Self-attention within a branch. Attention weights are shared between all domains.
- "cross": Cross-attention between domains. Each domain attends to all other domains' anchors.
- "joint": Joint attention over all domain points (full self-attention over all points).
- "perceiver": Perceiver-style cross-attention to the geometry encoding.
Note: “shared” is a deprecated alias for “self” and will be removed in a future release.
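The deprecation note above implies a migration step at validation time. A minimal standalone sketch of that behaviour (the function name and warning text here are illustrative, not the library's internals):

```python
# Illustrative sketch, NOT the library's implementation: replace the
# deprecated "shared" physics-block alias with "self", warning the caller.
import warnings


def migrate_physics_blocks(blocks: list[str]) -> list[str]:
    """Return a copy of `blocks` with the deprecated 'shared' alias
    rewritten to 'self'."""
    if "shared" in blocks:
        warnings.warn(
            "'shared' is a deprecated alias for 'self' and will be "
            "removed in a future release",
            DeprecationWarning,
        )
    return ["self" if b == "shared" else b for b in blocks]
```

Configs written against the old alias keep working, but emit a DeprecationWarning so they can be updated before the alias is removed.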
- num_domain_decoder_blocks: dict[str, int]¶
Number of final domain-specific decoder blocks with self-attention and no weight sharing, e.g. {"surface": 2, "volume": 2}.
- init_weights: noether.core.types.InitWeightsMode = None¶
Weight initialization mode for linear layers. When None, defaults to "truncnormal002".
- data_specs: noether.core.schemas.dataset.ModelDataSpecs¶
Data specifications for the model.
Migrate the deprecated 'shared' block type to 'self'.
- set_condition_dim()¶
Set condition_dim in transformer_block_config based on data_specs.
- rope_frequency_config()¶
- pos_embed_config()¶
- bias_mlp_config()¶
- perceiver_block_config()¶
- domain_decoder_configs()¶
Per-domain decoder projection configs, keyed by domain name.
- Return type:
dict[str, noether.core.schemas.modules.layers.LinearProjectionConfig]
- validate_parameters()¶
Validate parameters across the model and its submodules.
Ensures that hidden_dim is consistent across parent and all submodules. Note: transformer_block_config validates hidden_dim % num_heads == 0 in its own validator.
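The kind of cross-field check validate_parameters() performs can be sketched as follows. This is a simplified stand-in, assuming only the two documented rules (hidden_dim consistency and divisibility by num_heads); the class and function names are hypothetical:

```python
# Simplified sketch of validate_parameters()'s documented checks; the
# BlockCfg dataclass and check_hidden_dim helper are hypothetical.
from dataclasses import dataclass


@dataclass
class BlockCfg:
    hidden_dim: int
    num_heads: int


def check_hidden_dim(parent_hidden_dim: int, blocks: list[BlockCfg]) -> None:
    """Raise ValueError unless every submodule agrees with the parent's
    hidden_dim, and each block's hidden_dim is divisible by num_heads."""
    for blk in blocks:
        if blk.hidden_dim != parent_hidden_dim:
            raise ValueError(
                f"hidden_dim mismatch: parent={parent_hidden_dim}, "
                f"submodule={blk.hidden_dim}"
            )
        if blk.hidden_dim % blk.num_heads != 0:
            raise ValueError("hidden_dim must be divisible by num_heads")
```

Running the check during config validation surfaces shape mismatches before any model weights are allocated.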
- class noether.core.schemas.models.ModelBaseConfig(/, **data)¶
Bases: noether.core.schemas.lib._RegistryBase
Internal base class for all registry-based configs.
Provides auto-registration via __init_subclass__. Not meant to be used directly - use specific config base classes instead.
Create a new model by parsing and validating input data from keyword arguments.
Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)
- optimizer_config: noether.core.schemas.optimizers.AnyOptimizerConfig | None = None¶
The optimizer configuration to use for training the model. When a model is used for inference only, this can be left as None.
- initializers: list[Annotated[noether.core.schemas.initializers.AnyInitializer, Field(discriminator='kind')]] | None = None¶
List of initializer configs to use for the model.
- forward_properties: list[str] | None = []¶
List of properties to be used as inputs for the forward pass of the model. Only relevant when the train_step of the BaseTrainer is used. When overridden in a class method, this property is ignored.
- model_config¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
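The docstring above notes that subclasses auto-register via __init_subclass__. A minimal sketch of that mechanism (the REGISTRY dict and its keying by class name are assumptions for illustration, not the library's internals):

```python
# Illustrative sketch of auto-registration via __init_subclass__.
# RegistryBase and REGISTRY are hypothetical names standing in for the
# library's internal registry base class.
class RegistryBase:
    REGISTRY: dict[str, type] = {}

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # Every subclass is recorded in the shared registry at class
        # definition time, with no explicit registration call needed.
        RegistryBase.REGISTRY[cls.__name__] = cls


class MyModelConfig(RegistryBase):
    pass
```

Because registration happens in __init_subclass__, merely defining a config subclass makes it discoverable by name, which is what allows configs to be instantiated from declarative (e.g. YAML) specifications.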
- class noether.core.schemas.models.TransformerConfig(/, **data)¶
Bases: noether.core.schemas.models.base.ModelBaseConfig, noether.core.schemas.mixins.InjectSharedFieldFromParentMixin
Configuration for a Transformer model.
Create a new model by parsing and validating input data from keyword arguments.
Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)
- model_config¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- hidden_dim¶
Hidden dimension of the model. Used for all transformer blocks.
- transformer_block_config: Annotated[noether.core.schemas.modules.blocks.TransformerBlockConfig, noether.core.schemas.mixins.Shared]¶
- class noether.core.schemas.models.TransolverConfig(/, **data)¶
Bases: noether.core.schemas.models.transformer.TransformerConfig, noether.core.schemas.models.base.ModelBaseConfig
Configuration for a Transolver model.
Create a new model by parsing and validating input data from keyword arguments.
Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)
- model_config¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- set_attention_constructor()¶
Set attention_constructor in transformer_block_config based on data_specs.
- class noether.core.schemas.models.TransolverPlusPlusConfig(/, **data)¶
Bases: TransolverConfig
Configuration for a Transolver++ model.
Create a new model by parsing and validating input data from keyword arguments.
Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)
- set_attention_constructor()¶
Set attention_constructor in transformer_block_config based on data_specs.
- class noether.core.schemas.models.UPTConfig(/, **data)¶
Bases: noether.core.schemas.models.base.ModelBaseConfig, noether.core.schemas.mixins.InjectSharedFieldFromParentMixin
Configuration for a UPT model.
Create a new model by parsing and validating input data from keyword arguments.
Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)
- model_config¶
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- hidden_dim¶
Hidden dimension of the model.
- supernode_pooling_config: Annotated[noether.core.schemas.modules.SupernodePoolingConfig, noether.core.schemas.mixins.Shared]¶
- approximator_config: Annotated[noether.core.schemas.modules.blocks.TransformerBlockConfig, noether.core.schemas.mixins.Shared]¶
- decoder_config: Annotated[noether.core.schemas.modules.DeepPerceiverDecoderConfig, noether.core.schemas.mixins.Shared]¶
- data_specs: noether.core.schemas.dataset.ModelDataSpecs¶
- linear_output_projection_config()¶
- rope_frequency_config()¶
- validate_rope_usage()¶
Ensure that if use_rope is True in the main config, it is also True in the approximator_config.
- pos_embedding_config()¶
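The rule enforced by validate_rope_usage() above can be sketched as a plain function. The field names here follow the docstring ("use_rope" at the top level and inside approximator_config); the helper itself is hypothetical:

```python
# Sketch of validate_rope_usage()'s documented rule: enabling rotary
# position embeddings at the top level requires them in the approximator
# blocks too. The validate_rope helper is hypothetical.
def validate_rope(use_rope: bool, approximator_use_rope: bool) -> None:
    """Raise ValueError if the main config requests RoPE but the
    approximator blocks do not apply it."""
    if use_rope and not approximator_use_rope:
        raise ValueError(
            "use_rope=True in the main config requires "
            "use_rope=True in approximator_config"
        )
```

The asymmetry is intentional: the approximator may use RoPE on its own, but a top-level request that the approximator silently ignores would be a misconfiguration.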