noether.modeling.models¶
Submodules¶
Classes¶
| Class | Description |
|---|---|
| `AnchoredBranchedUPT` | Implementation of the Anchored Branched UPT model, including input embedding and output projection, so it can be used off the shelf with the appropriate input tensors. |
| `AeroABUPT` | Aerodynamic Anchored-Branched UPT wrapper. |
| `AeroTransformer` | Aerodynamic Transformer wrapper. |
| `AeroTransformerConfig` | Transformer config extended with aerodynamic data specifications. |
| `AeroTransolver` | Aerodynamic Transolver wrapper. |
| `AeroTransolverConfig` | Transolver config extended with aerodynamic data specifications. |
| `AeroUPT` | Aerodynamic UPT wrapper. |
| `Transformer` | Implementation of a Transformer model. |
| `UPT` | Implementation of the UPT (Universal Physics Transformer) model. |
Package Contents¶
- class noether.modeling.models.AnchoredBranchedUPT(config)¶
Bases:
torch.nn.Module

Implementation of the Anchored Branched UPT model. It includes input embedding and output projection, so it can be used off the shelf by providing the appropriate input tensors.
- Parameters:
config (noether.core.schemas.models.AnchorBranchedUPTConfig) – Configuration for the AB-UPT model. See AnchorBranchedUPTConfig for details.
- data_specs¶
- rope¶
- pos_embed¶
- domain_biases¶
- physics_blocks¶
- use_geometry_branch = False¶
- domain_decoder_blocks¶
- domain_decoder_projections¶
- geometry_branch_forward(geometry_position, geometry_supernode_idx, geometry_batch_idx, condition, geometry_attn_kwargs)¶
Forward pass through the geometry branch of the model.
- Parameters:
geometry_position (torch.Tensor)
geometry_supernode_idx (torch.Tensor)
geometry_batch_idx (torch.Tensor)
condition (torch.Tensor | None)
geometry_attn_kwargs (dict[str, torch.Tensor])
- Return type:
- physics_blocks_forward(domain_positions_all, geometry_encoding, physics_token_specs, physics_attn_kwargs, physics_perceiver_attn_kwargs, condition, kv_cache=None)¶
Forward pass through the physics blocks of the model.
- Parameters:
domain_positions_all (dict[str, torch.Tensor])
geometry_encoding (torch.Tensor | None)
physics_token_specs (list[noether.core.schemas.modules.attention.TokenSpec])
condition (torch.Tensor | None)
kv_cache (ModelKVCache | None)
- Return type:
tuple[torch.Tensor, list[LayerCache]]
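The `(output, layer_caches)` return pattern above can be sketched in plain Python. This is an illustrative stand-in, not the noether implementation: each block consumes an optional per-layer cache and returns its output plus an updated cache, and the caches are collected so a later call can reuse them.

```python
# Hypothetical sketch of the (output, layer_caches) pattern returned by
# physics_blocks_forward. Names and structure are illustrative stand-ins.

def run_blocks(x, blocks, kv_cache=None):
    """Run a stack of blocks, collecting one cache entry per layer."""
    new_caches = []
    for i, block in enumerate(blocks):
        layer_cache = kv_cache[i] if kv_cache is not None else None
        x, layer_cache = block(x, layer_cache)
        new_caches.append(layer_cache)
    return x, new_caches

def toy_block(x, cache):
    """Toy 'layer': records its input in the cache and doubles the value."""
    cache = (cache or []) + [x]
    return x * 2, cache

blocks = [toy_block, toy_block]
out, caches = run_blocks(1, blocks)              # first call builds fresh caches
out2, caches2 = run_blocks(out, blocks, caches)  # second call reuses them
```

In the real model the cache entries are `LayerCache` objects holding attention keys/values rather than plain lists, but the collect-and-reuse shape of the return value is the same.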
- decoder_blocks_forward(x_physics, physics_token_specs, per_domain_token_specs, decoder_attn_kwargs, condition, kv_cache=None, domain_positions_all=None)¶
Forward pass through the per-domain decoder blocks.
- Returns:
Tuple of (domain_predictions, new_domain_caches).
- Parameters:
x_physics (torch.Tensor)
physics_token_specs (list[noether.core.schemas.modules.attention.TokenSpec])
per_domain_token_specs (dict[str, list[noether.core.schemas.modules.attention.TokenSpec]])
condition (torch.Tensor | None)
kv_cache (ModelKVCache | None)
domain_positions_all (dict[str, torch.Tensor] | None)
- Return type:
- create_rope_frequencies(domain_positions_all, geometry_position=None, geometry_supernode_idx=None)¶
Create RoPE frequencies for all relevant positions.
- Returns:
Tuple of (geometry_attn_kwargs, decoder_attn_kwargs, physics_perceiver_attn_kwargs, physics_attn_kwargs). decoder_attn_kwargs is keyed by domain name.
- Parameters:
domain_positions_all (dict[str, torch.Tensor])
geometry_position (torch.Tensor | None)
geometry_supernode_idx (torch.Tensor | None)
- Return type:
tuple[dict[str, Any], dict[str, dict[str, Any]], dict[str, Any], dict[str, Any]]
- forward(geometry_position=None, geometry_supernode_idx=None, geometry_batch_idx=None, domain_anchor_positions=None, domain_query_positions=None, domain_features=None, conditioning_inputs=None, kv_cache=None)¶
Forward pass of the AB-UPT model.
Example:
```python
model(
    geometry_position=...,
    geometry_supernode_idx=...,
    geometry_batch_idx=...,
    domain_anchor_positions={"surface": surface_pos, "volume": volume_pos},
    domain_query_positions={"surface": query_pos},
    conditioning_inputs={"geometry_design_parameters": design_params},
)
```
- Parameters:
geometry_position (torch.Tensor | None) – Coordinates of the geometry mesh. Tensor of shape (B * N_geometry, D_pos).
geometry_supernode_idx (torch.Tensor | None) – Supernode indices for the geometry points.
geometry_batch_idx (torch.Tensor | None) – Batch indices for the geometry points.
domain_anchor_positions (dict[str, torch.Tensor] | None) – Per-domain anchor positions, e.g. {"surface": (B, N, D), "volume": (B, M, D)}.
domain_query_positions (dict[str, torch.Tensor] | None) – Per-domain query positions (optional).
conditioning_inputs (dict[str, torch.Tensor] | None) – Conditioning tensors, e.g. {"geometry_design_parameters": (B, D)}.
kv_cache (ModelKVCache | None) – KV cache from a previous forward call.
domain_features (dict[str, torch.Tensor] | None)
- Returns:
Tuple of (predictions, kv_cache).
- Return type:
tuple[dict[str, torch.Tensor], ModelKVCache]
- class noether.modeling.models.AeroABUPT(model_config, **kwargs)¶
Bases:
noether.core.models.Model

Aerodynamic Anchored-Branched UPT wrapper.

Bridges the factory’s (config, **kwargs) instantiation pattern to the core model. Converts flat kwargs (surface_anchor_position, volume_anchor_position, …) into the domain-dict format expected by AnchoredBranchedUPT.

Base class for single models, i.e. one model with one optimizer, as opposed to CompositeModel.
- Parameters:
model_config (noether.core.schemas.models.AnchorBranchedUPTConfig) – Model configuration. See ModelBaseConfig for available options.
update_counter – The UpdateCounter provided to the optimizer.
is_frozen – If true, sets requires_grad of all parameters to false and puts the model into eval mode (e.g., so Dropout and BatchNorm behave deterministically).
path_provider – PathProvider used by the initializer to store or retrieve checkpoints.
data_container – DataContainer holding the data and dataloaders. Currently unused, but helpful for quick prototyping, evaluating forward in debug mode, etc.
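The flat-kwargs-to-domain-dict bridging described above can be sketched in plain Python. The suffix names follow the kwargs mentioned in this page (`surface_anchor_position`, `volume_anchor_position`, …); the actual noether conversion logic may differ in detail.

```python
# Sketch of grouping flat "surface_anchor_position"-style kwargs into the
# domain dicts AnchoredBranchedUPT expects. Illustrative, not the real code.

def to_domain_dicts(**kwargs):
    """Group kwargs by domain ("surface", "volume", ...) and role."""
    anchors, queries = {}, {}
    for name, value in kwargs.items():
        if value is None:
            continue  # absent domains are simply skipped
        if name.endswith("_anchor_position"):
            anchors[name.removesuffix("_anchor_position")] = value
        elif name.endswith("_query_position"):
            queries[name.removesuffix("_query_position")] = value
    return anchors, queries

# Strings stand in for tensors here.
anchors, queries = to_domain_dicts(
    surface_anchor_position="surf_anchor",
    volume_anchor_position="vol_anchor",
    surface_query_position="surf_query",
    volume_query_position=None,
)
```

The resulting `anchors` and `queries` dicts are keyed by domain name, matching the `domain_anchor_positions` / `domain_query_positions` arguments of `AnchoredBranchedUPT.forward`.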
- backbone¶
- forward(**kwargs)¶
- Return type:
- class noether.modeling.models.AeroTransformer(model_config, **kwargs)¶
Bases:
noether.core.models.Model

Aerodynamic Transformer wrapper.

End-to-end forward for aero CFD: positional encoding, optional RoPE, optional physics features, surface/volume bias, Transformer backbone, output projection, and output gathering.

Base class for single models, i.e. one model with one optimizer, as opposed to CompositeModel.
- Parameters:
model_config (AeroTransformerConfig) – Model configuration. See ModelBaseConfig for available options.
update_counter – The UpdateCounter provided to the optimizer.
is_frozen – If true, sets requires_grad of all parameters to false and puts the model into eval mode (e.g., so Dropout and BatchNorm behave deterministically).
path_provider – PathProvider used by the initializer to store or retrieve checkpoints.
data_container – DataContainer holding the data and dataloaders. Currently unused, but helpful for quick prototyping, evaluating forward in debug mode, etc.
- data_specs¶
- use_rope¶
- pos_embed¶
- surface_bias¶
- volume_bias¶
- use_physics_features¶
- backbone¶
- norm¶
- out¶
- forward(surface_position, volume_position, surface_features=None, volume_features=None)¶
- Parameters:
surface_position (torch.Tensor)
volume_position (torch.Tensor)
surface_features (torch.Tensor | None)
volume_features (torch.Tensor | None)
- Return type:
- class noether.modeling.models.AeroTransformerConfig(/, **data)¶
Bases:
noether.core.schemas.models.TransformerConfig

Transformer config extended with aerodynamic data specifications.

Create a new model by parsing and validating input data from keyword arguments. Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)
- data_specs: noether.core.schemas.dataset.ModelDataSpecs¶
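The pattern here is a base config extended with one extra required field. A minimal stand-in sketch using stdlib dataclasses (the real classes are pydantic models, and the base-config field names below are hypothetical):

```python
# Stand-in sketch of the config-extension pattern: the aero config is the
# base Transformer config plus a data_specs field. Uses dataclasses instead
# of pydantic; field names other than data_specs are illustrative only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TransformerConfig:            # stand-in for the noether base config
    dim: int = 256
    num_blocks: int = 12

@dataclass
class AeroTransformerConfig(TransformerConfig):
    data_specs: Optional[dict] = None   # ModelDataSpecs in the real schema

cfg = AeroTransformerConfig(dim=128, data_specs={"surface_pressure": 1})
```

The real pydantic version additionally validates the keyword arguments at construction time, raising `pydantic_core.ValidationError` on bad input rather than accepting arbitrary values as a dataclass would.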
- class noether.modeling.models.AeroTransolver(model_config, **kwargs)¶
Bases:
noether.core.models.Model

Aerodynamic Transolver wrapper.

Like AeroTransformer but adds the Transolver-specific learnable placeholder parameter.

Base class for single models, i.e. one model with one optimizer, as opposed to CompositeModel.
- Parameters:
model_config (AeroTransolverConfig) – Model configuration. See ModelBaseConfig for available options.
update_counter – The UpdateCounter provided to the optimizer.
is_frozen – If true, sets requires_grad of all parameters to false and puts the model into eval mode (e.g., so Dropout and BatchNorm behave deterministically).
path_provider – PathProvider used by the initializer to store or retrieve checkpoints.
data_container – DataContainer holding the data and dataloaders. Currently unused, but helpful for quick prototyping, evaluating forward in debug mode, etc.
- data_specs¶
- pos_embed¶
- surface_bias¶
- volume_bias¶
- use_physics_features¶
- placeholder¶
- backbone¶
- norm¶
- out¶
- forward(surface_position, volume_position, surface_features=None, volume_features=None)¶
- Parameters:
surface_position (torch.Tensor)
volume_position (torch.Tensor)
surface_features (torch.Tensor | None)
volume_features (torch.Tensor | None)
- Return type:
- class noether.modeling.models.AeroTransolverConfig(/, **data)¶
Bases:
noether.core.schemas.models.TransolverConfig

Transolver config extended with aerodynamic data specifications.

Create a new model by parsing and validating input data from keyword arguments. Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)
- data_specs: noether.core.schemas.dataset.ModelDataSpecs¶
- class noether.modeling.models.AeroUPT(model_config, **kwargs)¶
Bases:
noether.core.models.Model

Aerodynamic UPT wrapper.

Combines separate surface/volume query positions into the single query_position that the core UPT expects, and splits outputs using ModelDataSpecs. Supports optional surface/volume bias layers and physics feature projection on queries.

Base class for single models, i.e. one model with one optimizer, as opposed to CompositeModel.
- Parameters:
model_config (noether.core.schemas.models.UPTConfig) – Model configuration. See ModelBaseConfig for available options.
update_counter – The UpdateCounter provided to the optimizer.
is_frozen – If true, sets requires_grad of all parameters to false and puts the model into eval mode (e.g., so Dropout and BatchNorm behave deterministically).
path_provider – PathProvider used by the initializer to store or retrieve checkpoints.
data_container – DataContainer holding the data and dataloaders. Currently unused, but helpful for quick prototyping, evaluating forward in debug mode, etc.
- backbone¶
- data_specs¶
- use_bias_layers¶
- use_physics_features¶
- forward(surface_position_batch_idx, surface_position_supernode_idx, surface_position, surface_query_position, volume_query_position, surface_features=None, volume_features=None)¶
- Parameters:
surface_position_batch_idx (torch.Tensor)
surface_position_supernode_idx (torch.Tensor)
surface_position (torch.Tensor)
surface_query_position (torch.Tensor)
volume_query_position (torch.Tensor)
surface_features (torch.Tensor | None)
volume_features (torch.Tensor | None)
- Return type:
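The combine/split behavior described for AeroUPT (concatenating surface and volume queries into one sequence, then splitting the output back by per-domain counts, the role `ModelDataSpecs` plays in the real code) can be sketched with lists standing in for tensors:

```python
# Pure-Python sketch of AeroUPT's combine/split logic. Lists stand in for
# tensors; function names and the counts dict are illustrative stand-ins.

def combine_queries(surface_query, volume_query):
    """Concatenate per-domain queries and remember the per-domain counts."""
    combined = surface_query + volume_query
    counts = {"surface": len(surface_query), "volume": len(volume_query)}
    return combined, counts

def split_outputs(outputs, counts):
    """Split the model output back into per-domain predictions."""
    n_surface = counts["surface"]
    return {"surface": outputs[:n_surface], "volume": outputs[n_surface:]}

combined, counts = combine_queries(["s0", "s1"], ["v0", "v1", "v2"])
# A trivial "model" that maps each query to an output token:
preds = split_outputs([x.upper() for x in combined], counts)
```

In the real wrapper the concatenation happens along the sequence dimension of batched tensors and the split indices come from the configured data specs, but the bookkeeping is the same: record how many tokens each domain contributed, then slice the output accordingly.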
- class noether.modeling.models.Transformer(config)¶
Bases:
torch.nn.Module

Implementation of a Transformer model.
- Parameters:
config (noether.core.schemas.models.TransformerConfig) – Configuration of the Transformer model.
- blocks¶
- forward(x, attn_kwargs)¶
Forward pass of the Transformer model.
- Parameters:
x (torch.Tensor) – Input tensor of shape (batch_size, seq_len, hidden_dim).
attn_kwargs (dict[str, torch.Tensor]) – Additional arguments for the attention mechanism.
- Returns:
Output tensor after processing through the Transformer model.
- Return type:
- class noether.modeling.models.UPT(config)¶
Bases:
torch.nn.Module

Implementation of the UPT (Universal Physics Transformer) model.
- Parameters:
config (noether.core.schemas.models.UPTConfig) – Configuration for the UPT model. See UPTConfig for details.
- use_rope¶
- encoder¶
- pos_embed¶
- approximator_blocks¶
- decoder¶
- norm¶
- prediction_layer¶
- compute_rope_args(geometry_batch_idx, geometry_position, geometry_supernode_idx, query_position)¶
Compute the RoPE frequency arguments for the geometry and query positions. If RoPE is not used, return empty dicts.
- Parameters:
geometry_batch_idx (torch.Tensor)
geometry_position (torch.Tensor)
geometry_supernode_idx (torch.Tensor)
query_position (torch.Tensor)
- Return type:
tuple[dict[str, torch.Tensor], dict[str, torch.Tensor]]
- forward(geometry_batch_idx, geometry_supernode_idx, geometry_position, query_position)¶
Forward pass of the UPT model.
- Parameters:
geometry_batch_idx (torch.Tensor) – Batch indices for the geometry positions.
geometry_supernode_idx (torch.Tensor) – Supernode indices for the geometry positions.
geometry_position (torch.Tensor) – Input coordinates of the geometry mesh points.
query_position (torch.Tensor) – Input coordinates of the query points.
- Returns:
Output tensor containing the predictions at query positions.
- Return type: