noether.modeling.models

Submodules

Classes

AnchoredBranchedUPT

Implementation of the Anchored Branched UPT model, including input embedding and output projection, so it can be used off the shelf by providing the appropriate input tensors.

AeroABUPT

Aerodynamic Anchored-Branched UPT wrapper.

AeroTransformer

Aerodynamic Transformer wrapper.

AeroTransformerConfig

Transformer config extended with aerodynamic data specifications.

AeroTransolver

Aerodynamic Transolver wrapper.

AeroTransolverConfig

Transolver config extended with aerodynamic data specifications.

AeroUPT

Aerodynamic UPT wrapper.

Transformer

Implementation of a Transformer model.

UPT

Implementation of the UPT (Universal Physics Transformer) model.

Package Contents

class noether.modeling.models.AnchoredBranchedUPT(config)

Bases: torch.nn.Module

Implementation of the Anchored Branched UPT model, including input embedding and output projection, so it can be used off the shelf by providing the appropriate input tensors.

Parameters:

config (noether.core.schemas.models.AnchorBranchedUPTConfig) – Configuration for the AB-UPT model. See AnchorBranchedUPTConfig for details.

data_specs
rope
pos_embed
domain_names: list[str]
domain_biases
hidden_dim
physics_blocks
use_geometry_branch = False
domain_decoder_blocks
domain_decoder_projections
geometry_branch_forward(geometry_position, geometry_supernode_idx, geometry_batch_idx, condition, geometry_attn_kwargs)

Forward pass through the geometry branch of the model.

Parameters:
Return type:

torch.Tensor

physics_blocks_forward(domain_positions_all, geometry_encoding, physics_token_specs, physics_attn_kwargs, physics_perceiver_attn_kwargs, condition, kv_cache=None)

Forward pass through the physics blocks of the model.

Parameters:
Return type:

tuple[torch.Tensor, list[LayerCache]]

decoder_blocks_forward(x_physics, physics_token_specs, per_domain_token_specs, decoder_attn_kwargs, condition, kv_cache=None, domain_positions_all=None)

Forward pass through the per-domain decoder blocks.

Returns:

Tuple of (domain_predictions, new_domain_caches).

Parameters:
Return type:

tuple[dict[str, torch.Tensor], dict[str, list[LayerCache]]]

create_rope_frequencies(domain_positions_all, geometry_position=None, geometry_supernode_idx=None)

Create RoPE frequencies for all relevant positions.

Returns:

Tuple of (geometry_attn_kwargs, decoder_attn_kwargs, physics_perceiver_attn_kwargs, physics_attn_kwargs). decoder_attn_kwargs is keyed by domain name.

Parameters:
Return type:

tuple[dict[str, Any], dict[str, dict[str, Any]], dict[str, Any], dict[str, Any]]
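The attention kwargs returned here carry per-position rotary embedding (RoPE) frequencies. The library's exact packing is internal, but the underlying computation is the standard RoPE recipe; a minimal sketch in plain PyTorch (the names `rope_frequencies` and `inv_freq` are illustrative, not part of the library's API):

```python
import math

import torch


def rope_frequencies(positions: torch.Tensor, head_dim: int, base: float = 10000.0) -> torch.Tensor:
    """Standard RoPE angles for 1D positions.

    positions: (..., N) float tensor of coordinates or indices.
    Returns angles of shape (..., N, head_dim // 2).
    """
    # Geometric progression of inverse frequencies, one per rotated channel pair.
    inv_freq = torch.exp(
        -math.log(base) * torch.arange(0, head_dim, 2, dtype=torch.float32) / head_dim
    )
    # Broadcast positions against frequencies: angle = position * inv_freq.
    return positions[..., None] * inv_freq


angles = rope_frequencies(torch.arange(8, dtype=torch.float32), head_dim=16)
# The cos/sin of these angles would then rotate query/key channel pairs in attention.
```

For mesh data the positions are continuous coordinates rather than integer sequence indices, but the frequency construction is the same per coordinate axis.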

forward(geometry_position=None, geometry_supernode_idx=None, geometry_batch_idx=None, domain_anchor_positions=None, domain_query_positions=None, domain_features=None, conditioning_inputs=None, kv_cache=None)

Forward pass of the AB-UPT model.

Example:

model(
    geometry_position=...,
    geometry_supernode_idx=...,
    geometry_batch_idx=...,
    domain_anchor_positions={"surface": surface_pos, "volume": volume_pos},
    domain_query_positions={"surface": query_pos},
    conditioning_inputs={"geometry_design_parameters": design_params},
)

Parameters:
  • geometry_position (torch.Tensor | None) – Coordinates of the geometry mesh. Tensor of shape (B * N_geometry, D_pos).

  • geometry_supernode_idx (torch.Tensor | None) – Supernode indices for the geometry points.

  • geometry_batch_idx (torch.Tensor | None) – Batch indices for the geometry points.

  • domain_anchor_positions (dict[str, torch.Tensor] | None) – Per-domain anchor positions, e.g. {"surface": (B, N, D), "volume": (B, M, D)}.

  • domain_query_positions (dict[str, torch.Tensor] | None) – Per-domain query positions (optional).

  • conditioning_inputs (dict[str, torch.Tensor] | None) – Conditioning tensors, e.g. {"geometry_design_parameters": (B, D)}.

  • kv_cache (ModelKVCache | None) – KV cache from a previous forward call.

  • domain_features (dict[str, torch.Tensor] | None)

Returns:

Tuple of (predictions, kv_cache).

Return type:

tuple[dict[str, torch.Tensor], ModelKVCache]
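The geometry tensors are flattened across the batch rather than padded: each point carries a batch index, and the supernode indices select a subset of the flattened points. A small sketch of how these index tensors line up, assuming two meshes of unequal size (the tensor values are illustrative):

```python
import torch

# Two meshes with 5 and 3 points, flattened into one (B * N_geometry, D_pos) tensor.
geometry_position = torch.randn(5 + 3, 3)

# geometry_batch_idx assigns each flattened point to its batch element.
geometry_batch_idx = torch.tensor([0, 0, 0, 0, 0, 1, 1, 1])

# geometry_supernode_idx indexes into the flattened points, selecting the
# subset used as supernodes (here 2 per mesh).
geometry_supernode_idx = torch.tensor([0, 3, 5, 7])

# Gathering by supernode index yields the supernode coordinates.
supernode_positions = geometry_position[geometry_supernode_idx]  # (4, 3)
```

Note that the domain anchor and query positions are batched dense tensors of shape (B, N, D), while the geometry tensors use this flattened-plus-index layout.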

class noether.modeling.models.AeroABUPT(model_config, **kwargs)

Bases: noether.core.models.Model

Aerodynamic Anchored-Branched UPT wrapper.

Bridges the factory’s (config, **kwargs) instantiation pattern to the core model. Converts flat kwargs (surface_anchor_position, volume_anchor_position, …) into the domain-dict format expected by AnchoredBranchedUPT.
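The kind of bridging described above can be sketched in plain Python. This is an illustrative sketch only: the real AeroABUPT defines the exact key names and any extra handling, and `to_domain_dicts` is a hypothetical helper, not the library's API.

```python
def to_domain_dicts(**kwargs):
    """Group flat aero kwargs such as 'surface_anchor_position' into the
    domain-dict layout that AnchoredBranchedUPT.forward expects.
    """
    anchors, queries = {}, {}
    for key, value in kwargs.items():
        # "surface_anchor_position" -> domain "surface", role "anchor_position"
        domain, _, role = key.partition("_")
        if role == "anchor_position":
            anchors[domain] = value
        elif role == "query_position":
            queries[domain] = value
    return {"domain_anchor_positions": anchors, "domain_query_positions": queries}


bridged = to_domain_dicts(
    surface_anchor_position="surf_anchors",
    volume_anchor_position="vol_anchors",
    surface_query_position="surf_queries",
)
```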

Base class for single models, i.e. one model with one optimizer as opposed to CompositeModel.

Parameters:
  • model_config (noether.core.schemas.models.AnchorBranchedUPTConfig) – Model configuration. See ModelBaseConfig for available options.

  • update_counter – The UpdateCounter provided to the optimizer.

  • is_frozen – If true, will set requires_grad of all parameters to false. Will also put the model into eval mode (e.g., to put Dropout or BatchNorm into eval mode).

  • path_provider – PathProvider used by the initializer to store or retrieve checkpoints.

  • data_container – DataContainer which includes the data and dataloader. This is currently unused, but helpful for quick prototyping, e.g. evaluating forward in debug mode.

backbone
forward(**kwargs)
Return type:

dict[str, torch.Tensor]

class noether.modeling.models.AeroTransformer(model_config, **kwargs)

Bases: noether.core.models.Model

Aerodynamic Transformer wrapper.

End-to-end forward for aero CFD: positional encoding, optional RoPE, optional physics features, surface/volume bias, Transformer backbone, output projection, and output gathering.

Base class for single models, i.e. one model with one optimizer as opposed to CompositeModel.

Parameters:
  • model_config (AeroTransformerConfig) – Model configuration. See ModelBaseConfig for available options.

  • update_counter – The UpdateCounter provided to the optimizer.

  • is_frozen – If true, will set requires_grad of all parameters to false. Will also put the model into eval mode (e.g., to put Dropout or BatchNorm into eval mode).

  • path_provider – PathProvider used by the initializer to store or retrieve checkpoints.

  • data_container – DataContainer which includes the data and dataloader. This is currently unused, but helpful for quick prototyping, e.g. evaluating forward in debug mode.

data_specs
use_rope
pos_embed
surface_bias
volume_bias
use_physics_features
backbone
norm
out
forward(surface_position, volume_position, surface_features=None, volume_features=None)
Parameters:
Return type:

dict[str, torch.Tensor]

class noether.modeling.models.AeroTransformerConfig(/, **data)

Bases: noether.core.schemas.models.TransformerConfig

Transformer config extended with aerodynamic data specifications.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

data_specs: noether.core.schemas.dataset.ModelDataSpecs

class noether.modeling.models.AeroTransolver(model_config, **kwargs)

Bases: noether.core.models.Model

Aerodynamic Transolver wrapper.

Like AeroTransformer but adds the Transolver-specific learnable placeholder parameter.

Base class for single models, i.e. one model with one optimizer as opposed to CompositeModel.

Parameters:
  • model_config (AeroTransolverConfig) – Model configuration. See ModelBaseConfig for available options.

  • update_counter – The UpdateCounter provided to the optimizer.

  • is_frozen – If true, will set requires_grad of all parameters to false. Will also put the model into eval mode (e.g., to put Dropout or BatchNorm into eval mode).

  • path_provider – PathProvider used by the initializer to store or retrieve checkpoints.

  • data_container – DataContainer which includes the data and dataloader. This is currently unused, but helpful for quick prototyping, e.g. evaluating forward in debug mode.

data_specs
pos_embed
surface_bias
volume_bias
use_physics_features
placeholder
backbone
norm
out
forward(surface_position, volume_position, surface_features=None, volume_features=None)
Parameters:
Return type:

dict[str, torch.Tensor]

class noether.modeling.models.AeroTransolverConfig(/, **data)

Bases: noether.core.schemas.models.TransolverConfig

Transolver config extended with aerodynamic data specifications.

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Parameters:

data (Any)

data_specs: noether.core.schemas.dataset.ModelDataSpecs

class noether.modeling.models.AeroUPT(model_config, **kwargs)

Bases: noether.core.models.Model

Aerodynamic UPT wrapper.

Combines separate surface/volume query positions into the single query_position that the core UPT expects, and splits outputs using ModelDataSpecs. Supports optional surface/volume bias layers and physics feature projection on queries.
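The combine-then-split bridging described above can be sketched in plain PyTorch (shapes and the stand-in `predictions` tensor are illustrative; the real split widths come from ModelDataSpecs):

```python
import torch

# Separate surface and volume query positions, batched as (B, N, D).
surface_query_position = torch.randn(2, 10, 3)
volume_query_position = torch.randn(2, 6, 3)

# Concatenate along the sequence axis into the single query_position
# that the core UPT expects.
query_position = torch.cat([surface_query_position, volume_query_position], dim=1)

# Stand-in for the core UPT output at all 16 query points.
predictions = torch.randn(2, 16, 4)

# Split the predictions back into per-domain tensors by query count.
surface_pred, volume_pred = predictions.split(
    [surface_query_position.shape[1], volume_query_position.shape[1]], dim=1
)
```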

Base class for single models, i.e. one model with one optimizer as opposed to CompositeModel.

Parameters:
  • model_config (noether.core.schemas.models.UPTConfig) – Model configuration. See ModelBaseConfig for available options.

  • update_counter – The UpdateCounter provided to the optimizer.

  • is_frozen – If true, will set requires_grad of all parameters to false. Will also put the model into eval mode (e.g., to put Dropout or BatchNorm into eval mode).

  • path_provider – PathProvider used by the initializer to store or retrieve checkpoints.

  • data_container – DataContainer which includes the data and dataloader. This is currently unused, but helpful for quick prototyping, e.g. evaluating forward in debug mode.

backbone
data_specs
use_bias_layers
use_physics_features
forward(surface_position_batch_idx, surface_position_supernode_idx, surface_position, surface_query_position, volume_query_position, surface_features=None, volume_features=None)
Parameters:
Return type:

dict[str, torch.Tensor]

class noether.modeling.models.Transformer(config)

Bases: torch.nn.Module

Implementation of a Transformer model.

Parameters:

config (noether.core.schemas.models.TransformerConfig) – Configuration of the Transformer model.

blocks
forward(x, attn_kwargs)

Forward pass of the Transformer model.

Parameters:
  • x (torch.Tensor) – Input tensor of shape (batch_size, seq_len, hidden_dim).

  • attn_kwargs (dict[str, torch.Tensor]) – Additional arguments for the attention mechanism.

Returns:

Output tensor after processing through the Transformer model.

Return type:

torch.Tensor

class noether.modeling.models.UPT(config)

Bases: torch.nn.Module

Implementation of the UPT (Universal Physics Transformer) model.

Parameters:

config (noether.core.schemas.models.UPTConfig) – Configuration for the UPT model. See UPTConfig for details.

use_rope
encoder
pos_embed
approximator_blocks
decoder
norm
prediction_layer
compute_rope_args(geometry_batch_idx, geometry_position, geometry_supernode_idx, query_position)

Compute the RoPE frequency arguments for the geometry and query positions. If RoPE is not used, return empty dicts.

Parameters:
Return type:

tuple[dict[str, torch.Tensor], dict[str, torch.Tensor]]

forward(geometry_batch_idx, geometry_supernode_idx, geometry_position, query_position)

Forward pass of the UPT model.

Parameters:
  • geometry_batch_idx (torch.Tensor) – Batch indices for the geometry positions.

  • geometry_supernode_idx (torch.Tensor) – Supernode indices for the geometry positions.

  • geometry_position (torch.Tensor) – Input coordinates of the geometry mesh points.

  • query_position (torch.Tensor) – Input coordinates of the query points.

Returns:

Output tensor containing the predictions at query positions.

Return type:

torch.Tensor
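The UPT encoder aggregates the variable-sized point cloud onto a fixed set of supernodes. One common way such pooling is implemented is a scatter-mean over a point-to-supernode assignment; this is a sketch of that pattern, not necessarily the library's exact implementation:

```python
import torch

num_points, num_supernodes, dim = 8, 3, 4
features = torch.randn(num_points, dim)

# Each mesh point is assigned to one supernode (assignment is illustrative).
assignment = torch.tensor([0, 0, 1, 1, 1, 2, 2, 2])

# Scatter-sum point features onto their supernodes ...
pooled = torch.zeros(num_supernodes, dim).index_add_(0, assignment, features)
# ... then normalize by the number of points per supernode to get a mean.
counts = torch.zeros(num_supernodes).index_add_(0, assignment, torch.ones(num_points))
pooled = pooled / counts[:, None]
```

The pooled supernode tokens then form a fixed-length sequence for the approximator blocks, which is what makes the model independent of the input mesh resolution.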