noether.core.schemas.modules¶
Submodules¶
Classes¶
- AttentionConfig: Configuration for an attention module.
- CrossAnchorAttentionConfig: Configuration for Cross Anchor Attention module.
- DotProductAttentionConfig: Configuration for the Dot Product attention module.
- IrregularNatAttentionConfig: Configuration for the Irregular Neighbourhood Attention Transformer (NAT) attention module.
- JointAnchorAttentionConfig: Configuration for Joint Anchor Attention module.
- MixedAttentionConfig: Configuration for Mixed Attention module.
- MultiBranchAnchorAttentionConfig: Configuration for Multi-Branch Anchor Attention module.
- PerceiverAttentionConfig: Configuration for the Perceiver attention module.
- TransolverAttentionConfig: Configuration for the Transolver attention module.
- TransolverPlusPlusAttentionConfig: Configuration for the Transolver++ attention module.
- PerceiverBlockConfig: Configuration for the PerceiverBlock module.
- TransformerBlockConfig: Configuration for a transformer block.
- DeepPerceiverDecoderConfig: Configuration for the DeepPerceiverDecoder module.
- SupernodePoolingConfig
- ContinuousSincosEmbeddingConfig: Configuration for Continuous Sine-Cosine Embedding layer.
- LayerScaleConfig: Configuration for Layer Scale module.
- LinearProjectionConfig: Configuration for a LinearProjection layer.
- RopeFrequencyConfig: Configuration for RoPE frequency settings.
- UnquantizedDropPathConfig: Configuration for the UnquantizedDropPath layer.
- MLPConfig
- UpActDownMLPConfig
Package Contents¶
- class noether.core.schemas.modules.AttentionConfig(/, **data)¶
Bases: pydantic.BaseModel
Configuration for an attention module. Since we can have many different attention implementations, we allow extra fields so that we can use the same schema for all attention modules.
Create a new model by parsing and validating input data from keyword arguments.
Raises [ValidationError][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- Parameters:
data (Any)
- model_config¶
Configuration for an attention module.
Dimensionality of the hidden features.
- init_weights: noether.core.types.InitWeightsMode = None¶
Weight initialization strategy.
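Since extra fields are allowed, an AttentionConfig can carry implementation-specific options without schema changes. A minimal plain-Python sketch of that behaviour (not the real pydantic model; the field names `hidden_dim` and `init_weights` are assumed from the field docs above):

```python
# Illustrative sketch of the "extra fields allowed" behaviour, not the real
# pydantic model. hidden_dim / init_weights mirror the documented fields.
class AttentionConfigSketch:
    def __init__(self, **data):
        self.hidden_dim = data.pop("hidden_dim", None)
        self.init_weights = data.pop("init_weights", None)
        # Unknown keys are kept verbatim ("extra" fields), so one schema
        # can serve all attention implementations.
        for key, value in data.items():
            setattr(self, key, value)

cfg = AttentionConfigSketch(hidden_dim=256, num_heads=8)  # num_heads is "extra"
```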
- class noether.core.schemas.modules.CrossAnchorAttentionConfig(/, **data)¶
Bases: MultiBranchAnchorAttentionConfig
Configuration for Cross Anchor Attention module.
- Parameters:
data (Any)
- class noether.core.schemas.modules.DotProductAttentionConfig(/, **data)¶
Bases: AttentionConfig
Configuration for the Dot Product attention module.
- Parameters:
data (Any)
- class noether.core.schemas.modules.IrregularNatAttentionConfig(/, **data)¶
Bases: AttentionConfig
Configuration for the Irregular Neighbourhood Attention Transformer (NAT) attention module.
- Parameters:
data (Any)
Hidden dimensionality of the relative position bias MLP.
- class noether.core.schemas.modules.JointAnchorAttentionConfig(/, **data)¶
Bases: MultiBranchAnchorAttentionConfig
Configuration for Joint Anchor Attention module.
- Parameters:
data (Any)
- class noether.core.schemas.modules.MixedAttentionConfig(/, **data)¶
Bases: DotProductAttentionConfig
Configuration for Mixed Attention module.
- Parameters:
data (Any)
- class noether.core.schemas.modules.MultiBranchAnchorAttentionConfig(/, **data)¶
Bases: AttentionConfig
Configuration for Multi-Branch Anchor Attention module.
- Parameters:
data (Any)
- class noether.core.schemas.modules.PerceiverAttentionConfig(/, **data)¶
Bases: AttentionConfig
Configuration for the Perceiver attention module.
- Parameters:
data (Any)
- set_kv_dim()¶
- class noether.core.schemas.modules.TransolverAttentionConfig(/, **data)¶
Bases: AttentionConfig
Configuration for the Transolver attention module.
- Parameters:
data (Any)
- class noether.core.schemas.modules.TransolverPlusPlusAttentionConfig(/, **data)¶
Bases: TransolverAttentionConfig
Configuration for the Transolver++ attention module.
- Parameters:
data (Any)
- use_overparameterization: bool = None¶
Whether to use overparameterization for the slice projection.
- use_adaptive_temperature: bool = None¶
Whether to use an adaptive temperature for the slice selection.
- class noether.core.schemas.modules.PerceiverBlockConfig(/, **data)¶
Bases: TransformerBlockConfig
Configuration for the PerceiverBlock module.
- Parameters:
data (Any)
- kv_dim: int | None = None¶
Dimensionality of the key and value representations. Defaults to None. If None, hidden_dim is used.
- set_kv_dim()¶
Set kv_dim to hidden_dim if not provided.
- Return type:
- perceiver_attention_config()¶
- modulation_linear_projection_config()¶
- Return type:
noether.core.schemas.modules.layers.LinearProjectionConfig | None
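The kv_dim fallback performed by set_kv_dim can be sketched as follows (an illustrative standalone helper, not the library method):

```python
def set_kv_dim(kv_dim, hidden_dim):
    # Documented fallback: kv_dim defaults to hidden_dim when not provided.
    return hidden_dim if kv_dim is None else kv_dim
```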
- class noether.core.schemas.modules.TransformerBlockConfig(/, **data)¶
Bases: pydantic.BaseModel
Configuration for a transformer block.
- Parameters:
data (Any)
- hidden_dim: int = None¶
Hidden dimension of the transformer block.
- mlp_hidden_dim: int | None = None¶
Hidden dimension of the MLP layer. If set to None, mlp_hidden_dim is set to hidden_dim * mlp_expansion_factor in the TransformerConfig. If both are None, an error is raised.
- mlp_expansion_factor: int | None = None¶
Expansion factor for the MLP hidden dimension relative to the hidden dimension. If ‘mlp_hidden_dim’ is not set, this factor is used to compute it as hidden_dim * mlp_expansion_factor.
- attention_constructor: Literal['dot_product', 'perceiver', 'transolver', 'transolver_plusplus'] = 'dot_product'¶
Constructor of the attention module. Defaults to ‘dot_product’.
- condition_dim: int | None = None¶
Dimension of the conditioning vector. If none, no conditioning is applied. If provided, the transformer block will turn into a Diffusion Transformer (DiT) block.
- init_weights: noether.core.types.InitWeightsMode = None¶
Initialization method for the weight matrices of the network. Defaults to “truncnormal002”.
- attention_arguments: dict¶
Additional arguments for the attention module that are only needed for a specific attention implementation.
- linear_projection_config()¶
- layerscale_config()¶
- Return type:
- drop_path_config()¶
- modulation_linear_projection_config()¶
- Return type:
LinearProjectionConfig | None
- up_act_down_mlp_config()¶
- Return type:
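The mlp_hidden_dim / mlp_expansion_factor fallback described above can be sketched as (hypothetical helper name; the real resolution happens inside the config validation):

```python
def resolve_mlp_hidden_dim(hidden_dim, mlp_hidden_dim=None, mlp_expansion_factor=None):
    # If mlp_hidden_dim is set, it wins; otherwise derive it from the
    # expansion factor; if both are None, the configuration is invalid.
    if mlp_hidden_dim is not None:
        return mlp_hidden_dim
    if mlp_expansion_factor is not None:
        return hidden_dim * mlp_expansion_factor
    raise ValueError("either mlp_hidden_dim or mlp_expansion_factor must be set")
```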
- class noether.core.schemas.modules.DeepPerceiverDecoderConfig(/, **data)¶
Bases: noether.core.schemas.mixins.InjectSharedFieldFromParentMixin, pydantic.BaseModel
Configuration for the DeepPerceiverDecoder module.
- Parameters:
data (Any)
- perceiver_block_config: Annotated[noether.core.schemas.modules.blocks.PerceiverBlockConfig, noether.core.schemas.mixins.Shared] = None¶
Configuration for the Perceiver blocks used in the decoder.
- class noether.core.schemas.modules.SupernodePoolingConfig(/, **data)¶
Bases: pydantic.BaseModel
- Parameters:
data (Any)
- hidden_dim: int = None¶
Hidden dimension for positional embeddings, messages and the resulting output vector.
- input_dim: int = None¶
Number of positional dimensions (e.g., input_dim=2 for a 2D position, input_dim=3 for a 3D position).
- radius: float | None = None¶
Radius around each supernode. From points within this radius, messages are passed to the supernode.
- k: int | None = None¶
Number of neighbors for each supernode. From the k-NN points, messages are passed to the supernode.
- spool_pos_mode: Literal['abspos', 'relpos', 'absrelpos'] = None¶
Type of position embedding: absolute space (“abspos”), relative space (“relpos”) or both (“absrelpos”).
- init_weights: noether.core.types.InitWeightsMode = None¶
Weight initialization of linear layers. Defaults to “truncnormal002”.
- readd_supernode_pos: bool = None¶
If true, the absolute positional encoding of the supernode is concatenated to the supernode vector after message passing and linearly projected back to hidden_dim. Defaults to True.
- aggregation: Literal['mean', 'sum'] = None¶
Aggregation for message passing (“mean” or “sum”).
- message_mode: Literal['mlp', 'linear', 'identity'] = None¶
How messages are created. “mlp” (2 layer MLP), “linear” (nn.Linear), “identity” (nn.Identity). Defaults to “mlp”.
- input_features_dim: int | None = None¶
Number of input features per point. If None, a variant without input features is used. Defaults to None.
- validate_radius_and_k()¶
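validate_radius_and_k is undocumented above; assuming it enforces that the neighbourhood is defined by exactly one of radius or k (an assumption, the reference only names the validator), a sketch could look like:

```python
def validate_radius_and_k(radius, k):
    # Assumed rule: supernode neighbourhoods come either from a radius query
    # or from k-NN, so exactly one of the two must be set.
    if (radius is None) == (k is None):
        raise ValueError("exactly one of 'radius' or 'k' must be provided")
    return radius, k
```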
- class noether.core.schemas.modules.ContinuousSincosEmbeddingConfig(/, **data)¶
Bases: pydantic.BaseModel
Configuration for Continuous Sine-Cosine Embedding layer.
- Parameters:
data (Any)
Dimensionality of the output embedding.
- class noether.core.schemas.modules.LayerScaleConfig(/, **data)¶
Bases: pydantic.BaseModel
Configuration for Layer Scale module.
- Parameters:
data (Any)
Number of dimensions of the input tensor to be scaled.
- class noether.core.schemas.modules.LinearProjectionConfig(/, **data)¶
Bases: pydantic.BaseModel
Configuration for a LinearProjection layer.
- Parameters:
data (Any)
- ndim: None | int = None¶
Number of dimensions of the input domain. Either None (Linear projection), 1D (sequence), 2D, or 3D. Defaults to None.
- optional: bool = None¶
If true and input_dim==output_dim (i.e., there is no up/down projection), then the identity mapping is used. Defaults to False.
- init_weights: noether.core.types.InitWeightsMode = None¶
Initialization method of the weights of the MLP. Options are ‘torch’ (i.e., similar to the module) or ‘truncnormal002’, or ‘zero’. Defaults to ‘torch’.
- validate_ndim()¶
Validate the ndim field to ensure it is either None, 1, 2, or 3.
- Return type:
Self
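The validate_ndim check described above amounts to the following (illustrative standalone helper, not the library validator):

```python
def validate_ndim(ndim):
    # ndim must be None (plain linear projection) or 1, 2, 3 (sequence,
    # 2D, or 3D input domain), as documented above.
    if ndim not in (None, 1, 2, 3):
        raise ValueError(f"ndim must be None, 1, 2 or 3, got {ndim}")
    return ndim
```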
- class noether.core.schemas.modules.RopeFrequencyConfig(/, **data)¶
Bases: pydantic.BaseModel
Configuration for RoPE frequency settings.
- Parameters:
data (Any)
Dimensionality of frequencies (in transformers this should be the head dimension).
- input_dim: int = None¶
Dimensionality of the coordinates (e.g., 2 for 2D coordinates, 3 for 3D coordinates).
- max_wavelength: int = None¶
Theta parameter for the transformer sine/cosine embedding. Defaults to 10000.0.
- implementation: Literal['real', 'complex'] = None¶
“real” -> basic implementation using real coordinates (this is slow and only here for backward compatibility). “complex” -> fast implementation of rotation via complex multiplication. Default: “real”.
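With dim frequencies and the theta parameter max_wavelength, the standard RoPE frequency schedule that these settings parameterize can be sketched as follows (the “real” and “complex” implementations differ in how the rotation is applied, not in the frequencies; this helper is illustrative, not library code):

```python
def rope_inverse_frequencies(dim, max_wavelength=10000.0):
    # One frequency per coordinate pair, geometrically spaced from 1 down to
    # roughly 1/max_wavelength -- the usual RoPE / sine-cosine schedule.
    return [1.0 / (max_wavelength ** (2 * i / dim)) for i in range(dim // 2)]

freqs = rope_inverse_frequencies(8)
```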
- class noether.core.schemas.modules.UnquantizedDropPathConfig(/, **data)¶
Bases: pydantic.BaseModel
Configuration for the UnquantizedDropPath layer.
- Parameters:
data (Any)
- class noether.core.schemas.modules.MLPConfig(/, **data)¶
Bases: pydantic.BaseModel
- Parameters:
data (Any)
- hidden_dim: int = None¶
Hidden dimension for each layer.
- num_layers: int = None¶
Number of hidden layers in the MLP. If 0, the MLP consists of two linear layers: input_dim to hidden_dim, an activation, then hidden_dim to output_dim.
- activation: Literal['RELU', 'GELU', 'SIGMOID', 'TANH', 'LEAKY_RELU', 'SOFTPLUS', 'ELU', 'SILU'] = 'GELU'¶
Activation function to use between layers.
- init_weights: noether.core.types.InitWeightsMode = 'truncnormal002'¶
Weight initialization method.
- class noether.core.schemas.modules.UpActDownMLPConfig(/, **data)¶
Bases: pydantic.BaseModel
- Parameters:
data (Any)
- hidden_dim: int = None¶
Hidden dimension of the MLP.
- init_weights: noether.core.types.InitWeightsMode = None¶
Initialization method of the weights of the MLP. Options are ‘torch’ (i.e., similar to the module) or ‘truncnormal002’. Defaults to ‘truncnormal002’.
- check_dims()¶
Validator to check that hidden_dim is greater than input_dim.
- Raises:
ValueError – raised if hidden_dim is not greater than input_dim.
- Return type:
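The check_dims validator enforces the documented expand-then-contract constraint; a standalone sketch (not the library validator):

```python
def check_dims(input_dim, hidden_dim):
    # Documented rule: the "up" projection must actually expand, so
    # hidden_dim has to be strictly greater than input_dim.
    if hidden_dim <= input_dim:
        raise ValueError("hidden_dim must be greater than input_dim")
    return input_dim, hidden_dim
```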