noether.core.schemas.normalizers¶
Attributes¶
Classes¶
Functions¶
Module Contents¶
- noether.core.schemas.normalizers.validate_tensor(v)¶
- Parameters:
v (Any)
- Return type:
torch.Tensor
- noether.core.schemas.normalizers.TorchTensor¶
- noether.core.schemas.normalizers.FloatOrArray¶
- noether.core.schemas.normalizers.SequenceOrTensor¶
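The signature of validate_tensor (Any in, torch.Tensor out) suggests it coerces floats and sequences into tensors before the configs below store them. The body below is a sketch of that assumed behavior, not the module's actual implementation:

```python
from typing import Any

import torch


def validate_tensor(v: Any) -> torch.Tensor:
    """Coerce a float, sequence, or tensor into a torch.Tensor (sketch)."""
    if isinstance(v, torch.Tensor):
        return v
    # Floats and sequences of floats both convert via torch.tensor.
    return torch.tensor(v, dtype=torch.float32)


t = validate_tensor([1.0, 2.0, 3.0])  # → tensor([1., 2., 3.])
```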
- class noether.core.schemas.normalizers.MeanStdNormalizerConfig(/, **data)¶
Bases:
pydantic.BaseModel
- Parameters:
data (Any)
- mean: TorchTensor¶
Mean to subtract from the input data. Can be a single value, or a sequence to apply a different mean per dimension.
- std: TorchTensor¶
Standard deviation to divide the input data by. Can be a single value, or a sequence to apply a different std per dimension.
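The config only holds the statistics; applying them is plain broadcasting. A minimal sketch, assuming pydantic v2 and a hypothetical normalize helper (not part of the module):

```python
import torch
from pydantic import BaseModel, ConfigDict


class MeanStdNormalizerConfig(BaseModel):
    # Sketch of the documented fields; the real class coerces inputs
    # into tensors via validate_tensor.
    model_config = ConfigDict(arbitrary_types_allowed=True)
    mean: torch.Tensor
    std: torch.Tensor


def normalize(x: torch.Tensor, cfg: MeanStdNormalizerConfig) -> torch.Tensor:
    # 1-D mean/std broadcast across the last dimension, giving a
    # per-dimension normalization.
    return (x - cfg.mean) / cfg.std


cfg = MeanStdNormalizerConfig(mean=torch.tensor([1.0, 2.0]),
                              std=torch.tensor([2.0, 4.0]))
out = normalize(torch.tensor([[3.0, 10.0]]), cfg)  # → [[1.0, 2.0]]
```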
- class noether.core.schemas.normalizers.PositionNormalizerConfig(/, **data)¶
Bases:
pydantic.BaseModel
- Parameters:
data (Any)
- raw_pos_min: TorchTensor¶
Minimum raw position values of the entire simulation mesh. Can be a single value or a sequence of values.
- raw_pos_max: TorchTensor¶
Maximum raw position values of the entire simulation mesh. Can be a single value or a sequence of values.
- scale: float = None¶
Scaling factor; coordinates are scaled linearly into the range [0, scale]. Defaults to 1000.
- check_min_max()¶
- Return type:
Self
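Reading the fields above together, the normalization maps raw coordinates linearly from [raw_pos_min, raw_pos_max] onto [0, scale]. A sketch under that reading (the normalize_positions function is hypothetical):

```python
import torch


def normalize_positions(pos: torch.Tensor,
                        raw_pos_min: torch.Tensor,
                        raw_pos_max: torch.Tensor,
                        scale: float = 1000.0) -> torch.Tensor:
    # Linear map from [raw_pos_min, raw_pos_max] to [0, scale],
    # broadcast per coordinate dimension.
    return (pos - raw_pos_min) / (raw_pos_max - raw_pos_min) * scale


pos = torch.tensor([[0.0, 5.0], [10.0, 10.0]])
out = normalize_positions(pos, torch.tensor(0.0), torch.tensor(10.0))
# → [[0., 500.], [1000., 1000.]]
```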
- class noether.core.schemas.normalizers.ShiftAndScaleNormalizerConfig(/, **data)¶
Bases:
pydantic.BaseModel
- Parameters:
data (Any)
- shift: TorchTensor¶
Value to subtract from the input data. Can be a single value, or a sequence to apply a different shift per dimension. Assumed to be in log scale if logscale is True.
- scale: TorchTensor¶
Value to divide the input data by. Can be a single value, or a sequence to apply a different scale per dimension. Assumed to be in log scale if logscale is True.
- check_shift_scale()¶
- Return type:
Self
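The shift/scale docstrings mention a logscale flag that is not listed among the attributes above; the sketch below includes it as an assumption, applying log before the shift and scale (which are then taken as statistics of log(x)):

```python
import torch


def shift_and_scale(x: torch.Tensor,
                    shift: torch.Tensor,
                    scale: torch.Tensor,
                    logscale: bool = False) -> torch.Tensor:
    # Optionally move to log space first; shift and scale are then
    # assumed to be given in log scale, per the docstrings.
    if logscale:
        x = torch.log(x)
    return (x - shift) / scale


out = shift_and_scale(torch.tensor([4.0, 8.0]),
                      shift=torch.tensor(2.0),
                      scale=torch.tensor(2.0))
# → [1.0, 3.0]
```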
- noether.core.schemas.normalizers.AnyNormalizer¶