noether.core.schemas.callbacks¶
Attributes¶
Classes¶
Module Contents¶
- class noether.core.schemas.callbacks.CallBackBaseConfig(/, **data)¶
Bases:
pydantic.BaseModel
- Parameters:
data (Any)
- every_n_epochs: int | None = None¶
Epoch-based interval. Invokes the callback after every n epochs. Mutually exclusive with other intervals.
- every_n_updates: int | None = None¶
Update-based interval. Invokes the callback after every n updates. Mutually exclusive with other intervals.
- every_n_samples: int | None = None¶
Sample-based interval. Invokes the callback after every n samples. Mutually exclusive with other intervals.
- batch_size: int | None = None¶
Batch size to use for this callback. Default: None (use the same batch_size as for training).
- model_config¶
Configuration for the model, should be a dictionary conforming to pydantic.config.ConfigDict.
- validate_callback_frequency()¶
Ensures that exactly one frequency (‘every_n_*’) is specified and that ‘batch_size’ is present if ‘every_n_samples’ is used.
- Return type:
- classmethod check_positive_values(v)¶
Ensures that all integer-based frequency and batch size fields are positive.
- class noether.core.schemas.callbacks.BestCheckpointCallbackConfig(/, **data)¶
Bases:
CallBackBaseConfig
- Parameters:
data (Any)
- name: Literal['BestCheckpointCallback'] = None¶
- tolerances: list[int] | None = None¶
If provided, this callback produces multiple best checkpoints that differ in how many intervals the metric is allowed to go without improving. For example, tolerances=[5] with every_n_epochs=1 stores a checkpoint where at most 5 epochs passed before the metric improved again. Additionally, the best checkpoint over the whole training run is always stored (i.e., tolerance=infinite). By setting different tolerances, one can evaluate several early-stopping configurations in a single training run.
- class noether.core.schemas.callbacks.CheckpointCallbackConfig(/, **data)¶
Bases:
CallBackBaseConfig
- Parameters:
data (Any)
- name: Literal['CheckpointCallback'] = None¶
- save_latest_weights: bool = None¶
Whether to save the latest weights of the model. Note that the latest weights are always overwritten on the next invocation of this callback.
- class noether.core.schemas.callbacks.EmaCallbackConfig(/, **data)¶
Bases:
CallBackBaseConfig
- Parameters:
data (Any)
- name: Literal['EmaCallback'] = None¶
- model_paths: list[str | None] | None = None¶
The paths of the models to apply the EMA to (e.g., composite_model.encoder/composite_model.decoder, the paths of the PyTorch nn.Modules in the checkpoint). If None, the EMA is applied to the whole model. When training with a CompositeModel, the paths of the submodules (e.g., ‘encoder’, ‘decoder’) should be provided via this field; otherwise the EMA is applied to the CompositeModel as a whole, which cannot be restored later on.
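The underlying mechanism is an exponential moving average over parameter values. The following is a minimal sketch of one EMA step over a flat parameter dictionary; the function name, the decay value, and the flat-dict representation are assumptions, not the library's actual update code:

```python
def ema_update(
    ema: dict[str, float],
    current: dict[str, float],
    decay: float = 0.999,
) -> dict[str, float]:
    """One EMA step: shadow weights move a small fraction (1 - decay)
    toward the current training weights each update."""
    return {k: decay * ema[k] + (1 - decay) * current[k] for k in ema}
```

A high decay (e.g. 0.999) means the shadow weights change slowly, smoothing out per-update noise in the trained parameters.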
- class noether.core.schemas.callbacks.OnlineLossCallbackConfig(/, **data)¶
Bases:
CallBackBaseConfig
- Parameters:
data (Any)
- name: Literal['OnlineLossCallback'] = None¶
- class noether.core.schemas.callbacks.BestMetricCallbackConfig(/, **data)¶
Bases:
CallBackBaseConfig
- Parameters:
data (Any)
- name: Literal['BestMetricCallback'] = None¶
The metric used to determine whether the current model obtained a new best (e.g., loss/valid/total).
- class noether.core.schemas.callbacks.TrackAdditionalOutputsCallbackConfig(/, **data)¶
Bases:
CallBackBaseConfig
- Parameters:
data (Any)
- name: Literal['TrackAdditionalOutputsCallback'] = None¶
- keys: list[str] | None = None¶
List of keys to track. Matched if it is contained in one of the update_outputs keys.
- patterns: list[str] | None = None¶
List of patterns to track. Matched if it is contained in one of the update_outputs keys.
- reduce: Literal['mean', 'last'] = None¶
The reduction method to be applied to the tracked values to reduce to scalar. Currently supports ‘mean’ and ‘last’.
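The matching and reduction behavior described above can be sketched as follows. The substring matching on update_outputs keys and the ‘mean’/‘last’ reductions come from the field docs; the function name and the dict-of-lists representation are assumptions for illustration:

```python
def track(
    update_outputs: dict[str, list[float]],
    patterns: list[str],
    reduce: str,
) -> dict[str, float]:
    """Select update_outputs whose key contains any pattern, then
    reduce each tracked list of values to a scalar."""
    tracked = {
        k: v for k, v in update_outputs.items()
        if any(p in k for p in patterns)
    }
    if reduce == "mean":
        return {k: sum(v) / len(v) for k, v in tracked.items()}
    if reduce == "last":
        return {k: v[-1] for k, v in tracked.items()}
    raise ValueError(f"unsupported reduce: {reduce}")
```

For example, pattern ‘loss’ matches the key ‘loss/valid/total’ by containment, and reduce=‘mean’ averages all values tracked for that key over the interval.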
- class noether.core.schemas.callbacks.OfflineLossCallbackConfig(/, **data)¶
Bases:
CallBackBaseConfig
- Parameters:
data (Any)
- name: Literal['OfflineLossCallback'] = None¶
- class noether.core.schemas.callbacks.MetricEarlyStopperConfig(/, **data)¶
Bases:
CallBackBaseConfig
- Parameters:
data (Any)
- name: Literal['MetricEarlyStopper'] = None¶
- class noether.core.schemas.callbacks.FixedEarlyStopperConfig(/, **data)¶
Bases:
pydantic.BaseModel
- Parameters:
data (Any)
- name: Literal['FixedEarlyStopper'] = None¶
- validate_callback_frequency()¶
Ensures that exactly one stop (‘stop_at_*’) is specified.
- Return type:
- noether.core.schemas.callbacks.CallbacksConfig¶
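The Literal ‘name’ fields on each config suggest that CallbacksConfig resolves configs to callback types by their ‘name’ discriminator. The following pure-Python registry sketch illustrates that dispatch pattern; the registry, the dataclass stand-ins, and build_callback are all hypothetical, not the library's actual definitions:

```python
from dataclasses import dataclass


@dataclass
class CheckpointCallback:
    every_n_epochs: int


@dataclass
class OnlineLossCallback:
    every_n_updates: int


# Map each 'name' discriminator to its callback type.
REGISTRY = {
    "CheckpointCallback": CheckpointCallback,
    "OnlineLossCallback": OnlineLossCallback,
}


def build_callback(cfg: dict):
    """Instantiate the callback type selected by cfg['name'],
    forwarding the remaining fields as keyword arguments."""
    kwargs = {k: v for k, v in cfg.items() if k != "name"}
    return REGISTRY[cfg["name"]](**kwargs)
```

In Pydantic itself this is typically expressed as a discriminated union (Field(discriminator="name")), which validates the payload against exactly the model whose Literal ‘name’ matches.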