noether.core.callbacks.default.dataset_stats¶
Classes¶
A callback that logs the length of each dataset in the data container. It is initialized by the BaseTrainer and should not be added manually.
Module Contents¶
- class noether.core.callbacks.default.dataset_stats.DatasetStatsCallback(trainer, model, data_container, tracker, log_writer, checkpoint_writer, metric_property_provider, name=None)¶
Bases: noether.core.callbacks.base.CallbackBase

A callback that logs the length of each dataset in the data container. It is initialized by the BaseTrainer and should not be added manually to the trainer's callbacks.

- Parameters:
- trainer (noether.training.trainers.BaseTrainer) – Trainer of the current run.
- model (noether.core.models.ModelBase) – Model of the current run.
- data_container (noether.data.container.DataContainer) – DataContainer instance that provides access to all datasets.
- tracker (noether.core.trackers.BaseTracker) – BaseTracker instance to log metrics to stdout/disk/online platform.
- log_writer (noether.core.writers.LogWriter) – LogWriter instance to log metrics to stdout/disk/online platform.
- checkpoint_writer (noether.core.writers.CheckpointWriter) – CheckpointWriter instance to save checkpoints during training.
- metric_property_provider (noether.core.providers.metric_property.MetricPropertyProvider) – MetricPropertyProvider instance to access properties of metrics.
- name (str | None) – Name of the callback.
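The sketch below illustrates what this callback conceptually does: iterate over the datasets held by the data container and log the length of each one before training starts. The `DataContainer` and callback classes here are simplified stand-ins, not the real noether API; in practice the BaseTrainer constructs and registers the callback itself.

```python
# Minimal stand-in sketch (NOT the real noether classes): a dataset-stats
# callback that logs len() of every dataset in a container.

class DataContainer:
    """Hypothetical container mapping dataset names to dataset objects."""
    def __init__(self, **datasets):
        self.datasets = datasets

class DatasetStatsCallback:
    """Logs the length of each dataset in the data container."""
    def __init__(self, data_container, log_fn=print):
        self.data_container = data_container
        self.log_fn = log_fn

    def before_training(self, **_):
        # Called once before the training loop starts.
        for name, dataset in self.data_container.datasets.items():
            self.log_fn(f"dataset '{name}': {len(dataset)} samples")

container = DataContainer(train=list(range(100)), val=list(range(20)))
callback = DatasetStatsCallback(container)
callback.before_training()
```

Because the trainer adds this callback automatically, user code normally never instantiates it; the sketch only makes the logging behavior concrete.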
- before_training(**_)¶
Hook called once before the training loop starts.
This method is intended to be overridden by derived classes to perform initialization tasks before training begins. Common use cases include:
- Initializing experiment tracking (e.g., logging hyperparameters)
- Printing model summaries or architecture details
- Initializing specific data structures or buffers needed during training
- Performing sanity checks on the data or configuration
Note: This method is executed within a torch.no_grad() context.

- Parameters: update_counter – UpdateCounter instance to access current training progress.
- Return type: None
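A derived callback overrides the hook roughly as sketched below. The `CallbackBase` and `SanityCheckCallback` here are hypothetical stand-ins used only to show the override pattern (a one-time sanity check before the loop starts); the real noether base class and hook signature may differ.

```python
# Hedged sketch of overriding before_training() for one-time setup.
# CallbackBase is a stand-in, not the actual noether base class.

class CallbackBase:
    def before_training(self, **_):
        pass  # hooks are no-ops by default

class SanityCheckCallback(CallbackBase):
    """Runs a configuration sanity check once before training begins."""
    def __init__(self, config):
        self.config = config
        self.checked = False

    def before_training(self, **_):
        # Example sanity check: a positive learning rate must be configured.
        assert self.config.get("lr", 0) > 0, "learning rate must be positive"
        self.checked = True

cb = SanityCheckCallback({"lr": 1e-3})
cb.before_training()
```

Keeping such checks in `before_training` (rather than `__init__`) lets them run once, at a well-defined point just before the loop, after all components are wired up.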