noether.core.callbacks.default.progress¶
Classes¶
ProgressCallback – Callback to print the progress of the training such as number of epochs and updates.
Module Contents¶
- class noether.core.callbacks.default.progress.ProgressCallback(callback_config, trainer, model, data_container, tracker, log_writer, checkpoint_writer, metric_property_provider, name=None)¶
Bases: noether.core.callbacks.periodic.PeriodicCallback
Callback to print the progress of the training such as number of epochs and updates.
This callback is initialized by the BaseTrainer and should not be added manually to the trainer's callbacks.
- Parameters:
  - callback_config (noether.core.schemas.callbacks.CallBackBaseConfig) – Configuration for the callback. See CallBackBaseConfig for available options.
  - trainer (noether.training.trainers.BaseTrainer) – Trainer of the current run.
  - model (noether.core.models.ModelBase) – Model of the current run.
  - data_container (noether.data.container.DataContainer) – DataContainer instance that provides access to all datasets.
  - tracker (noether.core.trackers.BaseTracker) – BaseTracker instance to log metrics to stdout/disk/online platform.
  - log_writer (noether.core.writers.LogWriter) – LogWriter instance to log metrics.
  - checkpoint_writer (noether.core.writers.CheckpointWriter) – CheckpointWriter instance to save checkpoints.
  - metric_property_provider (noether.core.providers.MetricPropertyProvider) – MetricPropertyProvider instance to access properties of metrics.
  - name (str | None) – Name of the callback.
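Since the docstring states that the trainer itself creates this callback, user code never constructs it directly. The following self-contained sketch illustrates that wiring pattern with stub classes; the stub names and bodies are hypothetical stand-ins, not noether's actual implementation.

```python
class ProgressStub:
    """Illustrative stand-in for ProgressCallback (not the noether class)."""

    def __init__(self, name=None):
        # Mirrors the documented `name` parameter: fall back to a default.
        self.name = name or "progress"


class TrainerStub:
    """Illustrative stand-in for BaseTrainer (not the noether class)."""

    def __init__(self):
        # The trainer appends its default callbacks itself; callers do not
        # add a progress callback manually.
        self.callbacks = [ProgressStub()]


trainer = TrainerStub()
```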
- before_training(**_)¶
Hook called once before the training loop starts.
This method is intended to be overridden by derived classes to perform initialization tasks before training begins. Common use cases include:
Initializing experiment tracking (e.g., logging hyperparameters)
Printing model summaries or architecture details
Initializing specific data structures or buffers needed during training
Performing sanity checks on the data or configuration
Note
This method is executed within a torch.no_grad() context.
- Parameters:
  update_counter – UpdateCounter instance to access current training progress.
- Return type:
  None
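As a concrete illustration of the first use case above (recording hyperparameters once before training), here is a minimal, self-contained sketch. The CallbackBase and HyperparamLogger classes are illustrative stand-ins, not part of the noether API.

```python
class CallbackBase:
    """Illustrative stand-in for the callback base class."""

    def before_training(self, **_):
        # Default hook: do nothing; subclasses override it.
        pass


class HyperparamLogger(CallbackBase):
    """Records hyperparameters once, before the first update runs."""

    def __init__(self, hyperparams):
        self.hyperparams = hyperparams
        self.logged = []

    def before_training(self, **_):
        # One-time setup before the training loop starts. (In noether this
        # hook would run inside a torch.no_grad() context.)
        for key, value in sorted(self.hyperparams.items()):
            self.logged.append(f"{key}={value}")


cb = HyperparamLogger({"lr": 1e-3, "batch_size": 32})
cb.before_training()
```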
- periodic_callback(*, interval_type, update_counter, **_)¶
Hook called periodically based on the configured intervals.
This method is the primary entry point for periodic actions in subclasses. It is triggered when any of the configured intervals (every_n_epochs, every_n_updates, or every_n_samples) is reached.
Subclasses should override this method to implement periodic logic such as:
Calculating and logging expensive validation metrics
Saving specific model checkpoints or artifacts
Visualizing training progress (e.g., plotting samples)
Adjusting training hyperparameters or model state
Note
This method is executed within a torch.no_grad() context.
- Parameters:
  interval_type – "epoch", "update", "sample", or "eval", indicating which interval triggered this callback.
  update_counter (noether.core.utils.training.UpdateCounter) – UpdateCounter instance providing details about the current training progress (epoch, update, and sample counts).
  **kwargs – Additional keyword arguments passed from the triggering hook (e.g., from after_epoch() or after_update()).
- Return type:
None
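A minimal sketch of this interval mechanism, assuming a simple modulo check on the update count. The stub classes below are illustrative, not noether's PeriodicCallback, but they show the intended shape of a subclass override.

```python
class PeriodicStub:
    """Illustrative stand-in for a periodic callback base class."""

    def __init__(self, every_n_updates):
        self.every_n_updates = every_n_updates

    def after_update(self, *, update_counter, **kwargs):
        # Fire the periodic hook on every N-th optimizer update.
        if update_counter["update"] % self.every_n_updates == 0:
            self.periodic_callback(
                interval_type="update", update_counter=update_counter, **kwargs
            )

    def periodic_callback(self, *, interval_type, update_counter, **_):
        pass  # overridden by subclasses


class ExpensiveMetricStub(PeriodicStub):
    """Runs 'expensive' periodic work only at the configured interval."""

    def __init__(self, every_n_updates):
        super().__init__(every_n_updates)
        self.fired_at = []

    def periodic_callback(self, *, interval_type, update_counter, **_):
        # Expensive validation metrics, checkpoints, or plots would go here.
        self.fired_at.append(update_counter["update"])


cb = ExpensiveMetricStub(every_n_updates=3)
for step in range(1, 10):
    cb.after_update(update_counter={"update": step})
```

Note that the hook receives the interval_type keyword, so one override can branch on whether an epoch, update, or sample boundary triggered it.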
- track_after_update_step(*, update_counter, **_)¶
Hook called after each optimizer update step.
This method is invoked after a successful optimizer step and parameter update. It is typically used for tracking metrics that should be recorded once per update cycle, such as:
Latest loss values
Learning rates
Model parameter statistics (norms, etc.)
Training throughput and timing measurements
Unlike periodic_callback(), this hook is called on every update step, making it suitable for maintaining running averages or high-frequency telemetry.
Note
This method is executed within a torch.no_grad() context.
- Parameters:
  update_counter (noether.core.utils.training.UpdateCounter) – UpdateCounter instance to access current training progress.
  times – Dictionary containing time measurements for various parts of the training step (e.g., 'data_time', 'forward_time', 'backward_time', 'update_time').
- Return type:
None
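Because this hook fires on every optimizer step, it is a natural place to accumulate running statistics such as throughput. A hedged sketch follows; the tracker class, the update_counter dictionary, and its samples_per_update key are simplified stand-ins for the documented UpdateCounter and times objects, not the noether API.

```python
class ThroughputTracker:
    """Illustrative per-update tracker; not part of the noether API."""

    def __init__(self):
        self.total_samples = 0
        self.total_time = 0.0

    def track_after_update_step(self, *, update_counter, times, **_):
        # Called after every optimizer step: accumulate samples and timings
        # so a running average survives between periodic reports.
        self.total_samples += update_counter["samples_per_update"]
        self.total_time += times["data_time"] + times["update_time"]

    @property
    def samples_per_second(self):
        # Average training throughput over all updates seen so far.
        return self.total_samples / self.total_time


tracker = ThroughputTracker()
for step in range(1, 5):
    tracker.track_after_update_step(
        update_counter={"update": step, "samples_per_update": 32},
        times={"data_time": 0.01, "update_time": 0.09},
    )
```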