noether.core.callbacks.default.peak_memory

Classes

PeakMemoryCallback

Callback that logs the peak memory usage of the model. It is initialized by the BaseTrainer and should not be added manually to the trainer's callbacks.

Module Contents

class noether.core.callbacks.default.peak_memory.PeakMemoryCallback(callback_config, trainer, model, data_container, tracker, log_writer, checkpoint_writer, metric_property_provider, name=None)

Bases: noether.core.callbacks.periodic.PeriodicCallback

Callback that logs the peak memory usage of the model. It is initialized by the BaseTrainer and should not be added manually to the trainer's callbacks.

periodic_callback(**__)

Hook called periodically based on the configured intervals.

This method is the primary entry point for periodic actions in subclasses. It is triggered when any of the configured intervals (every_n_epochs, every_n_updates, or every_n_samples) are reached.

Subclasses should override this method to implement periodic logic such as:

  • Calculating and logging expensive validation metrics

  • Saving specific model checkpoints or artifacts

  • Visualizing training progress (e.g., plotting samples)

  • Adjusting training hyperparameters or model state
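The override pattern for the cases above can be sketched as follows. This is a minimal, self-contained illustration: the stand-in base class below is hypothetical (standing in for noether.core.callbacks.periodic.PeriodicCallback), and the integer passed as update_counter is a simplification of the real UpdateCounter instance.

```python
# Hypothetical minimal stand-in for PeriodicCallback, for illustration only.
class PeriodicCallback:
    def periodic_callback(self, interval_type, update_counter, **kwargs):
        raise NotImplementedError


# A subclass implementing periodic logic, e.g. recording a cheap metric.
class IntervalLoggerCallback(PeriodicCallback):
    def __init__(self):
        self.logged = []

    def periodic_callback(self, interval_type, update_counter, **kwargs):
        # Record which interval fired and the current progress value.
        self.logged.append((interval_type, update_counter))


cb = IntervalLoggerCallback()
cb.periodic_callback("update", 100)
```

In the real trainer, the hook is invoked by the base class when a configured interval is reached; subclasses only implement the body.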

Note

This method is executed within a torch.no_grad() context.

Parameters:
  • interval_type – “epoch”, “update”, “sample” or “eval” indicating which interval triggered this callback.

  • update_counter – UpdateCounter instance providing details about the current training progress (epoch, update, sample counts).

  • **kwargs – Additional keyword arguments passed from the triggering hook (e.g., from after_epoch() or after_update()).

Return type:

None
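The triggering behavior described above can be sketched with a simplified loop. The every_n_updates option follows this page; the trainer loop and the toy callback below are hypothetical stand-ins, and the real trainer additionally wraps the hook call in a torch.no_grad() context.

```python
class CounterCallback:
    """Toy callback that counts how often its periodic hook fires."""

    def __init__(self, every_n_updates):
        self.every_n_updates = every_n_updates
        self.calls = 0

    def periodic_callback(self, interval_type, update_counter, **kwargs):
        self.calls += 1


def run_updates(callback, total_updates):
    # Simplified trainer loop: fire the hook whenever the update count
    # reaches a multiple of every_n_updates.
    for update in range(1, total_updates + 1):
        if update % callback.every_n_updates == 0:
            callback.periodic_callback("update", update)


cb = CounterCallback(every_n_updates=10)
run_updates(cb, 55)
# cb.calls is now 5 (fired at updates 10, 20, 30, 40, 50)
```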