noether.core.initializers

Submodules

Classes

InitializerBase

Helper class that provides a standard way to create an ABC using inheritance.

CheckpointInitializer

Base class to initialize models from checkpoints of previous runs. Should not be used directly, but inherited by other initializers such as PreviousRunInitializer or ResumeInitializer.

PreviousRunInitializer

Initializes a model from a checkpoint of a previous run (specified by run_id); this initializer therefore only loads model weights.

ResumeInitializer

Initializes models/optimizers from a checkpoint ready for resuming training.

Package Contents

class noether.core.initializers.InitializerBase(path_provider)

Bases: abc.ABC

Helper class that provides a standard way to create an ABC using inheritance.

Base class for model initializers.

Parameters:

path_provider (noether.core.providers.PathProvider) – PathProvider instance to access paths to load models from.

logger
path_provider
abstractmethod init_weights(model)

Initialize the model weights from the checkpoint.

Parameters:

model (noether.core.models.base.ModelBase) – the model to load the weights into.

Return type:

None

abstractmethod init_optimizer(model)

Initialize the optimizer for the model.

Parameters:

model (noether.core.models.base.ModelBase) – a model to initialize the optimizer for. Assumes the model has an attribute optim.

Return type:

None

init_trainer(trainer)

Initialize the trainer from the checkpoint.

By default, does nothing. Can be overridden by child classes.

Parameters:

trainer – the trainer to initialize.

Return type:

None

init_callbacks(callbacks, model)

Initialize the callbacks from the checkpoint.

By default, does nothing. Can be overridden by child classes.

Parameters:

callbacks – the callbacks to initialize.
model – the model associated with the callbacks.

Return type:

None

start_checkpoint()

Get the start checkpoint for the model.

By default, returns a TrainingIteration starting from zero.

Returns:

the start checkpoint for the model.

Return type:

TrainingIteration
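The contract above can be illustrated with a minimal, self-contained sketch. This uses a stand-in ABC rather than the real InitializerBase (which additionally takes a path_provider and returns a TrainingIteration from start_checkpoint); the ZeroInitializer subclass and the dict-based "model" are purely hypothetical:

```python
from abc import ABC, abstractmethod

# Stand-in for noether.core.initializers.InitializerBase (simplified;
# the real class also takes a path_provider in its constructor).
class InitializerBase(ABC):
    @abstractmethod
    def init_weights(self, model) -> None:
        """Initialize the model weights from the checkpoint."""

    @abstractmethod
    def init_optimizer(self, model) -> None:
        """Initialize the optimizer; assumes the model has an optim attribute."""

    def init_trainer(self, trainer) -> None:
        """By default, does nothing; child classes may override."""

    def start_checkpoint(self):
        """By default, start from iteration zero (stand-in for TrainingIteration)."""
        return 0


class ZeroInitializer(InitializerBase):
    """Toy subclass: zeroes every weight instead of loading a checkpoint."""

    def init_weights(self, model) -> None:
        for key in model["weights"]:
            model["weights"][key] = 0.0

    def init_optimizer(self, model) -> None:
        model["optim"] = {"state": "fresh"}


model = {"weights": {"w1": 3.0, "w2": -1.5}}
init = ZeroInitializer()
init.init_weights(model)
init.init_optimizer(model)
```

Because init_weights and init_optimizer are abstract, instantiating the base class directly raises a TypeError; only concrete subclasses can be constructed.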

class noether.core.initializers.CheckpointInitializer(initializer_config, **kwargs)

Bases: noether.core.initializers.base.InitializerBase

Base class to initialize models from checkpoints of previous runs. Should not be used directly, but inherited by other initializers such as PreviousRunInitializer or ResumeInitializer.

Parameters:
checkpoint_tag: str | noether.core.utils.training.training_iteration.TrainingIteration
run_id
model_name
load_optim
model_info
pop_ckpt_kwargs_keys
stage_name
init_run_path_provider
init_optimizer(model)

Initialize the optimizer for the model if it is derived from Model.

If model is a CompositeModel, nothing happens. This is expected as CompositeModels can be arbitrarily nested and do not have an optimizer. Instead, a CompositeModel calls init_optim with all its submodels which can be of type Model or a nested CompositeModel.

Parameters:

model (noether.core.models.ModelBase) – a model to initialize the optimizer for. Assumes the model has an attribute optim.

Return type:

None
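The CompositeModel dispatch described above can be sketched with simplified stand-ins. These toy Model / CompositeModel / initializer classes are hypothetical and only model the documented behavior: the initializer skips composites, and the composite forwards init_optim to its (possibly nested) submodels:

```python
# Simplified stand-ins for noether's Model / CompositeModel (hypothetical;
# only the dispatch behavior described above is modeled).
class Model:
    def __init__(self, name):
        self.name = name
        self.optim = None

class CompositeModel:
    """Arbitrarily nestable container; has no optimizer of its own."""
    def __init__(self, *submodels):
        self.submodels = submodels

    def init_optim(self, initializer):
        # The composite forwards the call to all submodels, which may
        # themselves be plain Models or nested CompositeModels.
        for sub in self.submodels:
            if isinstance(sub, CompositeModel):
                sub.init_optim(initializer)
            else:
                initializer.init_optimizer(sub)

class ToyCheckpointInitializer:
    def init_optimizer(self, model):
        if isinstance(model, CompositeModel):
            return  # nothing happens for composites, as documented above
        model.optim = f"optimizer-for-{model.name}"

net = CompositeModel(Model("encoder"), CompositeModel(Model("decoder")))
init = ToyCheckpointInitializer()
net.init_optim(init)
```

After the forwarded calls, each leaf Model has an optimizer while the composite itself never receives one.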

class noether.core.initializers.PreviousRunInitializer(initializer_config, **kwargs)

Bases: noether.core.initializers.CheckpointInitializer

Initializes a model from a checkpoint of a previous run (specified by run_id); this initializer therefore only loads model weights. When a previous run should be resumed for further training, use ResumeInitializer instead. This initializer needs to be configured as part of a model config. It is possible to remove certain keys or patterns from the checkpoint before loading it into the model, or to rename certain patterns.

For example:

model:
  kind: path.to.MyModelClass
  param1: value1
  name: my_model
  initializers:
    - kind: noether.core.initializers.PreviousRunInitializer
      run_id: <run_id>
      model_name: transformer
      stage_name: train
      checkpoint_tag: last
      keys_to_remove:
        - encoder.block1.weight
Parameters:
keys_to_remove
patterns_to_remove
patterns_to_rename
patterns_to_instantiate
init_weights(model, model_name=None)

Initialize the model weights from the checkpoint.

Parameters:

model – the model to load the weights into.
model_name – optional model name; defaults to None.

Return type:

None
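The key-removal and pattern-renaming options above can be illustrated with a small state-dict filter. This is a hypothetical sketch of how such filtering could work on a plain dict; the exact semantics of noether's keys_to_remove, patterns_to_remove, and patterns_to_rename (e.g. whether patterns are regular expressions, and the shape of the rename entries) are assumptions here:

```python
import re

# Hypothetical illustration of checkpoint filtering/renaming; the real
# PreviousRunInitializer options may differ in pattern syntax and shape.
def filter_state_dict(state_dict, keys_to_remove=(), patterns_to_remove=(),
                      patterns_to_rename=()):
    out = {}
    for key, value in state_dict.items():
        if key in keys_to_remove:
            continue  # drop exact key matches
        if any(re.search(p, key) for p in patterns_to_remove):
            continue  # drop regex matches
        for pattern, replacement in patterns_to_rename:
            key = re.sub(pattern, replacement, key)  # rename matching keys
        out[key] = value
    return out

ckpt = {
    "encoder.block1.weight": 1,
    "encoder.block2.weight": 2,
    "head.weight": 3,
}
filtered = filter_state_dict(
    ckpt,
    keys_to_remove=["encoder.block1.weight"],
    patterns_to_rename=[(r"^head\.", "classifier.")],
)
```

Here the block1 weight is dropped (matching the YAML example above) and the head is renamed before the remaining entries would be loaded into the model.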

class noether.core.initializers.ResumeInitializer(initializer_config, **kwargs)

Bases: noether.core.initializers.checkpoint.CheckpointInitializer

Initializes models/optimizers from a checkpoint ready for resuming training. Needs to be configured as part of the config by setting the resume_run_id in the root config (assuming the same output path is used). This initializer assumes that the previous run is going to be resumed for further training (i.e., training is not finished yet).

For example (config snippet is part of the trainer config):

resume_run_id: <previous_run_id>
init_weights(model)

Initialize the model weights from the checkpoint.

Parameters:

model (noether.core.models.ModelBase) – the model to load the weights into.

Return type:

None

init_optimizer(model)

Initialize the optimizer for the model.

Parameters:

model (noether.core.models.ModelBase) – a model to initialize the optimizer for.

Return type:

None

start_checkpoint()

Get the start checkpoint for the model.

Returns:

the start checkpoint for the model.

Return type:

TrainingIteration

init_trainer(trainer)

Initialize the trainer from the checkpoint.

Parameters:

trainer (noether.training.trainers.BaseTrainer) – the trainer to initialize.

Return type:

None

init_callbacks(callbacks, model)

Initialize the callbacks from the checkpoint.

Parameters:

callbacks – the callbacks to initialize.
model – the model associated with the callbacks.

Return type:

None
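Taken together, resuming touches every hook on this page. The following is a hypothetical sketch of the call order a trainer might use; the actual orchestration lives in noether.training and is not documented here, so both the FakeResumeInitializer and the ordering are assumptions for illustration only:

```python
# Hypothetical sketch: record the order in which a trainer could invoke
# the initializer hooks when resuming a run. Not the real noether wiring.
calls = []

class FakeResumeInitializer:
    def start_checkpoint(self):
        calls.append("start_checkpoint")
        return 1000  # stand-in for a TrainingIteration

    def init_weights(self, model):
        calls.append("init_weights")

    def init_optimizer(self, model):
        calls.append("init_optimizer")

    def init_trainer(self, trainer):
        calls.append("init_trainer")

    def init_callbacks(self, callbacks, model):
        calls.append("init_callbacks")

model, trainer = object(), object()
init = FakeResumeInitializer()
start = init.start_checkpoint()   # where to resume counting iterations
init.init_weights(model)          # restore model weights
init.init_optimizer(model)        # restore optimizer state
init.init_trainer(trainer)        # restore trainer state
init.init_callbacks([], model)    # restore callback state
```

The point of the sketch is only that, unlike PreviousRunInitializer (weights only), a resume restores weights, optimizer, trainer, and callback state together.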