Scheduler

continuiti.trainer.scheduler

Learning rate scheduler for Trainer in continuiti.

LinearLRScheduler(optimizer, max_epochs)

Bases: Callback

Callback for a linear learning rate scheduler.

lr(epoch) = lr0 * (1 - epoch / max_epochs)

where lr0 is the initial learning rate of the optimizer.
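
As a quick numerical illustration (the values below are assumed for the example, not taken from the library): with lr0 = 1e-3 and max_epochs = 100, the learning rate halves by epoch 50 and reaches zero at the final epoch.

# Minimal sketch of the decay formula with assumed example values.
lr0, max_epochs = 1e-3, 100
schedule = lambda epoch: lr0 * (1 - epoch / max_epochs)
print(schedule(0), schedule(50), schedule(100))  # 0.001 0.0005 0.0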

PARAMETERS

optimizer (Optimizer)
    Optimizer. The learning rate of the first parameter group will be updated.

max_epochs (int)
    Maximum number of epochs.

Source code in src/continuiti/trainer/scheduler.py
def __init__(self, optimizer: torch.optim.Optimizer, max_epochs: int):
    self.optimizer = optimizer
    self.max_epochs = max_epochs

    # Capture the initial learning rate of the first parameter group and
    # build the linear decay schedule lr(epoch) = lr0 * (1 - epoch / max_epochs).
    lr0 = self.optimizer.param_groups[0]["lr"]
    self.schedule = lambda epoch: lr0 * (1 - epoch / max_epochs)
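
A minimal usage sketch under stated assumptions: the model and optimizer below are placeholders, and only the schedule attribute set in __init__ above is exercised; how the Trainer invokes this callback during training is not shown on this page.

import torch
from continuiti.trainer.scheduler import LinearLRScheduler

model = torch.nn.Linear(4, 2)                               # placeholder model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)   # lr0 = 1e-3

scheduler = LinearLRScheduler(optimizer, max_epochs=100)

# schedule maps an epoch index to the decayed learning rate.
print(scheduler.schedule(25))  # lr0 * (1 - 25/100) = 0.00075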

Last update: 2024-08-22
Created: 2024-08-22