graphorge.gnn_base_model.train.training.get_learning_rate_scheduler
- get_learning_rate_scheduler(optimizer, scheduler_type, **kwargs)[source]
Get a PyTorch optimizer learning rate scheduler.
- Parameters:
optimizer (torch.optim.Optimizer) – PyTorch optimizer.
scheduler_type ({'steplr', 'explr', 'linlr'}) –
Type of learning rate scheduler:
'steplr' : Step-based decay (torch.optim.lr_scheduler.StepLR)
'explr' : Exponential decay (torch.optim.lr_scheduler.ExponentialLR)
'linlr' : Linear decay (torch.optim.lr_scheduler.LinearLR)
**kwargs – Keyword arguments forwarded to the selected scheduler's initializer (e.g., step_size and gamma for StepLR).
- Returns:
scheduler – PyTorch optimizer learning rate scheduler.
- Return type:
torch.optim.lr_scheduler.LRScheduler
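A minimal sketch of how such a dispatcher could look and be used, assuming the three string keys map directly onto the corresponding torch.optim.lr_scheduler classes (the actual graphorge implementation may differ):

```python
import torch

def get_learning_rate_scheduler(optimizer, scheduler_type, **kwargs):
    """Return a PyTorch learning rate scheduler for the given optimizer."""
    if scheduler_type == 'steplr':
        # Step-based decay: multiply lr by gamma every step_size epochs
        return torch.optim.lr_scheduler.StepLR(optimizer, **kwargs)
    elif scheduler_type == 'explr':
        # Exponential decay: multiply lr by gamma every epoch
        return torch.optim.lr_scheduler.ExponentialLR(optimizer, **kwargs)
    elif scheduler_type == 'linlr':
        # Linear decay between start_factor and end_factor
        return torch.optim.lr_scheduler.LinearLR(optimizer, **kwargs)
    else:
        raise ValueError(f'Unknown learning rate scheduler type: '
                         f'{scheduler_type}')

# Usage: step-based decay halving the learning rate every 10 epochs
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = get_learning_rate_scheduler(optimizer, 'steplr',
                                        step_size=10, gamma=0.5)
for epoch in range(20):
    optimizer.step()   # training step (placeholder)
    scheduler.step()   # update learning rate after each epoch

# lr = 0.1 * 0.5**2 = 0.025 after two decays
print(optimizer.param_groups[0]['lr'])
```

Because `**kwargs` is passed straight through, each scheduler type accepts its own keyword arguments (e.g., `gamma` for 'explr', `start_factor`/`total_iters` for 'linlr').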