
# pytorch-polynomial-lr-decay

Polynomial Learning Rate Decay Scheduler for PyTorch

This scheduler is frequently used in deep learning papers, but PyTorch provides no official implementation, so this repository offers one.
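For reference, the schedule follows the standard polynomial decay rule: the learning rate is interpolated from the optimizer's initial value down to `end_learning_rate` over `max_decay_steps` steps, and held at `end_learning_rate` afterwards. A minimal sketch of the rule (the standalone function `poly_lr` is illustrative, not part of this package's API):

```python
def poly_lr(step, base_lr, end_lr=0.0001, max_decay_steps=100, power=2.0):
    """Polynomial decay from base_lr to end_lr; illustrative sketch only."""
    step = min(step, max_decay_steps)  # hold at end_lr once decay is done
    return (base_lr - end_lr) * (1 - step / max_decay_steps) ** power + end_lr
```

With `power=1.0` this is linear decay; larger powers decay more steeply at first and flatten out as `step` approaches `max_decay_steps`.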

## Install

```
$ pip install git+https://github.com/cmpark0126/pytorch-polynomial-lr-decay.git
```

## Usage

```python
from torch_poly_lr_decay import PolynomialLRDecay

# optim is any PyTorch optimizer, e.g. torch.optim.SGD(model.parameters(), lr=0.01)
scheduler_poly_lr_decay = PolynomialLRDecay(optim, max_decay_steps=100, end_learning_rate=0.0001, power=2.0)

for epoch in range(train_epoch):
    scheduler_poly_lr_decay.step()  # step() can be called once per epoch
    ...
```

or

```python
from torch_poly_lr_decay import PolynomialLRDecay

scheduler_poly_lr_decay = PolynomialLRDecay(optim, max_decay_steps=100, end_learning_rate=0.0001, power=2.0)

...

for batch_idx, (inputs, targets) in enumerate(trainloader):
    scheduler_poly_lr_decay.step()  # alternatively, call step() once per iteration
```
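To sanity-check the schedule, you can read the current learning rate back from the optimizer after each `step()`. A short sketch, where the toy `torch.nn.Linear` model and the SGD optimizer with `lr=0.01` are illustrative choices:

```python
import torch
from torch_poly_lr_decay import PolynomialLRDecay

model = torch.nn.Linear(10, 1)  # toy model, for illustration only
optim = torch.optim.SGD(model.parameters(), lr=0.01)
scheduler = PolynomialLRDecay(optim, max_decay_steps=100, end_learning_rate=0.0001, power=2.0)

for step in range(5):
    scheduler.step()
    print(optim.param_groups[0]['lr'])  # decays from 0.01 toward 0.0001
```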
