Cyclic learning rate schedulers - PyTorch

Implementation

Cyclic learning rate schedules (see the sketch after the list) -

  • cyclic cosine annealing - CyclicCosAnnealingLR()
  • cyclic linear decay - CyclicLinearLR()
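
Both schedules follow the warm-restart idea from the SGDR paper referenced below: the learning rate decays from its initial value down to eta_min over each interval between milestones, then jumps back up and the cycle restarts. As a rough sketch of the formulas involved (not the repo's exact code; the function names here are illustrative):

import math

def cosine_annealed_lr(base_lr, eta_min, t_cur, t_total):
    # SGDR-style cosine annealing within one cycle:
    # eta_t = eta_min + (base_lr - eta_min) * (1 + cos(pi * t_cur / t_total)) / 2
    return eta_min + (base_lr - eta_min) * (1 + math.cos(math.pi * t_cur / t_total)) / 2

def linear_decayed_lr(base_lr, eta_min, t_cur, t_total):
    # The linear variant replaces the cosine curve with a straight line
    # from base_lr down to eta_min over the same cycle.
    return eta_min + (base_lr - eta_min) * (1 - t_cur / t_total)

With milestones=[30, 80], the cycles span epochs 0-29, 30-79, and 80 onwards; t_cur is the position inside the current cycle and t_total is that cycle's length.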

Requirements

  • numpy
  • python >= 2.7
  • PyTorch >= 0.4.0

Reference

SGDR: Stochastic Gradient Descent with Warm Restarts

Usage

Sample (usage is analogous for CyclicLinearLR):

from cyclicLR import CyclicCosAnnealingLR
import torch

model = ...  # your model
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
scheduler = CyclicCosAnnealingLR(optimizer, milestones=[30, 80], eta_min=1e-6)

for epoch in range(100):
    scheduler.step()
    train(...)
    validate(...)

Note: scheduler.step() is shown being called once per epoch, but it can also be called once per batch. In that case, remember to specify the milestones in number of batches (not number of epochs).
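
For per-batch stepping, a minimal sketch (train_loader and the milestone arithmetic below are illustrative assumptions, not from the repo):

iters_per_epoch = len(train_loader)
# Milestones are now counted in batches: restarts after 30 and 80 epochs'
# worth of iterations (illustrative values matching the sample above).
scheduler = CyclicCosAnnealingLR(optimizer,
                                 milestones=[30 * iters_per_epoch, 80 * iters_per_epoch],
                                 eta_min=1e-6)

for epoch in range(100):
    for batch in train_loader:
        scheduler.step()  # step once per batch instead of once per epoch
        ...               # forward/backward pass and optimizer.step() go here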

Visualization

Cyclic Cosine Annealing Learning Rate Schedule (figure: cosine LR curve)

Cyclic Linear Learning Rate Schedule (figure: linear LR curve)
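
Plots like these can be reproduced by sweeping the scheduler and recording the optimizer's learning rate at each step; a minimal sketch (the dummy parameter exists only to satisfy the optimizer):

import torch
import matplotlib.pyplot as plt
from cyclicLR import CyclicCosAnnealingLR

dummy = [torch.nn.Parameter(torch.zeros(1))]  # placeholder params for the optimizer
optimizer = torch.optim.SGD(dummy, lr=1e-3)
scheduler = CyclicCosAnnealingLR(optimizer, milestones=[30, 80], eta_min=1e-6)

lrs = []
for epoch in range(100):
    scheduler.step()
    lrs.append(optimizer.param_groups[0]['lr'])  # record the current LR

plt.plot(lrs)
plt.xlabel('epoch')
plt.ylabel('learning rate')
plt.show()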
