
Update per batch, not per epoch  #15

@GLambard

Description


First of all, thank you very much for sharing your implementation of Snapshot Ensembling. However, you should not use callbacks.LearningRateScheduler to update the learning rate.

As mentioned in the original paper under equation 2, "we update the learning rate at each iteration" (i.e., each batch) "rather than at every epoch" to improve the convergence of the short cycles. However, Keras's callbacks.LearningRateScheduler (source code here) updates the learning rate in on_epoch_begin. A per-batch alternative is sketched below.
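
For reference, here is a minimal sketch of a callback that applies the paper's cosine-annealing schedule on every batch instead of every epoch. It assumes tf.keras; the class name SnapshotLRScheduler and the training configuration in the usage example are illustrative, not taken from this repository.

```python
import math
from tensorflow import keras


class SnapshotLRScheduler(keras.callbacks.Callback):
    """Per-batch cosine annealing, following Eq. 2 of the Snapshot Ensembles paper:
    alpha(t) = alpha0/2 * (cos(pi * mod(t-1, ceil(T/M)) / ceil(T/M)) + 1),
    where t is the iteration (batch) index, T the total number of training
    iterations, and M the number of cycles (snapshots)."""

    def __init__(self, alpha_zero, total_iterations, n_cycles):
        super().__init__()
        self.alpha_zero = alpha_zero  # initial learning rate alpha0
        self.iterations_per_cycle = math.ceil(total_iterations / n_cycles)
        self.iteration = 0  # global batch counter t

    def on_batch_begin(self, batch, logs=None):
        # Position within the current cycle, restarting at 0 after each snapshot.
        t_mod = self.iteration % self.iterations_per_cycle
        lr = self.alpha_zero / 2 * (
            math.cos(math.pi * t_mod / self.iterations_per_cycle) + 1
        )
        # Assumes the optimizer exposes its learning rate as a backend variable.
        keras.backend.set_value(self.model.optimizer.learning_rate, lr)
        self.iteration += 1
```

Usage would look something like this (the epoch count, batch size, and cycle count here are hypothetical, just to show how total_iterations is derived):

```python
# e.g. 300 epochs, 391 batches per epoch, 6 snapshots
scheduler = SnapshotLRScheduler(alpha_zero=0.1,
                                total_iterations=300 * 391,
                                n_cycles=6)
model.fit(x_train, y_train, epochs=300, batch_size=128, callbacks=[scheduler])
```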

After this fix, your scores should improve further.
