
Support resume training with a different optimMethod. #1460

Description

@qiuxin2012 (Contributor)

When we are tuning the hyperparameters of SGD, we need to determine when to decay the learningRate, so we need to resume training with several different optimMethods.
To support this, we should be able to set epoch, nevals and evalCount on an optimMethod.
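The request above could be sketched as follows. This is a minimal, hypothetical Python sketch, not the actual BigDL API: the class, the setter names, and the state keys (`epoch`, `neval`, `evalCounter`) are illustrative stand-ins for the counters the issue asks to make settable, so that a fresh optim method (e.g. with a decayed learning rate) can continue an earlier run's schedule.

```python
# Hypothetical sketch (NOT the real BigDL API): resume training with a
# different optim method by carrying over the previous run's counters.

class OptimMethod:
    """Minimal stand-in for an optimizer with a mutable state table."""

    def __init__(self, learning_rate):
        self.learning_rate = learning_rate
        # Counters the issue asks to make settable on a new optim method
        # (key names here are illustrative).
        self.state = {"epoch": 1, "neval": 1, "evalCounter": 1}

    # Proposed setters so a fresh optim method can continue an old run.
    def set_epoch(self, epoch):
        self.state["epoch"] = epoch
        return self

    def set_nevals(self, neval):
        self.state["neval"] = neval
        return self

    def set_eval_counter(self, count):
        self.state["evalCounter"] = count
        return self


# State reached by a previous run (e.g. restored from a checkpoint).
old = OptimMethod(learning_rate=0.1)
old.state.update({"epoch": 5, "neval": 4001, "evalCounter": 4000})

# Resume with a *different* optim method (here: a decayed learning rate),
# but keep the epoch/iteration counters so schedules stay consistent.
new = (OptimMethod(learning_rate=0.01)
       .set_epoch(old.state["epoch"])
       .set_nevals(old.state["neval"])
       .set_eval_counter(old.state["evalCounter"]))

print(new.state)  # {'epoch': 5, 'neval': 4001, 'evalCounter': 4000}
```

The point of the setters is that a learning-rate schedule which depends on the epoch or evaluation count picks up exactly where the old run stopped, instead of restarting from step one.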

Activity

self-assigned this on Aug 15, 2017
@yiheng (Contributor) commented on Aug 23, 2017

I think you can update the optim state after loading it from file, right?
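The suggested workaround can be sketched as below. This is a hedged illustration, assuming a checkpoint that serializes the optimizer state as a plain table; the file format (pickle) and key names are illustrative, not BigDL's actual serialization.

```python
# Sketch of the workaround: load a saved optimizer state, edit it
# directly, and continue training with the updated values.
import os
import pickle
import tempfile

# Pretend a previous run checkpointed its optim state to disk
# (key names are illustrative).
saved_state = {"learningRate": 0.1, "epoch": 5, "neval": 4001}
path = os.path.join(tempfile.mkdtemp(), "optim_state.pkl")
with open(path, "wb") as f:
    pickle.dump(saved_state, f)

# Resume: load the state, then update it before continuing training,
# e.g. decay the learning rate while keeping the counters intact.
with open(path, "rb") as f:
    state = pickle.load(f)
state["learningRate"] = 0.01

print(state)  # {'learningRate': 0.01, 'epoch': 5, 'neval': 4001}
```

This achieves the same effect as the requested setters, at the cost of each user reaching into the state table by hand after every load.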




Support resume training with a different optimMethod. · Issue #1460 · intel/ipex-llm