Your question

Is it possible to load an optimizer that was previously saved using a distributed optimizer configuration, and then continue training without employing a distributed optimizer?

Replies: 2 comments

- Not without a checkpoint converter.
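In other words, the sharded optimizer state has to be merged back into a single, plain optimizer state dict before a non-distributed optimizer can load it. As a minimal sketch of what such a converter does, assuming the stack is PyTorch and the sharding resembles torch.distributed.optim.ZeroRedundancyOptimizer (the framework in the original question may use a different shard layout and need its own converter):

```python
# Sketch of a "checkpoint converter": gather the sharded optimizer state
# onto one rank and re-save it as a plain, non-distributed state dict.
# Assumes an already-initialized torch.distributed process group; the
# function name and paths are illustrative, not from the original thread.
import torch
import torch.distributed as dist
from torch.distributed.optim import ZeroRedundancyOptimizer


def convert_distributed_checkpoint(model: torch.nn.Module, out_path: str) -> None:
    zero_opt = ZeroRedundancyOptimizer(
        model.parameters(), optimizer_class=torch.optim.Adam, lr=1e-4
    )
    # ... restore the sharded checkpoint into zero_opt here ...

    # Gather every rank's shard of the optimizer state onto rank 0.
    zero_opt.consolidate_state_dict(to=0)
    if dist.get_rank() == 0:
        # After consolidation, state_dict() matches torch.optim.Adam's format.
        torch.save(zero_opt.state_dict(), out_path)


# Later, resume training without a distributed optimizer:
# opt = torch.optim.Adam(model.parameters(), lr=1e-4)
# opt.load_state_dict(torch.load("optim_full.pt"))
```

The key point is that consolidating gathers all shards onto one rank, so the saved file has the same format a plain optimizer would produce and can be loaded without any distributed setup.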
- Marking as stale. No activity in 60 days.