The step counter used to decide when to run the optimizer step during training appears to be relative to the current epoch, not global across the whole run. As a result, if the number of steps per epoch is less than the number of accumulation steps, the condition below is never satisfied and the optimizer step never triggers.
```python
if (step + 1) % accumulation_steps == 0:
```
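A minimal sketch of the failure mode, using hypothetical helper functions (`steps_per_epoch`, `accumulation_steps`, and the counting functions are illustrative, not taken from any specific codebase). When `step` resets each epoch and stays below `accumulation_steps`, `(step + 1) % accumulation_steps` never hits zero; tracking a global step instead restores the expected behavior.

```python
def count_optimizer_steps_per_epoch(steps_per_epoch, accumulation_steps, epochs):
    """Buggy variant: `step` is the per-epoch loop index, resetting every epoch."""
    optimizer_steps = 0
    for _ in range(epochs):
        for step in range(steps_per_epoch):
            if (step + 1) % accumulation_steps == 0:
                optimizer_steps += 1
    return optimizer_steps


def count_optimizer_steps_global(steps_per_epoch, accumulation_steps, epochs):
    """Fixed variant: a single counter keeps incrementing across epochs."""
    optimizer_steps = 0
    global_step = 0
    for _ in range(epochs):
        for _ in range(steps_per_epoch):
            global_step += 1
            if global_step % accumulation_steps == 0:
                optimizer_steps += 1
    return optimizer_steps


# With 3 batches per epoch and accumulation_steps=4, the per-epoch counter
# never reaches a multiple of 4, so the optimizer never steps. The global
# counter performs one optimizer step per 4 batches (30 batches -> 7 steps).
print(count_optimizer_steps_per_epoch(3, 4, 10))  # 0
print(count_optimizer_steps_global(3, 4, 10))     # 7
```

A further consideration (not shown above) is flushing any partially accumulated gradient at the end of training, otherwise the last `global_step % accumulation_steps` batches are silently dropped.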