
Commit 7ed6649

alpha is not needed for adopt (calculated internally using lr)
1 parent: 8a85afe

1 file changed, 0 insertions(+), 1 deletion(-)

GANDLF/optimizers/thirdparty/adopt.py

Lines changed: 0 additions & 1 deletion
@@ -514,7 +514,6 @@ def adopt_wrapper(parameters: dict) -> torch.optim.Optimizer:
         lr=parameters.get("learning_rate", 1e-3),
         betas=parameters.get("betas", (0.9, 0.999, 0.9999)),
         eps=parameters.get("eps", 1e-8),
-        alpha=parameters.get("alpha", 5.0),
         weight_decay=parameters.get("weight_decay", 0.0),
         decoupled=parameters["optimizer"].get("decoupled", False),
         foreach=parameters.get("foreach", None),
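
For context, a minimal sketch of how the wrapper's call site reads after this commit. The keyword arguments are taken verbatim from the diff above; the ADOPT class name being in scope and the "model_parameters" key are assumptions for illustration, not the actual GANDLF source:

import torch

# ADOPT is assumed to be defined earlier in this same file (adopt.py).
def adopt_wrapper(parameters: dict) -> torch.optim.Optimizer:
    # "alpha" is no longer forwarded; per the commit message, ADOPT
    # computes it internally from the learning rate.
    return ADOPT(
        parameters["model_parameters"],  # hypothetical key for the parameter iterable
        lr=parameters.get("learning_rate", 1e-3),
        betas=parameters.get("betas", (0.9, 0.999, 0.9999)),
        eps=parameters.get("eps", 1e-8),
        weight_decay=parameters.get("weight_decay", 0.0),
        decoupled=parameters["optimizer"].get("decoupled", False),
        foreach=parameters.get("foreach", None),
    )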
