Add wandb log #92
base: dev
Changes from all commits
@@ -133,6 +133,7 @@ results/

    ckp/
    checkpoints/
    *.swp
    wandb/

    Dockerfile
    build_dgx.sh
@@ -25,11 +25,14 @@

    # This is used ONLY if you are not using argparse to get the hparams
    default_cfg = {
        "project_name": "micromind",
        "output_folder": "results",
        "experiment_name": "micromind_exp",
        "opt": "adam",  # this is ignored if you are overriding the configure_optimizers
        "lr": 0.001,  # this is ignored if you are overriding the configure_optimizers
        "debug": False,
        "log_wandb": False,
        "wandb_resume": "auto",  # one of "allow", "must", "never", "auto", or None
    }
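To illustrate how `wandb_resume` is meant to work together with the init call added further down (the run `id` is set to the experiment name), here is a minimal standalone sketch: with a stable id and `resume="auto"`, a re-launched training should reattach to the same wandb run rather than create a new one. This is an illustration under those assumptions, not code from the PR.

```python
import wandb

cfg = {
    "project_name": "micromind",
    "experiment_name": "micromind_exp",
    "wandb_resume": "auto",  # "allow", "must", "never", "auto", or None
}

# Reusing the experiment name as the run id gives a stable identity, so an
# interrupted training can pick up the same wandb run when restarted.
run = wandb.init(
    project=cfg["project_name"],
    name=cfg["experiment_name"],
    id=cfg["experiment_name"],
    resume=cfg["wandb_resume"],
)
run.log({"epoch": 0})
run.finish()
```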
@@ -381,14 +384,26 @@ def compute_macs(self, input_shape: Union[List, Tuple]):

    def on_train_start(self):
        """Initializes the optimizer, modules and puts the networks on the right
        devices. Optionally loads checkpoint if already present. It also starts
        the wandb logger if selected.

        This function gets executed at the beginning of every training.
        """

        # pass debug status to checkpointer
        self.checkpointer.debug = self.hparams.debug

        if self.hparams.log_wandb:
            import wandb

            self.wlog = wandb.init(
                project=self.hparams.project_name,
                name=self.hparams.experiment_name,
                resume=self.hparams.wandb_resume,
                id=self.hparams.experiment_name,
                config=self.hparams,
            )

        init_opt = self.configure_optimizers()
        if isinstance(init_opt, list) or isinstance(init_opt, tuple):
            self.opt, self.lr_sched = init_opt

Collaborator (inline, on the `wandb.init(...)` call): here we can check if the configuration provides any extra arguments to be passed to the init; eventually, we load them (check the usage of the `**` operator).
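A hedged sketch of what that suggestion could look like; the `wandb_extra_args` key and the standalone helper are hypothetical, not part of this PR:

```python
import wandb


def init_wandb_logger(hparams):
    # Hypothetical: the config may carry a dict of extra wandb.init kwargs,
    # e.g. {"tags": ["baseline"], "mode": "offline"}; default to no extras.
    extra = getattr(hparams, "wandb_extra_args", None) or {}
    return wandb.init(
        project=hparams.project_name,
        name=hparams.experiment_name,
        resume=hparams.wandb_resume,
        id=hparams.experiment_name,
        config=hparams,
        **extra,  # ** unpacks the dict into keyword arguments of init
    )
```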
@@ -449,6 +464,8 @@ def init_devices(self):

    def on_train_end(self):
        """Runs at the end of each training. Cleans up before exiting."""
        if self.hparams.log_wandb:
            self.wlog.finish()
        pass

    def eval(self):
@@ -531,6 +548,9 @@ def train(

    # ok for cos_lr
    self.lr_sched.step()

    if self.hparams.log_wandb:
        self.wlog.log({"lr": self.lr_sched.get_last_lr()})

    for m in self.metrics:
        if (
            self.current_epoch + 1
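A side note on the learning-rate logging above: PyTorch's `get_last_lr()` returns a list with one value per parameter group, so logging its first element keeps the wandb panel a plain scalar curve. A small self-contained sketch, assuming a single parameter group and a standard PyTorch scheduler (not the PR's actual training loop):

```python
import torch

model = torch.nn.Linear(8, 2)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
sched = torch.optim.lr_scheduler.CosineAnnealingLR(opt, T_max=10)

for epoch in range(10):
    opt.step()  # stands in for the real optimization step
    sched.step()
    lr_scalar = sched.get_last_lr()[0]  # list -> scalar for the first param group
    # with logging enabled this would be e.g. self.wlog.log({"lr": lr_scalar})
```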
@@ -574,6 +594,10 @@ def train(

    else:
        val_metrics = train_metrics.update({"val_loss": loss_epoch / (idx + 1)})

    if self.hparams.log_wandb:  # wandb log
        self.wlog.log(train_metrics)
        self.wlog.log(val_metrics)

    if e >= 1 and self.debug:
        break

Collaborator (inline, on the two logging calls): I might be wrong, but it would be nicer if we divided the train and valid panes inside the wandb logging; this would look something like:
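(What follows is a hedged sketch of one common way to get separate train and validation panes in wandb, not the reviewer's own snippet: keys that share a prefix before a `/` are grouped into the same section of the dashboard.)

```python
import wandb

# mode="offline" lets the demo run without a wandb account; names are illustrative
run = wandb.init(project="micromind", name="pane_demo", mode="offline")

train_metrics = {"loss": 0.42, "accuracy": 0.87}
val_metrics = {"loss": 0.51, "accuracy": 0.83}

# "train/..." and "val/..." keys end up in separate panes of the wandb UI
run.log({f"train/{k}": v for k, v in train_metrics.items()})
run.log({f"val/{k}": v for k, v in val_metrics.items()})

run.finish()
```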
Review comment: I'm not sure what the best way to do this is, but we need to better define the effect of each field of `default_cfg`. Maybe a long comment (`"""xx"""`) after the closing bracket?