Conversation

@neumyor neumyor commented Aug 5, 2023

I was loading a pretrained model and made a mistake in my config, which led me to this funny bug in models/base_model.py.

The fix is a simple assert pretrain_path is not None, "Found load_finetuned is False, but pretrain_path is None."

def load_checkpoint_from_config(self, cfg, **kwargs):
        """
        Load checkpoint as specified in the config file.

        If load_finetuned is True, load the finetuned model; otherwise, load the pretrained model.
        When loading the pretrained model, each task-specific architecture may define their
        own load_from_pretrained() method.
        """
        load_finetuned = cfg.get("load_finetuned", True)
        if load_finetuned:
            finetune_path = cfg.get("finetuned", None)
            assert (
                finetune_path is not None
            ), "Found load_finetuned is True, but finetune_path is None."
            self.load_checkpoint(url_or_filename=finetune_path)
        else:
            load_pretrained = cfg.get("load_pretrained", True)
            if load_pretrained:
                # load pre-trained weights
                pretrain_path = cfg.get("pretrained", None)
            assert "Found load_finetuned is False, but pretrain_path is None."  # this is the bug, lol -- a bare non-empty string is always truthy, so the assert never fails
                self.load_from_pretrained(url_or_filename=pretrain_path, **kwargs)
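
A minimal sketch of why the original line never fires and what the proposed fix changes (standalone illustration, not the LAVIS code itself):

```python
# Bug: asserting a bare string literal. A non-empty string is always
# truthy, so this assert passes silently even when the path is missing.
pretrain_path = None
assert "Found load_finetuned is False, but pretrain_path is None."

# Proposed fix: assert the actual condition, keeping the string as the
# failure message. Now a missing path raises AssertionError as intended.
try:
    assert (
        pretrain_path is not None
    ), "Found load_finetuned is False, but pretrain_path is None."
except AssertionError as e:
    print(e)
```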

salesforce-cla bot commented Aug 5, 2023

Thanks for the contribution! Before we can merge this, we need @neumyor to sign the Salesforce Inc. Contributor License Agreement.

@neumyor neumyor closed this Aug 5, 2023
@neumyor neumyor reopened this Aug 5, 2023

neumyor commented Aug 5, 2023

I've already signed the CLA, please have a review @dxli94
