Add CLIP-JAX ADE20K implementation, training, and tests #83
base: main
Conversation
Hello Thakor, it'd be really cool to have a CLIP implementation here. It seems there are no file changes here; did you forget to push the commit? Related issue: #72
I had pushed the initial code earlier, but the implementation wasn’t up to the quality I wanted, so I removed those files. I’m making the necessary improvements now and will push the updated files on Wednesday.
…onsai into clip-jax-ade20k
When will CLIP be merged?
@@ -0,0 +1,185 @@
from typing import Any
Can this be entirely moved into tests/ instead, to follow the pattern in https://github.com/jax-ml/bonsai/blob/main/CONTRIBUTING.md#contributing-a-model?
Could you please update this commit to not remove the efficientnet files?
def _get_dtype(cfg: CLIPConfig):
    return jnp.float32 if cfg.dtype == "float32" else jnp.float16
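One thing worth noting about this helper: the ternary silently maps any unrecognized dtype string to float16. A small sketch of a stricter lookup, using numpy as a stand-in for jax.numpy since the dtype names are identical (the mapping and the `get_dtype` name are illustrative, not from the PR):

```python
import numpy as np  # stand-in for jax.numpy; dtype names match

# Hypothetical stricter variant: unknown strings raise instead of
# being silently coerced to float16.
_DTYPE_MAP = {
    "float32": np.float32,
    "float16": np.float16,
}

def get_dtype(name: str):
    try:
        return _DTYPE_MAP[name]
    except KeyError:
        raise ValueError(f"Unsupported dtype string: {name!r}")
```

This keeps a typo in a config (e.g. `"flaot16"`) from silently producing half-precision weights.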
It looks like these layers are replicated here. Can we just have the definitions in modeling.py?
import jax.numpy as jnp
import flax.linen as nn
from flax.linen import initializers
from .params import CLIPConfig
Can we have the config in this file for consistency with the rest of the repo?
    self.text_embed_dim = 1024
    self.proj_dim = 1024
else:
    raise ValueError("Unknown model_size: " + str(self.model_size))
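The snippet above sets per-size dimensions in an if/elif chain. A minimal sketch of keeping the size presets in one table inside a dataclass instead (only the 1024 values for the large branch come from the PR snippet; the "base" preset values and field defaults are placeholders):

```python
from dataclasses import dataclass

# Illustrative presets; the "base" values are placeholders, only the
# 1024s for "large" appear in the PR snippet.
_PRESETS = {
    "base": {"text_embed_dim": 512, "proj_dim": 512},
    "large": {"text_embed_dim": 1024, "proj_dim": 1024},
}

@dataclass
class CLIPConfig:
    model_size: str = "base"
    dtype: str = "float32"
    text_embed_dim: int = 0  # filled in from the preset table
    proj_dim: int = 0

    def __post_init__(self):
        if self.model_size not in _PRESETS:
            raise ValueError("Unknown model_size: " + str(self.model_size))
        for name, value in _PRESETS[self.model_size].items():
            setattr(self, name, value)
```

Adding a new model size then becomes a one-line table entry rather than another elif branch.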
For users interested in inference, could you add functionality to transfer parameters from a pretrained model?
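One common ingredient of such a transfer (a sketch only; the key names and checkpoint layout are assumptions, not from the PR) is regrouping a pretrained checkpoint's flat parameter names into the nested dict tree that Flax modules expect. The regrouping step itself needs no JAX:

```python
# Sketch: turn a flat {"a/b/c": array} checkpoint mapping into the
# nested {"a": {"b": {"c": array}}} tree that Flax's Module.apply
# consumes. Key names here are hypothetical; a real converter would
# also rename keys to match the Bonsai module hierarchy and may need
# to transpose kernels depending on the source framework.
def unflatten(flat: dict, sep: str = "/") -> dict:
    nested: dict = {}
    for path, value in flat.items():
        node = nested
        *parents, leaf = path.split(sep)
        for part in parents:
            node = node.setdefault(part, {})
        node[leaf] = value
    return nested
```

Flax also ships `flax.traverse_util.unflatten_dict` for the same job; the hand-rolled version is shown only to make the shape of the conversion explicit.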
Hi @Thakor-Yashpal. Just following up on this PR. I left a few comments but thought I'd ask how things are going. Are you still actively developing CLIP?
Yes, I am! My exams finish tomorrow, and once they're done I’ll be back to working on this PR. I’ll start addressing the comments and updating everything by tomorrow. Thanks for checking in!
Resolves #<issue_number_goes_here>
Reference
Checklist
run_model.py for model usage, test_outputs.py and/or model_validation_colab.ipynb for quality).