
Conversation

@Thakor-Yashpal
Contributor

Resolves #<issue_number_goes_here>

Reference

Checklist

  • [ ] I have read the Contribution Guidelines and used pre-commit hooks to format this commit.
  • [ ] I have added all the necessary unit tests for my change. (run_model.py for model usage, test_outputs.py and/or model_validation_colab.ipynb for quality).
  • [ ] (If using an LLM) I have carefully reviewed and removed all superfluous comments or unneeded, commented-out code. Only necessary and functional code remains.
  • [ ] I have signed the Contributor License Agreement (CLA).

@jenriver
Member

jenriver commented Nov 20, 2025

Hello Thakor, it'd be really cool to have a CLIP implementation here. It seems there are no file changes here; did you forget to push a commit?

Related issue: #72

@Thakor-Yashpal
Contributor Author

> Hello Thakor, it'd be really cool to have a CLIP implementation here. It seems there are no file changes here; did you forget to push a commit?
>
> Related issue: #72

I had pushed the initial code earlier, but the implementation wasn’t up to the quality I wanted, so I removed those files. I’m making the necessary improvements now and will push the updated files on Wednesday.

@Iamleos
Contributor

Iamleos commented Dec 4, 2025

When will CLIP be merged?

@@ -0,0 +1,185 @@
from typing import Any
Member


Can this be entirely used in tests/ instead, to follow the pattern in https://github.com/jax-ml/bonsai/blob/main/CONTRIBUTING.md#contributing-a-model?

Collaborator


Could you please update this commit to not remove the efficientnet files?


def _get_dtype(cfg: CLIPConfig):
return jnp.float32 if cfg.dtype == "float32" else jnp.float16

Collaborator


It looks like these layers are replicated here. Can we just have the definitions in modeling.py?

import jax.numpy as jnp
import flax.linen as nn
from flax.linen import initializers
from .params import CLIPConfig
Collaborator


Can we have the config in this file for consistency with the rest of the repo?

self.text_embed_dim = 1024
self.proj_dim = 1024
else:
raise ValueError("Unknown model_size: " + str(self.model_size))
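One way to address the reviewer's request to keep the config in this file is a dataclass with a preset table instead of an if/elif chain. This is a sketch only: the "large" values (1024/1024) come from the snippet above, while the "base" numbers and the field names beyond `text_embed_dim`/`proj_dim` are assumptions:

```python
import dataclasses

# Size presets; only the "large" entry matches the snippet above,
# the "base" values are illustrative placeholders.
_SIZE_PRESETS = {
    "base": {"text_embed_dim": 512, "proj_dim": 512},
    "large": {"text_embed_dim": 1024, "proj_dim": 1024},
}


@dataclasses.dataclass
class CLIPConfig:
    model_size: str = "base"
    dtype: str = "float32"
    # Derived from model_size in __post_init__, not passed by callers.
    text_embed_dim: int = dataclasses.field(init=False)
    proj_dim: int = dataclasses.field(init=False)

    def __post_init__(self):
        try:
            preset = _SIZE_PRESETS[self.model_size]
        except KeyError:
            raise ValueError(f"Unknown model_size: {self.model_size!r}") from None
        self.text_embed_dim = preset["text_embed_dim"]
        self.proj_dim = preset["proj_dim"]
```

A table keeps all size variants visible in one place and makes adding a new preset a one-line change.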
Collaborator


For users interested in inference, could you add functionality to transfer parameters from a pretrained model?
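A common first step for such a transfer is a pure name-mapping pass from checkpoint keys to the new module tree. The sketch below is entirely hypothetical: both the source-side key patterns (Hugging-Face-style names) and the target-side names are illustrative, not the PR's actual layout:

```python
import re

# Hypothetical renames from Hugging-Face-style CLIP checkpoint keys to an
# assumed target module tree; every name on both sides is illustrative.
_RENAMES = [
    (r"^text_model\.", "text_encoder."),
    (r"^vision_model\.", "vision_encoder."),
    (r"\.self_attn\.", ".attention."),
    (r"\.layer_norm1\.", ".ln_1."),
]


def convert_param_name(src_name: str) -> str:
    """Map one checkpoint parameter name onto the target module tree."""
    out = src_name
    for pattern, repl in _RENAMES:
        out = re.sub(pattern, repl, out)
    return out
```

With a mapping like this in place, the actual weight transfer reduces to walking the checkpoint dict, renaming each key, and (where layouts differ, e.g. transposed kernels) reshaping the arrays.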

@chapman20j
Collaborator

Hi @Thakor-Yashpal. Just following up on this PR. I left a few comments but thought I'd ask how things are going. Are you still actively developing CLIP?

@Thakor-Yashpal
Contributor Author

> Hi @Thakor-Yashpal. Just following up on this PR. I left a few comments but thought I'd ask how things are going. Are you still actively developing CLIP?

Yes, I am! My exams finish tomorrow, and once they're done I’ll be back to working on this PR. I’ll start addressing the comments and updating everything by tomorrow. Thanks for checking in!


4 participants