Add http_client support for OpenAI Embedding Function #5222

Open
wants to merge 1 commit into main

Conversation

danilvalov

Description of changes

This PR adds custom httpx client support to the OpenAI embedding function.
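
A minimal usage sketch (not taken from the PR diff): the proxy URL, CA bundle path, and API key below are placeholders, and everything except the new `http_client` parameter is assumed to match the existing `OpenAIEmbeddingFunction` signature.

```python
import httpx
from chromadb.utils.embedding_functions import OpenAIEmbeddingFunction

# Placeholder values: route requests through an internal proxy, raise the
# default timeout, and trust an internal CA bundle.
custom_client = httpx.Client(
    proxy="http://proxy.internal:8080",
    timeout=httpx.Timeout(30.0),
    verify="/etc/ssl/certs/internal-ca.pem",
)

openai_ef = OpenAIEmbeddingFunction(
    api_key="sk-...",
    model_name="text-embedding-3-small",
    http_client=custom_client,  # new parameter: forwarded to the underlying OpenAI SDK client
)

embeddings = openai_ef(["hello world"])
```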

Test plan

I tested this as part of my project.

  • Tests pass locally with pytest for Python, yarn test for JS, cargo test for Rust

Migration plan

No migrations or changes required

Documentation Changes

Since this is an optional feature needed by only a small number of developers, I don't think it needs to be included in the documentation (the parameter is already documented in the OpenAI client itself).

github-actions bot commented Aug 7, 2025

Reviewer Checklist

Please leverage this checklist to ensure your code review is thorough before approving

Testing, Bugs, Errors, Logs, Documentation

  • Can you think of any use case in which the code does not behave as intended? Have they been tested?
  • Can you think of any inputs or external events that could break the code? Is user input validated and safe? Have they been tested?
  • If appropriate, are there adequate property based tests?
  • If appropriate, are there adequate unit tests?
  • Should any logging, debugging, tracing information be added or removed?
  • Are error messages user-friendly?
  • Have all documentation changes needed been made?
  • Have all non-obvious changes been commented?

System Compatibility

  • Are there any potential impacts on other parts of the system or backward compatibility?
  • Does this change intersect with any items on our roadmap, and if so, is there a plan for fitting them together?

Quality

  • Is this code of an unexpectedly high quality (readability, modularity, intuitiveness)?

Add Custom httpx Client Support to OpenAI Embedding Function

This PR introduces support for injecting a custom httpx.Client into the OpenAIEmbeddingFunction class in chromadb. By allowing an optional http_client parameter, users can now override default HTTP client behavior (e.g., for custom transport settings, timeouts, proxies) when generating embeddings. The support is applied both to standard OpenAI API usage and Azure OpenAI deployments. All necessary code paths respect the presence of http_client, including configuration persistence and construction utilities.

Key Changes

• Added an optional http_client (httpx.Client) parameter to OpenAIEmbeddingFunction constructor.
• Integrated http_client propagation through both OpenAI and Azure OpenAI client initialization (see the sketch after this list).
• Updated docstrings to describe the new parameter and its intended usage.
• Ensured build_from_config() and get_config() methods support the new http_client parameter.
• Added handling to store and restore http_client within the class configuration.
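
A hedged sketch of the propagation pattern described above, not the actual implementation: class and attribute names are simplified for illustration, and the real openai_embedding_function.py additionally handles Azure branching, validation, and config persistence.

```python
from typing import List, Optional

import httpx
import openai


class OpenAIEmbeddingFunctionSketch:
    """Illustrative only: shows how an optional http_client can be forwarded."""

    def __init__(
        self,
        api_key: str,
        model_name: str = "text-embedding-3-small",
        http_client: Optional[httpx.Client] = None,  # the new optional parameter
    ):
        self._model_name = model_name
        # Forward the custom client only when one was supplied, so the OpenAI
        # SDK keeps its default transport otherwise. The Azure path
        # (openai.AzureOpenAI) would receive http_client the same way.
        kwargs = {"api_key": api_key}
        if http_client is not None:
            kwargs["http_client"] = http_client
        self._client = openai.OpenAI(**kwargs)

    def __call__(self, input: List[str]) -> List[List[float]]:
        response = self._client.embeddings.create(input=input, model=self._model_name)
        return [item.embedding for item in response.data]
```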

Affected Areas

• chromadb/utils/embedding_functions/openai_embedding_function.py

This summary was automatically generated by @propel-code-bot

@alex-mcanulty

Man, this feature would really help me out right now... We're trying to make chroma work with an embedding model endpoint that we're serving internally with vLLM (OpenAI proxy endpoint)
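
A hypothetical sketch of that use case, assuming the embedding function's existing api_base parameter; the endpoint URL, model name, and certificate path are placeholders for whatever the internal vLLM deployment serves.

```python
import httpx
from chromadb.utils.embedding_functions import OpenAIEmbeddingFunction

# Trust the internal CA used by the vLLM deployment (placeholder path).
internal_client = httpx.Client(verify="/etc/ssl/certs/internal-ca.pem")

vllm_ef = OpenAIEmbeddingFunction(
    api_key="not-needed",                      # vLLM's OpenAI-compatible server typically ignores the key
    api_base="https://vllm.internal:8000/v1",  # placeholder internal endpoint
    model_name="BAAI/bge-large-en-v1.5",       # placeholder: whatever model vLLM serves
    http_client=internal_client,               # the parameter added by this PR
)
```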
