Add bedrock to PROVIDER_MODELS_PREFIXES so AWS credential auth works #1951
Open
SohamKukreti wants to merge 1 commit into develop from
Conversation
Summary
LLMConfig.__init__ checks PROVIDER_MODELS_PREFIXES when api_token=None. If the provider prefix isn't found, it silently falls through to the else branch and overwrites self.provider with DEFAULT_PROVIDER (openai/gpt-4o), meaning any bedrock/* model string was replaced before the LLM call was even made.

This broke the supported Bedrock auth methods whenever api_token was not passed to LLMConfig. Only passing api_token=<bearer_token> explicitly worked, because a truthy api_token bypasses the prefix check entirely.

Adding "bedrock": None to PROVIDER_MODELS_PREFIXES keeps self.provider intact so the correct Bedrock provider is used. The actual auth (SigV4 signing or a Bearer header) is handled downstream based on what credentials are available in the environment.
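The fall-through described above can be sketched as follows. This is a simplified reconstruction for illustration, not crawl4ai's actual code: the class body, the example model string, and the env-var values in the dict are assumptions; only the names LLMConfig, PROVIDER_MODELS_PREFIXES, DEFAULT_PROVIDER, and the "bedrock": None entry come from the PR description.

```python
DEFAULT_PROVIDER = "openai/gpt-4o"

# Hypothetical mapping of provider prefix -> credential hint. A value of None
# means no static token is looked up here; auth is resolved downstream
# (e.g. Bedrock via SigV4 signing or a Bearer header).
PROVIDER_MODELS_PREFIXES = {
    "openai": "OPENAI_API_KEY",   # illustrative entry, an assumption
    "bedrock": None,              # the fix: keeps bedrock/* from falling through
}

class LLMConfig:
    """Simplified sketch of the __init__ logic the PR describes."""

    def __init__(self, provider=DEFAULT_PROVIDER, api_token=None):
        if api_token:
            # A truthy api_token bypasses the prefix check entirely,
            # which is why explicit bearer tokens always worked.
            self.provider, self.api_token = provider, api_token
        else:
            prefix = provider.split("/")[0]
            if prefix in PROVIDER_MODELS_PREFIXES:
                # Provider string preserved; credentials resolved downstream.
                self.provider, self.api_token = provider, None
            else:
                # Silent fall-through that clobbered bedrock/* before the fix.
                self.provider, self.api_token = DEFAULT_PROVIDER, None

# With "bedrock" in the dict, the model string survives intact.
cfg = LLMConfig(provider="bedrock/anthropic.claude-3-sonnet")
print(cfg.provider)
```

Removing the "bedrock" entry from the dict above reproduces the bug: the same call would leave cfg.provider set to openai/gpt-4o instead.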
List of files changed and why
crawl4ai/config.py: adds "bedrock": None to PROVIDER_MODELS_PREFIXES so bedrock/* provider strings are preserved.
How Has This Been Tested?
Ran LLM extraction against Bedrock with the supported auth methods, both with and without an explicit api_token.
Checklist: