fix: transform tool output content types for Responses API bridge #17543
Relevant issues
Fixes #17507
Pre-Submission checklist
- I have added testing in the `tests/litellm/` directory
- I have run `make test-unit`

Type
🐛 Bug Fix
Description
When using `litellm.completion()` with models that only support the Responses API (`gpt-5.1-codex`, `o1-pro`, `o3-pro`, etc.), LiteLLM automatically transforms the request from Chat Completions format to Responses API format. The Responses API expects tool results in its own item and content-part format, but we were sending them in Chat Completions format.
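To illustrate the difference, here is a sketch of the two shapes. The field values and `call_id` are made up for illustration; only the structure matters:

```python
# Responses API: a tool result is a top-level `function_call_output` item.
responses_item = {
    "type": "function_call_output",
    "call_id": "call_abc123",        # illustrative id
    "output": "72 degrees and sunny",
}

# Chat Completions: the same result is a `tool`-role message.
chat_message = {
    "role": "tool",
    "tool_call_id": "call_abc123",   # illustrative id
    "content": "72 degrees and sunny",
}

print(responses_item["type"], chat_message["role"])
```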
Fix
Updated `litellm/completion_extras/litellm_responses_transformation/transformation.py` to transform tool output content when it's a list:

- `{"type": "text"}` → `{"type": "output_text"}`
- `{"type": "image_url", "image_url": {"url": "..."}}` → `{"type": "input_image", "image_url": "..."}`

Tests

- `test_convert_tool_output_to_responses_format`
- `test_convert_tool_output_string_content_unchanged`
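The mapping above can be sketched roughly as follows. This is a hypothetical standalone helper, not the actual LiteLLM implementation; the function name and structure are assumptions for illustration:

```python
def convert_tool_output_content(content):
    """Sketch: rewrite Chat Completions tool-output content parts
    into Responses API content parts (illustrative, not LiteLLM's code)."""
    if isinstance(content, str):
        # String content passes through unchanged.
        return content
    converted = []
    for part in content:
        if part.get("type") == "text":
            # {"type": "text"} -> {"type": "output_text"}
            converted.append({"type": "output_text", "text": part.get("text", "")})
        elif part.get("type") == "image_url":
            # {"type": "image_url", "image_url": {"url": ...}} -> {"type": "input_image", "image_url": ...}
            image = part.get("image_url", {})
            url = image.get("url") if isinstance(image, dict) else image
            converted.append({"type": "input_image", "image_url": url})
        else:
            # Unknown part types are left as-is.
            converted.append(part)
    return converted
```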