Conversation

@s0up s0up commented Sep 16, 2025

No description provided.

… use max_completion_tokens via extra_body; hide unsupported options in UI; improve length handling and reduce overhead
@boxman0617 boxman0617 left a comment


This is exactly what I was looking for! LGTM! Thanks random stranger!!!

Owner

@jekalmin jekalmin left a comment


Thanks for your work!
It would be great if the models' supported parameters were managed in YAML and kept in sync with the options.

Comment on lines +416 to +418
tokens_value = min(max(tokens_value * 2, observed + 256), max_cap)
if attempt < 3:
continue
Owner


Could you explain this in detail?
I'd like to understand the reasoning behind retrying even when max_tokens has been reached.
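For context, the escalation step in the snippet above can be isolated into a small helper (a minimal standalone sketch; the names `observed` and `max_cap` are taken from the snippet, and the surrounding retry loop with `attempt < 3` is assumed to wrap it):

```python
def next_token_budget(tokens_value: int, observed: int, max_cap: int) -> int:
    """Grow the completion-token budget after a truncated response.

    Doubles the previous budget, but never grows to less than the
    observed token usage plus 256 tokens of headroom, and never
    exceeds the model's hard cap.
    """
    return min(max(tokens_value * 2, observed + 256), max_cap)
```

Under this reading, the retry fires when a response comes back length-truncated: the budget is enlarged and the request is reissued (up to three attempts), so a hit on the limit is treated as "budget too small" rather than a terminal error.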

if _SUPPORTED_PARAMS_SPEC is not None:
return _SUPPORTED_PARAMS_SPEC

spec_path = os.path.join(os.path.dirname(__file__), "supported_parameters.yaml")
Owner


Could you resolve this warning?

WARNING (MainThread) [homeassistant.util.loop] Detected blocking call to open with args ('/workspaces/home-assistant-core/config/custom_components/extended_openai_conversation/supported_parameters.yaml', 'r')
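One way to resolve this warning (a sketch, not necessarily the fix the PR adopted) is to push the blocking `open` into an executor thread. Home Assistant's idiomatic call for this is `hass.async_add_executor_job`; the plain-asyncio equivalent assumed below is `run_in_executor`:

```python
import asyncio


def _load_spec_blocking(path: str) -> str:
    # Plain blocking file I/O; safe only off the event loop.
    with open(path, "r") as f:
        return f.read()


async def async_load_spec(path: str) -> str:
    # Defer the blocking read to the default thread pool so the event
    # loop stays responsive. Inside Home Assistant, the equivalent is:
    #   await hass.async_add_executor_job(_load_spec_blocking, path)
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, _load_spec_blocking, path)
```

The result could then be cached in `_SUPPORTED_PARAMS_SPEC` exactly as the existing code does, so the file is read at most once.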

3 participants