Replies: 5 comments
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
@ONLY-yours @sxjeru - This is a model parameter configuration issue. The error indicates that temperature and top_p cannot both be specified for a particular model. Please investigate the model settings validation logic.
This error happens because certain models, especially Claude 4+ models accessed via the OpenAI or Bedrock APIs, do not allow both temperature and top_p to be set at the same time. If both are included in the request, the API returns an error asking you to use only one parameter. This is a known issue tracked in the repo (see issue #9658).

LobeChat has logic to handle this for most providers: for affected models (like claude-opus-4-1, claude-sonnet-4-5, etc.), if you set temperature, only temperature is sent; if you leave temperature unset, only top_p is sent (see PR #9627, PR #9476). However, if you use the OpenAI channel to call Claude models, this conflict can still occur because the OpenAI proxy does not resolve it.

To avoid the error:
- Use only one of the sliders (temperature or top_p) for these models.
- If you want to use both parameters, switch to a model/provider that supports them together.
- For Claude models, the recommended solution is to use the Anthropic channel directly instead of the OpenAI channel (details).
- Some OpenAI models (like gpt-4o-search-preview) do not support either parameter at all, so for those, both temperature and top_p should be left unset (see here).

For more details and updates, you can follow the linked issues and PRs.
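For illustration, the resolution rule described above (prefer temperature when set, otherwise fall back to top_p, and strip both for models that support neither) could be sketched roughly as follows. This is a minimal sketch, not LobeChat's actual implementation; the model lists and function name are assumptions for the example:

```typescript
interface SamplingParams {
  temperature?: number;
  top_p?: number;
}

// Assumed example list: models that reject requests containing both parameters.
const EXCLUSIVE_PARAM_MODELS = new Set(['claude-opus-4-1', 'claude-sonnet-4-5']);

// Assumed example list: models that reject both parameters entirely.
const NO_SAMPLING_PARAM_MODELS = new Set(['gpt-4o-search-preview']);

function resolveSamplingParams(model: string, params: SamplingParams): SamplingParams {
  if (NO_SAMPLING_PARAM_MODELS.has(model)) {
    // Send neither parameter; the provider falls back to its defaults.
    return {};
  }
  if (EXCLUSIVE_PARAM_MODELS.has(model)) {
    // Prefer temperature when the user set it; otherwise fall back to top_p.
    if (params.temperature !== undefined) return { temperature: params.temperature };
    if (params.top_p !== undefined) return { top_p: params.top_p };
    return {};
  }
  // Unaffected models can receive both parameters as-is.
  return { ...params };
}
```

The key design point is that the conflict is resolved before the request leaves the client, which is why calling Claude through a pass-through OpenAI proxy reintroduces the error: the proxy forwards both parameters untouched.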
Found 3 possible duplicate issues:
This issue will be automatically closed as a duplicate in 3 days.
This is an issue from the upstream model provider. Some models indeed do not support specifying both parameters simultaneously.
📱 Client Type
Web (Desktop Browser)
💻 Operating System
Ubuntu
📦 Deployment Platform
Other
🔧 Deployment Mode
server db (lobe-chat-database image)
📌 Version
v1.142.2
🌐 Browser
Chrome
🐛 Bug Description
The issue is similar to #9658.
It works only after disabling the temperature setting.
📷 Recurrence Steps
No response
🚦 Expected Behavior
No response
📝 Additional Information
No response
🛠️ Willing to Submit a PR?
None
✅ Validations