llama-swap (llama.cpp) provider error: API request failed: tools param requires --jinja flag #39074
-
Hi, you can either start your server with the `--jinja` flag so that tools are supported (`./server --jinja ...`; see the sketch below the config), or disable the tools capability for that model in your `openai_compatible` settings:

```jsonc
{
  "language_models": {
    "openai_compatible": {
      "Mac Studio llama-swap": {
        "api_url": "http://<url>/v1",
        "available_models": [
          {
            "name": "qwen3-coder:a3b",
            "display_name": null,
            "max_tokens": 128000,
            "max_output_tokens": 32000,
            "max_completion_tokens": 128000,
            "capabilities": {
              "tools": false, // <--- Disable tools here
              "images": false,
              "parallel_tool_calls": false,
              "prompt_cache_key": false
            }
          }
        ]
      }
    }
  }
}
```

Or can you provide the setup for your server as well? Thanks!
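For reference, here is roughly what enabling the flag looks like on the upstream server. This is a minimal sketch, not your exact setup: the binary name, model path, port, and context size are placeholders, and with llama-swap this command would typically live in the model's `cmd:` entry of its config.yaml rather than being run by hand.

```sh
# Start llama-server with Jinja chat templates enabled so that the
# OpenAI-compatible /v1/chat/completions endpoint accepts the "tools" param.
# The model path, port, and context size below are illustrative placeholders.
./llama-server \
  -m /models/qwen3-coder-a3b.gguf \
  --port 8080 \
  -c 128000 \
  --jinja
```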
-
Hey @coleleavitt, I'm not the OP, but I had the same issue; that's how I found this thread. Your provided repo works, and I am able to do completions that way.
-
@coleleavitt The new discussion is at #39126.
-
Thanks @coleleavitt, do you still need me to test out the fork, or is it already in a PR?
-
I set up a llama-swap (OpenAI API-compatible) endpoint as follows:
I am getting the error:

```
API request to http://192.168.86.215:9292/v1 failed: tools param requires --jinja flag
```

It occurs even if I toggle the tools capability to false.
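One way to confirm whether the rejection comes from llama.cpp itself rather than the editor is to send a request containing a `tools` array straight to the endpoint. A minimal sketch; the model name and the `get_time` tool definition are illustrative, not taken from the actual server configuration:

```sh
# Send a chat completion with a "tools" array directly to the llama-swap endpoint.
# If the underlying llama-server was launched without --jinja, the request
# should fail with the same "tools param requires --jinja flag" error.
curl http://192.168.86.215:9292/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen3-coder:a3b",
    "messages": [{"role": "user", "content": "What time is it?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_time",
        "description": "Return the current time",
        "parameters": {"type": "object", "properties": {}}
      }
    }]
  }'
```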