Could we get updated documentation on the different configurations?

I'm trying to use this for the first time, so I can't compare with how it was before, but one good thing about versions is their immutability: it would be great if the docs mentioned tested version combinations (CAI framework version + Ollama version, etc.), so one can reproduce a setup locally when the "latest" versions fail.
Tried to use a custom OPENAI_BASE_URL with OCI Generative AI (which provides a LiteLLM endpoint) and couldn't get it to work:
```shell
export OPENAI_API_KEY="..."
export OPENAI_BASE_URL="https://....oci.oraclecloud.com/.../litellm/v1"
export CAI_MODEL=openai/gpt-4.1
export CAI_PRICE_LIMIT=""
export CAI_DEBUG="1"
export PROMPT_TOOLKIT_NO_CPR=1
export CAI_STREAM=false
```
One thing that took me a while was that the `openai/` prefix is required; otherwise OPENAI_BASE_URL is silently ignored. I'm not sure I ever found this in the docs, in one of these issues, or in the code, and I spent about an hour (starting from the Quickstart docs) trying to understand what I was missing.
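The prefix behavior can be sketched like this (an illustration of LiteLLM-style provider routing, not the actual CAI/litellm code): the part before the first `/` selects the provider adapter, and without that prefix the custom base URL is never consulted.

```shell
# Illustrative only: how a "provider/model" string is typically split.
# Without the "openai/" prefix, a different provider path is taken and
# OPENAI_BASE_URL ends up being ignored.
model="openai/gpt-4.1"
provider="${model%%/*}"   # part before the first slash
name="${model#*/}"        # remainder after the first slash
echo "$provider $name"
```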
Now I'm getting a different error:

```
litellm.exceptions.APIError: litellm.APIError: APIError: OpenAIException - 'str' object has no attribute 'model_dump'
```
Tried to use Ollama
```shell
export OPENAI_API_KEY="fake"
export OPENAI_BASE_URL=""
export CAI_MODEL=ollama/qwen2.5:7b
export OLLAMA=""
export OLLAMA_API_BASE=http://localhost:11434/v1
export CAI_PRICE_LIMIT="0"
export CAI_DEBUG="1"
export PROMPT_TOOLKIT_NO_CPR=1
export CAI_STREAM=false
```
Again, the `ollama/` prefix is not very clear in the docs.

And this setup threw an error:

```
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: OllamaException - 404 page not found
```
In the Ollama logs, `/v1/api/show` and `/v1/api/generate` were returning 404.
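A plausible explanation for the 404 (an assumption based on the log lines above, not confirmed against the litellm source): the `ollama/` provider path targets Ollama's native API under `/api/...`, so a base URL that already ends in `/v1` produces paths the server never registers.

```shell
# Assumption: the Ollama provider appends native-API paths such as
# /api/show to OLLAMA_API_BASE. With a base that already ends in /v1,
# the resulting URL is one Ollama does not serve, hence the 404.
base="http://localhost:11434/v1"
echo "${base%/}/api/show"
```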
Tried to use Ollama as if it were not Ollama:
```shell
export OPENAI_API_KEY="..."
export OPENAI_BASE_URL="http://127.0.0.1:11434/v1"
export OLLAMA_API_BASE=""
export CAI_MODEL=ollama/qwen2.5:7b
export CAI_PRICE_LIMIT=""
export CAI_DEBUG="1"
export PROMPT_TOOLKIT_NO_CPR=1
export CAI_STREAM=false
```
This is what finally got CAI going for me.
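A sketch of why this setup works (assuming Ollama's standard OpenAI-compatible endpoint): pointing OPENAI_BASE_URL at `/v1` means chat requests go to a route Ollama actually serves, rather than the `/v1/api/*` paths that 404'd earlier.

```shell
# With the OpenAI-compatible route, clients hit /v1/chat/completions,
# which Ollama serves, instead of the native /api/* paths.
base="http://127.0.0.1:11434/v1"
echo "${base%/}/chat/completions"
```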
However, I still need to search the docs to understand why typing `hello` in the TUI makes it start a random CTF exercise analyzing a non-existent Linux binary...
On releases:

The last tag I see is 0.5.9, but pip installed 0.5.10 for me.

No CLI flags exist, unlike what is described in the docs...