Hello memU,

Could you please provide an LLM backend for Ollama or vLLM, so we can run this project against a local server?
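For context on the request: both Ollama and vLLM serve an OpenAI-compatible `/v1/chat/completions` endpoint (Ollama on port 11434 by default, vLLM on port 8000), so a local backend could likely reuse the same request shape an OpenAI backend sends, just with a different base URL. Below is a minimal stdlib-only sketch of building such a request; the model name and the `build_chat_request` helper are illustrative assumptions, not part of memU.

```python
import json

# Default base URLs for the two local servers (both OpenAI-compatible).
OLLAMA_BASE_URL = "http://localhost:11434/v1"  # Ollama's default port
VLLM_BASE_URL = "http://localhost:8000/v1"     # vLLM's default port

def build_chat_request(base_url: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Return the endpoint URL and JSON body for an OpenAI-style chat call.

    Hypothetical helper for illustration: a local backend would POST this
    body to the returned URL instead of api.openai.com.
    """
    url = f"{base_url}/chat/completions"
    body = json.dumps({
        "model": model,  # example model name; whatever is pulled/served locally
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, body

url, body = build_chat_request(OLLAMA_BASE_URL, "llama3", "Hello!")
print(url)  # http://localhost:11434/v1/chat/completions
```

Because the wire format matches OpenAI's, supporting these backends may mostly be a matter of making the base URL (and API key, which local servers usually ignore) configurable.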