### Describe the need of your request

Is it possible to use, for example, an OpenAI-compatible server for chat, but a local LLM for automatic code completions? (A rough sketch of the intended split is included after this issue body.)

### Proposed solution

_No response_

### Additional context

_No response_
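For concreteness, here is a minimal sketch of the requested split: chat requests go to a remote OpenAI-compatible server, while code completions go to a local model. It assumes both endpoints speak the OpenAI-compatible API (as Ollama and llama.cpp servers commonly do); all URLs, keys, and model names below are placeholders, not values from any specific tool's configuration.

```python
# Sketch only: two clients, one per concern. Placeholder URLs/keys/models.
from openai import OpenAI

# Remote OpenAI-compatible server used for chat.
chat_client = OpenAI(
    base_url="https://api.example.com/v1",  # placeholder remote endpoint
    api_key="CHAT_API_KEY",                 # placeholder key
)

# Local LLM server (e.g. Ollama or llama.cpp) used for autocompletion.
completion_client = OpenAI(
    base_url="http://localhost:11434/v1",  # placeholder local endpoint
    api_key="unused",                       # many local servers ignore the key
)

def ask_chat(prompt: str) -> str:
    """Send a chat message to the remote OpenAI-compatible server."""
    response = chat_client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder remote model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

def autocomplete(code_prefix: str) -> str:
    """Request a short code completion from the local model."""
    response = completion_client.completions.create(
        model="qwen2.5-coder",  # placeholder local model name
        prompt=code_prefix,
        max_tokens=64,
    )
    return response.choices[0].text
```

The feature request amounts to exposing this kind of per-task model selection in configuration, so the chat backend and the autocompletion backend can be set independently.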