Use different models for chat and code completions #1025

Open
@Prikalel

Description

Describe the need for your request

Is it possible to use, for example, an OpenAI-compatible server for chat, but a local LLM for automatic code completions?
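A minimal sketch of the requested split, assuming the `openai` Python package and that both backends expose an OpenAI-compatible API; the endpoints, API keys, and model names below are placeholders for illustration, not anything this project currently provides:

```python
# Hypothetical routing sketch: chat goes to a remote OpenAI-compatible server,
# inline code completions go to a local LLM server.
from openai import OpenAI

# Remote OpenAI-compatible server, used only for chat.
chat_client = OpenAI(
    base_url="https://api.example.com/v1",  # placeholder remote endpoint
    api_key="REMOTE_KEY",
)

# Local LLM (e.g., llama.cpp or Ollama), used only for code completions.
completion_client = OpenAI(
    base_url="http://localhost:11434/v1",  # placeholder local endpoint
    api_key="unused-for-local",
)

def chat(messages):
    """Route conversational requests to the remote chat model."""
    return chat_client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder chat model
        messages=messages,
    )

def complete(prefix: str):
    """Route inline completion requests to the local completion model."""
    return completion_client.completions.create(
        model="qwen2.5-coder",  # placeholder local completion model
        prompt=prefix,
        max_tokens=64,
    )
```

The point is only that chat traffic and completion traffic would go through two independently configured clients.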

Proposed solution

No response

Additional context

No response

Metadata

Assignees

No one assigned

    Labels

    enhancement (New feature or request)

    Projects

    No projects

    Milestone

    No milestone
