Conversation

@dheerajoruganty
Contributor

This PR adds LiteLLM support for vLLM and Ollama.

Tested with Ollama and vLLM so far.
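
For reference, here is a minimal sketch of how requests could be routed through LiteLLM to a local Ollama server and to a vLLM OpenAI-compatible server. The model names, ports, and provider prefixes below are illustrative placeholders, not necessarily the exact configuration used in this PR.

```python
# Minimal sketch: calling a local Ollama server and a vLLM OpenAI-compatible
# server through LiteLLM. Model names and endpoints are placeholders.
from litellm import completion

messages = [{"role": "user", "content": "Hello, world!"}]

# Ollama: LiteLLM routes via the "ollama/" provider prefix; api_base points
# at the default local Ollama endpoint.
ollama_response = completion(
    model="ollama/llama3",              # placeholder model name
    messages=messages,
    api_base="http://localhost:11434",  # default Ollama port
)

# vLLM: assuming the model is served with vLLM's OpenAI-compatible server,
# addressed here with the "hosted_vllm/" provider prefix and an explicit api_base.
vllm_response = completion(
    model="hosted_vllm/meta-llama/Llama-3-8B-Instruct",  # placeholder model name
    messages=messages,
    api_base="http://localhost:8000/v1",                 # default vLLM server port
)

print(ollama_response.choices[0].message.content)
print(vllm_response.choices[0].message.content)
```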

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.
