How to specify api base for Guardrails server? #1306
Unanswered · grudloffev asked this question in Q&A · Replies: 1 comment
@grudloffev Exactly the same doubt. Also, why do we even need a server? That's my question: can't we just use Guardrails directly, as a normal function?
All the examples show how to set up the server by creating a `config.py` file and setting the OpenAI token through an environment variable (see, for instance, https://www.guardrailsai.com/docs/getting_started/guardrails_server). It is reasonably clear how I could manage with other providers, but not with a self-hosted instance. The thing is that with OpenAI you never specify the API base URL: the OpenAI endpoint is picked up automatically because the OpenAI env var was set, so it's not clear how to do this in the general case. The following is what I have locally and want to replicate on the server side:
In short: how would I set the API base from the Guardrails server so that LiteLLM picks it up?
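One plausible approach, hedged: LiteLLM documents both an `api_base` keyword on `litellm.completion` and the `OPENAI_API_BASE` environment variable for OpenAI-compatible endpoints, so exporting that variable in the environment the Guardrails server starts in should redirect its LiteLLM calls. The URL and key below are illustrative placeholders, not values from this thread:

```python
import os

# Assumption: LiteLLM (used under the hood by the Guardrails server) resolves
# OpenAI-style endpoints from OPENAI_API_BASE, so setting it before the server
# starts points calls at a self-hosted instance instead of api.openai.com.
os.environ["OPENAI_API_BASE"] = "http://localhost:8000/v1"  # placeholder URL
os.environ["OPENAI_API_KEY"] = "sk-anything"  # many self-hosted servers ignore it

# When calling LiteLLM directly (outside the server), the same base URL can be
# passed per call via the documented `api_base` keyword:
#   litellm.completion(
#       model="openai/my-local-model",
#       api_base="http://localhost:8000/v1",
#       messages=[{"role": "user", "content": "hi"}],
#   )
print(os.environ["OPENAI_API_BASE"])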