Merged
27 commits
- `448aa7f` feat: Add GoogleVertexAIConnector with settings validation and stubbe… (hxcva1, Sep 16, 2025)
- `389481a` feat: Add GoogleVertexAIConnector to LanguageModelConnector factory (hxcva1, Sep 17, 2025)
- `12d2bd6` feat: Add Google Vertex AI parameters and environment variables to in… (hxcva1, Sep 17, 2025)
- `0be2a59` feat: Implement GoogleVertexAIConnector using GeminiChatClient (hxcva1, Sep 23, 2025)
- `4c28c4c` test: Add unit tests for GoogleVertexAIConnector (hxcva1, Sep 23, 2025)
- `319f6be` test: Fix failing tests for GoogleVertexAIConnector when model is mis… (hxcva1, Sep 23, 2025)
- `4d05176` chore: Resolve conflicts between main.bicep and main.parameters.json (hxcva1, Sep 26, 2025)
- `2dff61a` refactor: Replace magic strings with private const string in GoogleVe… (hxcva1, Sep 26, 2025)
- `274a22a` Merge branch 'main' into feature/237-connector-impl-google-vertex-ai (hxcva1, Oct 8, 2025)
- `d2e7f6d` test: Add new test cases to GoogleVertexAIConnectorTests (hxcva1, Oct 8, 2025)
- `148ba33` test: Enable GoogleVertexAIConnector inheritance check in LanguageMod… (hxcva1, Oct 10, 2025)
- `282cc6f` refactor: Remove unnecessary function parameter (hxcva1, Oct 13, 2025)
- `e87a218` docs: Add Google Vertex AI (draft) (hxcva1, Oct 15, 2025)
- `1c620b4` Merge branch 'main' into feature/237-connector-impl-google-vertex-ai (hxcva1, Oct 18, 2025)
- `0c4314f` test: Add unit tests for GoogleVertexAIConnector (validation and clie… (hxcva1, Oct 21, 2025)
- `4b57d77` fix: Resolve conflicts and add GoogleVertexAI parameters (hxcva1, Oct 27, 2025)
- `a024cb7` test: Remove GoogleVertexAI from unsupported connector tests (hxcva1, Oct 27, 2025)
- `97b8a06` Update google-vertex-ai.md (tae0y, Oct 29, 2025)
- `0c2dc78` Update google-vertex-ai.md (tae0y, Oct 29, 2025)
- `f6cb575` Update README.md (tae0y, Oct 29, 2025)
- `6159b2e` Update GoogleVertexAIConnectorType.cs (tae0y, Oct 29, 2025)
- `5f8fe32` Update GoogleVertexAIConnector.cs (tae0y, Oct 29, 2025)
- `e73b518` Update GoogleVertexAIConnector.cs (tae0y, Oct 29, 2025)
- `757ca03` Update GoogleVertexAIConnectorTests.cs (tae0y, Oct 29, 2025)
- `09c8eb5` Merge branch 'main' into feature/237-connector-impl-google-vertex-ai (tae0y, Oct 29, 2025)
- `12a78cb` Update resources.bicep (tae0y, Oct 29, 2025)
- `9af34a4` Update resources.bicep (tae0y, Oct 29, 2025)
5 changes: 4 additions & 1 deletion README.md
@@ -9,7 +9,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
- [x] [Amazon Bedrock](https://docs.aws.amazon.com/bedrock)
- [x] [Azure AI Foundry](https://learn.microsoft.com/azure/ai-foundry/what-is-azure-ai-foundry)
- [x] [GitHub Models](https://docs.github.com/github-models/about-github-models)
- [ ] [Google Vertex AI](https://cloud.google.com/vertex-ai/docs)
- [x] [Google Vertex AI](https://cloud.google.com/vertex-ai/docs)
- [x] [Docker Model Runner](https://docs.docker.com/ai/model-runner)
- [x] [Foundry Local](https://learn.microsoft.com/azure/ai-foundry/foundry-local/what-is-foundry-local)
- [x] [Hugging Face](https://huggingface.co/docs)
@@ -63,6 +63,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
- [Use Amazon Bedrock](./docs/amazon-bedrock.md#run-on-local-machine)
- [Use Azure AI Foundry](./docs/azure-ai-foundry.md#run-on-local-machine)
- [Use GitHub Models](./docs/github-models.md#run-on-local-machine)
- [Use Google Vertex AI](./docs/google-vertex-ai.md#run-on-local-machine)
- [Use Docker Model Runner](./docs/docker-model-runner.md#run-on-local-machine)
- [Use Foundry Local](./docs/foundry-local.md#run-on-local-machine)
- [Use Hugging Face](./docs/hugging-face.md#run-on-local-machine)
@@ -77,6 +78,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
- [Use Amazon Bedrock](./docs/amazon-bedrock.md#run-in-local-container)
- [Use Azure AI Foundry](./docs/azure-ai-foundry.md#run-in-local-container)
- [Use GitHub Models](./docs/github-models.md#run-in-local-container)
- [Use Google Vertex AI](./docs/google-vertex-ai.md#run-in-local-container)
- [Use Docker Model Runner](./docs/docker-model-runner.md#run-in-local-container)
- ~~Use Foundry Local~~ 👉 NOT SUPPORTED
- [Use Hugging Face](./docs/hugging-face.md#run-in-local-container)
@@ -91,6 +93,7 @@ Open Chat Playground (OCP) is a web UI that is able to connect virtually any LLM
- [Use Amazon Bedrock](./docs/amazon-bedrock.md#run-on-azure)
- [Use Azure AI Foundry](./docs/azure-ai-foundry.md#run-on-azure)
- [Use GitHub Models](./docs/github-models.md#run-on-azure)
- [Use Google Vertex AI](./docs/google-vertex-ai.md#run-on-azure)
- ~~Use Docker Model Runner~~ 👉 NOT SUPPORTED
- ~~Use Foundry Local~~ 👉 NOT SUPPORTED
- [Use Hugging Face](./docs/hugging-face.md#run-on-azure)
1 change: 1 addition & 0 deletions docs/README.md
@@ -3,6 +3,7 @@
- [Amazon Bedrock](./amazon-bedrock.md)
- [Azure AI Foundry](./azure-ai-foundry.md)
- [GitHub Models](./github-models.md)
- [Google Vertex AI](./google-vertex-ai.md)
- [Docker Model Runner](./docker-model-runner.md)
- [Foundry Local](./foundry-local.md)
- [Hugging Face](./hugging-face.md)
241 changes: 241 additions & 0 deletions docs/google-vertex-ai.md
@@ -0,0 +1,241 @@
# OpenChat Playground with Google Vertex AI

This page describes how to run OpenChat Playground (OCP) with Google Vertex AI integration.

## Get the repository root

1. Get the repository root.

```bash
# bash/zsh
REPOSITORY_ROOT=$(git rev-parse --show-toplevel)
```

```powershell
# PowerShell
$REPOSITORY_ROOT = git rev-parse --show-toplevel
```

## Run on local machine

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Add Google Vertex AI API Key. Replace `{{GOOGLE_VERTEX_AI_API_KEY}}` with your key.

```bash
# bash/zsh
dotnet user-secrets --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp \
set GoogleVertexAI:ApiKey "{{GOOGLE_VERTEX_AI_API_KEY}}"
```

```powershell
# PowerShell
dotnet user-secrets --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp `
set GoogleVertexAI:ApiKey "{{GOOGLE_VERTEX_AI_API_KEY}}"
```

> To get an API Key, refer to the doc [Using Gemini API keys](https://ai.google.dev/gemini-api/docs/api-key#api-keys).
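   Before running the app, you can optionally sanity-check the key against the Gemini API `models` list endpoint. This is a minimal sketch; replace `{{GOOGLE_VERTEX_AI_API_KEY}}` with your key as above.

    ```bash
    # bash/zsh - optional sanity check; a valid key should return HTTP 200
    ENDPOINT="https://generativelanguage.googleapis.com/v1beta/models?key={{GOOGLE_VERTEX_AI_API_KEY}}"
    curl -s -o /dev/null -w "%{http_code}\n" "$ENDPOINT"
    ```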

1. Run the app. The default model OCP uses is [Gemini 2.5 Flash Lite](https://ai.google.dev/gemini-api/docs/models#gemini-2.5-flash-lite).

```bash
# bash/zsh
dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- \
--connector-type GoogleVertexAI
```

```powershell
# PowerShell
dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- `
--connector-type GoogleVertexAI
```

   Alternatively, if you want to run with a different model, say [`gemini-2.5-pro`](https://ai.google.dev/gemini-api/docs/models#gemini-2.5-pro), you can specify it as an argument.

```bash
# bash/zsh
dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- \
--connector-type GoogleVertexAI \
--model gemini-2.5-pro
```

```powershell
# PowerShell
dotnet run --project $REPOSITORY_ROOT/src/OpenChat.PlaygroundApp -- `
--connector-type GoogleVertexAI `
--model gemini-2.5-pro
```

1. Open your web browser at `http://localhost:5280` and start entering prompts.

## Run in local container

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Build a container.

```bash
docker build -f Dockerfile -t openchat-playground:latest .
```

1. Get the Google Vertex AI API key from user secrets.

    ```bash
    # bash/zsh
    API_KEY=$(dotnet user-secrets --project ./src/OpenChat.PlaygroundApp list --json | \
        sed -n '/^\/\//d; p' | jq -r '."GoogleVertexAI:ApiKey"')
    ```

```powershell
# PowerShell
$API_KEY = (dotnet user-secrets --project ./src/OpenChat.PlaygroundApp list --json | `
Select-String -NotMatch '^//(BEGIN|END)' | ConvertFrom-Json).'GoogleVertexAI:ApiKey'
```
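   When the secret is missing, `jq -r` returns the literal string `null`, so a quick guard (a sketch, using the `API_KEY` variable set above) avoids passing a bogus key to the container.

    ```bash
    # bash/zsh - fail fast if the secret was not found
    if [ -z "$API_KEY" ] || [ "$API_KEY" = "null" ]; then
      echo "GoogleVertexAI:ApiKey not found in user secrets" >&2
    else
      echo "API key loaded (${#API_KEY} characters)"
    fi
    ```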

1. Run the app. The default model OCP uses is [Gemini 2.5 Flash Lite](https://ai.google.dev/gemini-api/docs/models#gemini-2.5-flash-lite).

```bash
# bash/zsh - from locally built container
docker run -i --rm -p 8080:8080 openchat-playground:latest \
--connector-type GoogleVertexAI \
--api-key $API_KEY
```

```powershell
# PowerShell - from locally built container
docker run -i --rm -p 8080:8080 openchat-playground:latest --connector-type GoogleVertexAI `
--api-key $API_KEY
```

```bash
# bash/zsh - from GitHub Container Registry
docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest \
--connector-type GoogleVertexAI \
--api-key $API_KEY
```

```powershell
# PowerShell - from GitHub Container Registry
docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest `
--connector-type GoogleVertexAI `
  --api-key $API_KEY
```

   Alternatively, if you want to run with a different model, say [`gemini-2.5-pro`](https://ai.google.dev/gemini-api/docs/models#gemini-2.5-pro), you can specify it as an argument.

```bash
# bash/zsh - from locally built container
docker run -i --rm -p 8080:8080 openchat-playground:latest \
--connector-type GoogleVertexAI \
--api-key $API_KEY \
--model gemini-2.5-pro
```

```powershell
# PowerShell - from locally built container
docker run -i --rm -p 8080:8080 openchat-playground:latest --connector-type GoogleVertexAI `
--api-key $API_KEY `
--model gemini-2.5-pro
```

```bash
# bash/zsh - from GitHub Container Registry
docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest \
--connector-type GoogleVertexAI \
--api-key $API_KEY \
--model gemini-2.5-pro
```

```powershell
# PowerShell - from GitHub Container Registry
docker run -i --rm -p 8080:8080 ghcr.io/aliencube/open-chat-playground/openchat-playground:latest `
--connector-type GoogleVertexAI `
--api-key $API_KEY `
--model gemini-2.5-pro
```

1. Open your web browser, navigate to `http://localhost:8080`, and enter prompts.

## Run on Azure

1. Make sure you are at the repository root.

```bash
cd $REPOSITORY_ROOT
```

1. Login to Azure:

```bash
azd auth login
```

1. Check login status.

```bash
azd auth login --check-status
```

1. Initialize `azd` template.

```bash
azd init
```

   > **NOTE**: You will be asked to provide an environment name for provisioning.

1. Get the Google Vertex AI API key from user secrets.

    ```bash
    # bash/zsh
    API_KEY=$(dotnet user-secrets --project ./src/OpenChat.PlaygroundApp list --json | \
        sed -n '/^\/\//d; p' | jq -r '."GoogleVertexAI:ApiKey"')
    ```

```powershell
# PowerShell
$API_KEY = (dotnet user-secrets --project ./src/OpenChat.PlaygroundApp list --json | `
Select-String -NotMatch '^//(BEGIN|END)' | ConvertFrom-Json).'GoogleVertexAI:ApiKey'
```

1. Add the Google Vertex AI configuration to the azd environment variables.

```bash
azd env set GOOGLE_VERTEX_AI_API_KEY $API_KEY
```

   The default model OCP uses is [Gemini 2.5 Flash Lite](https://ai.google.dev/gemini-api/docs/models#gemini-2.5-flash-lite). If you want to use a different model, say [`gemini-2.5-pro`](https://ai.google.dev/gemini-api/docs/models#gemini-2.5-pro), add it to the azd environment variables.

```bash
azd env set GOOGLE_VERTEX_AI_MODEL gemini-2.5-pro
```

1. Set the connector type to `GoogleVertexAI`.

```bash
azd env set CONNECTOR_TYPE GoogleVertexAI
```
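   Before provisioning, you can confirm the values landed in the azd environment. `azd env get-values` prints `KEY="value"` pairs; the grep pattern below simply filters for the keys set in the previous steps.

    ```bash
    # bash/zsh - confirm the azd environment variables are set
    azd env get-values | grep -E '^(GOOGLE_VERTEX_AI_API_KEY|GOOGLE_VERTEX_AI_MODEL|CONNECTOR_TYPE)='
    ```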

1. Provision and deploy:

```bash
azd up
```

   > **NOTE**: You will be asked to provide an Azure subscription and location for deployment.

Once deployed, you will be able to see the deployed OCP app URL.

1. Open your web browser, navigate to the OCP app URL, and enter prompts.

1. Clean up:

```bash
azd down --force --purge
```
5 changes: 5 additions & 0 deletions infra/main.bicep
@@ -28,6 +28,9 @@ param azureAIFoundryDeploymentName string = ''
param githubModelsToken string = ''
param githubModelsModel string = ''
// Google Vertex AI
param googleVertexAIModel string = ''
@secure()
param googleVertexAIApiKey string = ''
// Docker Model Runner - NOT SUPPORTED
// Foundry Local - NOT SUPPORTED
// Hugging Face
@@ -103,6 +106,8 @@ module resources 'resources.bicep' = {
huggingFaceModel: huggingFaceModel
githubModelsToken: githubModelsToken
githubModelsModel: githubModelsModel
googleVertexAIModel: googleVertexAIModel
googleVertexAIApiKey: googleVertexAIApiKey
ollamaModel: ollamaModel
anthropicModel: anthropicModel
anthropicApiKey: anthropicApiKey
6 changes: 6 additions & 0 deletions infra/main.parameters.json
@@ -38,6 +38,12 @@
"githubModelsModel": {
"value": "${GH_MODELS_MODEL=openai/gpt-4o-mini}"
},
"googleVertexAIModel": {
"value": "${GOOGLE_VERTEX_AI_MODEL}"
},
"googleVertexAIApiKey": {
"value": "${GOOGLE_VERTEX_AI_API_KEY}"
},
"huggingFaceModel": {
"value": "${HUGGING_FACE_MODEL=hf.co/Qwen/Qwen3-0.6B-GGUF}"
},
22 changes: 21 additions & 1 deletion infra/resources.bicep
@@ -23,6 +23,9 @@ param azureAIFoundryDeploymentName string = ''
param githubModelsToken string = ''
param githubModelsModel string = ''
// Google Vertex AI
param googleVertexAIModel string = ''
@secure()
param googleVertexAIApiKey string = ''
// Docker Model Runner - NOT SUPPORTED
// Foundry Local - NOT SUPPORTED
// Hugging Face
@@ -246,6 +249,17 @@ var envGitHubModels = (connectorType == '' || connectorType == 'GitHubModels') ?
}
] : []) : []
// Google Vertex AI
var envGoogleVertexAI = connectorType == 'GoogleVertexAI' ? concat(googleVertexAIModel != '' ? [
{
name: 'GoogleVertexAI__Model'
value: googleVertexAIModel
}
] : [], googleVertexAIApiKey != '' ? [
{
name: 'GoogleVertexAI__ApiKey'
secretRef: 'google-vertex-ai-api-key'
}
] : []) : []
// Docker Model Runner - NOT SUPPORTED
// Foundry Local - NOT SUPPORTED
// Hugging Face
@@ -348,6 +362,11 @@ module openchatPlaygroundApp 'br/public:avm/res/app/container-app:0.18.1' = {
name: 'github-models-token'
value: githubModelsToken
}
] : [], googleVertexAIApiKey != '' ? [
{
name: 'google-vertex-ai-api-key'
value: googleVertexAIApiKey
}
] : [], anthropicApiKey != '' ? [
{
name: 'anthropic-api-key'
@@ -390,6 +409,7 @@ module openchatPlaygroundApp 'br/public:avm/res/app/container-app:0.18.1' = {
envAmazonBedrock,
envAzureAIFoundry,
envGitHubModels,
envGoogleVertexAI,
envHuggingFace,
envOllama,
envAnthropic,
@@ -486,4 +506,4 @@ module ollama 'br/public:avm/res/app/container-app:0.18.1' = if (useOllama == tr
}

output AZURE_CONTAINER_REGISTRY_ENDPOINT string = containerRegistry.outputs.loginServer
output AZURE_RESOURCE_OPENCHAT_PLAYGROUNDAPP_ID string = openchatPlaygroundApp.outputs.resourceId
output AZURE_RESOURCE_OPENCHAT_PLAYGROUNDAPP_ID string = openchatPlaygroundApp.outputs.resourceId
@@ -39,6 +39,7 @@ public static async Task<IChatClient> CreateChatClientAsync(AppSettings settings
ConnectorType.AmazonBedrock => new AmazonBedrockConnector(settings),
ConnectorType.AzureAIFoundry => new AzureAIFoundryConnector(settings),
ConnectorType.GitHubModels => new GitHubModelsConnector(settings),
ConnectorType.GoogleVertexAI => new GoogleVertexAIConnector(settings),
ConnectorType.DockerModelRunner => new DockerModelRunnerConnector(settings),
ConnectorType.FoundryLocal => new FoundryLocalConnector(settings),
ConnectorType.HuggingFace => new HuggingFaceConnector(settings),