I get the following error on the front end (hosted on a VPS, not locally; the VPS runs containers for other services as well, so they share a network managed by Nginx Proxy Manager):
An error occurred. Either the engine you requested does not exist or there was another issue processing your request. If this issue persists please contact us through our help center at help.openai.com.
docker-compose.yml
```yaml
version: '3'

services:

  chatgpt-client:
    image: soulteary/chatgpt
    restart: always
    environment:
      APP_PORT: 8090
      # the ChatGPT client domain, keep the same with chatgpt-client: `APP_HOSTNAME` option
      APP_HOSTNAME: "http://localhost:8090"
      # the ChatGPT backend upstream, or connect a sparrow dev server `"http://host.docker.internal:8091"`
      APP_UPSTREAM: "http://sparrow:8091"
    networks:
      - nginxproxymanager_default

  sparrow:
    image: soulteary/sparrow
    restart: always
    environment:
      # [Basic Settings]
      # => The ChatGPT Web Client Domain
      WEB_CLIENT_HOSTNAME: "http://chatgpt-client:8090"
      # => Service port, default: 8091
      # APP_PORT: 8091

      # [Private OpenAI API Server Settings] *optional
      # => Enable OpenAI 3.5 API
      ENABLE_OPENAI_API: "on"
      # => OpenAI API Key
      OPENAI_API_KEY: "sk-iTsAsEcReT"
      # => Enable OpenAI API Proxy
      # OPENAI_API_PROXY_ENABLE: "on"
      # => OpenAI API Proxy Address, eg: `"http://127.0.0.1:1234"` or ""
      # OPENAI_API_PROXY_ADDR: "http://127.0.0.1:1234"
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
    networks:
      - nginxproxymanager_default

networks:
  nginxproxymanager_default:
    external: true
```
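
For comparison, here is a hedged sketch of how I understand the hostname variables might look if the front end is addressed through the Nginx Proxy Manager domain rather than localhost. `chat.example.com` is only a placeholder for whatever public domain the proxy serves, and this is an assumption on my part, not a confirmed fix:

```yaml
# Assumption: the front end is reached via Nginx Proxy Manager at a public
# domain (placeholder: chat.example.com), so the client-facing hostnames point
# at that external URL, while the upstream keeps the internal service name.
services:
  chatgpt-client:
    environment:
      APP_HOSTNAME: "https://chat.example.com"        # placeholder external URL
      APP_UPSTREAM: "http://sparrow:8091"             # internal service name, unchanged
  sparrow:
    environment:
      WEB_CLIENT_HOSTNAME: "https://chat.example.com" # kept in sync with APP_HOSTNAME (placeholder)
```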