
Conversation

@devdattatalele (Contributor) commented Dec 1, 2025

Fixes #6497

The AI chat was timing out after 5 minutes for slow models like self-hosted Ollama. This was caused by a hardcoded 300-second timeout in the HTTP client.

This PR makes the timeout configurable via the AI_REQUEST_TIMEOUT_SECONDS environment variable, with a default of 1 hour (3600 seconds). Values outside the valid range (1-86400 seconds) fall back to the default with a warning.
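
For reference, a minimal sketch of the fallback logic described above, assuming a helper that reads the variable when the HTTP client is built. The function and constant names here (`ai_request_timeout`, `DEFAULT_AI_REQUEST_TIMEOUT_SECS`, etc.) are hypothetical and may not match the actual identifiers in ai.rs:

```rust
use std::time::Duration;

// Hypothetical constant names; the actual identifiers in ai.rs may differ.
const DEFAULT_AI_REQUEST_TIMEOUT_SECS: u64 = 3_600; // 1 hour
const MIN_AI_REQUEST_TIMEOUT_SECS: u64 = 1;
const MAX_AI_REQUEST_TIMEOUT_SECS: u64 = 86_400; // 24 hours

/// Read AI_REQUEST_TIMEOUT_SECONDS; fall back to the default (with a warning)
/// when the value is non-numeric or outside the 1-86400 range.
fn ai_request_timeout() -> Duration {
    let secs = match std::env::var("AI_REQUEST_TIMEOUT_SECONDS") {
        Ok(raw) => match raw.parse::<u64>() {
            Ok(s) if (MIN_AI_REQUEST_TIMEOUT_SECS..=MAX_AI_REQUEST_TIMEOUT_SECS).contains(&s) => s,
            _ => {
                // Assumes the `tracing` crate for the warning log.
                tracing::warn!(
                    "invalid AI_REQUEST_TIMEOUT_SECONDS value '{}', using default of {}s",
                    raw,
                    DEFAULT_AI_REQUEST_TIMEOUT_SECS
                );
                DEFAULT_AI_REQUEST_TIMEOUT_SECS
            }
        },
        // Variable not set: silently use the default.
        Err(_) => DEFAULT_AI_REQUEST_TIMEOUT_SECS,
    };
    Duration::from_secs(secs)
}
```

The resulting Duration would then replace the previous hardcoded 300 seconds when configuring the HTTP client's timeout (for example, via reqwest's ClientBuilder::timeout).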

Changes:

- Added a configurable timeout in backend/windmill-api/src/ai.rs
- Registered the env var in backend/windmill-common/src/global_settings.rs
- Replaced magic numbers with named constants
- Improved error handling and logging

Note for deployment: if you run Windmill behind NGINX or a similar reverse proxy, ensure proxy_read_timeout is set to at least the same value as the AI timeout, otherwise the proxy will close the connection before the backend timeout is reached (see the sketch below).
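
As an illustration, a minimal sketch of the kind of NGINX directives the note refers to, assuming Windmill is served behind a standard proxied location block; the path and upstream name are placeholders for your deployment:

```nginx
location / {
    proxy_pass http://windmill_backend;  # placeholder upstream name
    # Must be >= AI_REQUEST_TIMEOUT_SECONDS, otherwise NGINX closes the
    # upstream connection before the backend timeout is reached.
    proxy_read_timeout 3600s;
    proxy_send_timeout 3600s;
}
```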

Testing: Validated timeout parsing logic for valid/invalid inputs and confirmed backward compatibility.

Add AI_REQUEST_TIMEOUT_SECONDS environment variable (default 3600s)
to fix timeout issues with slow AI models like self-hosted Ollama.

Previously hardcoded at 300 seconds, causing legitimate long-running
requests to fail.

Fixes windmill-labs#6497
github-actions bot commented Dec 1, 2025

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.

@devdattatalele (Contributor, Author)

I have read the CLA Document and I hereby sign the CLA

Add comprehensive documentation about reverse proxy timeout requirements.
Without proper NGINX/proxy configuration, connections will still time out
at the proxy layer regardless of backend timeout settings.

Enhanced documentation includes:
- CRITICAL warning about proxy configuration requirement
- Example NGINX configuration snippet
- Explanation of proxy vs backend timeout interaction

This addresses the root cause in issue windmill-labs#6497, where logs showed
"upstream prematurely closed connection", indicating a proxy-level timeout.

Part of windmill-labs#6497
github-actions bot added a commit that referenced this pull request Dec 2, 2025
@rubenfiszel merged commit 764e1e1 into windmill-labs:main on Dec 2, 2025
1 of 2 checks passed
github-actions bot locked and limited the conversation to collaborators on Dec 2, 2025
@rubenfiszel (Contributor)

Thanks!

Development

Successfully merging this pull request may close these issues:

Bug: Windmill's AI Chat times out if the request takes more than 5 minutes when using customai resource