Preflight Checklist
Problem/Use Case Description
Feature Description
When configuring a custom AI provider, it would be helpful to automatically fetch and display available models from the provider's API.
Current Behavior
Users must manually type the model ID, which requires looking up documentation.
Benefits
- Reduces configuration errors
- Improves user experience
- Works with any OpenAI-compatible API
Proposed Solution
- Add a "Fetch Models" button next to the API endpoint field
- Call the provider's /v1/models endpoint to get available models
- Display models in a searchable dropdown
- Allow users to select from the list instead of typing manually
Feature Type
New functionality
Additional Context
Most OpenAI-compatible providers (Ollama, OpenRouter, etc.) support the /v1/models endpoint.
Contribution