
Conversation

@dariuszkowalski-com
Contributor

Pull Request: Fix custom providers functionality (#1816)

Summary

Resolves the fundamental issue where custom providers defined in configuration files were not appearing in the CLI provider selection menu, making them completely unusable despite being properly configured.

Problem Analysis

The root cause was architectural - the ProviderId enum was static and compile-time only, with hardcoded variants like OpenAI, Anthropic, etc. Runtime-defined providers (custom providers) had no representation in this system, causing them to be filtered out during provider enumeration.

Solution Design

Implemented a hybrid ProviderId system that supports both built-in and custom providers:

```rust
pub enum ProviderId {
    BuiltIn(BuiltInProviderId), // Type-safe built-in providers
    Custom(String),             // Runtime-defined custom providers
}
```

Implementation Details

Core Architecture Changes

  • ProviderId Refactoring (crates/forge_domain/src/provider.rs):
    • Split static enum into hybrid system with BuiltInProviderId + Custom(String)
    • Added helper methods (is_openai(), is_anthropic(), etc.) for type-safe checking
    • Implemented custom Serialize/Deserialize for backward compatibility

UI/UX Improvements

  • Smart URL Display (crates/forge_main/src/model.rs):
  • Shows non-standard ports only (hides 80/443 for a cleaner interface)
    • Proper IP address handling with port visibility
    • Visual indicators for configured vs unconfigured providers

Documentation & Templates

  • Comprehensive Examples (.forge/ directory):
    • provider-template.json - Detailed template with 4 provider types
    • provider-example-vllm.json - Ready-to-use VLLM configuration
    • provider-example-ollama.json - Ollama setup with hardcoded models
    • README.md - Complete setup guide and troubleshooting

Key Decisions & Rationale

Why Hybrid System vs Dynamic Only?

  • Type Safety: Built-in providers remain compile-time checked
  • Backward Compatibility: Existing code continues to work unchanged
  • Performance: No runtime overhead for built-in provider operations

Why Custom Serialize/Deserialize?

  • Format Consistency: Maintains string-based JSON format ("vllm_local" vs {"Custom": "vllm_local"})
  • Migration Safety: Existing configuration files remain valid
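To illustrate the two formats the bullet above contrasts (field names here are for illustration only): the custom impl keeps provider ids as flat strings, whereas serde's default externally-tagged representation of the new enum would wrap them, breaking every existing config file:

```json
{
  "with_custom_serde": "vllm_local",
  "with_derived_serde": { "Custom": "vllm_local" }
}
```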

Why Helper Methods Instead of Pattern Matching?

  • Maintainability: Centralized provider identification logic
  • Extensibility: Easy to add new provider types without updating all match statements
  • Error Prevention: Compile-time guarantees for built-in provider checks

Testing

  • 7 New Unit Tests: Complete coverage for provider display functionality
  • ANSI Color Handling: Proper test infrastructure for colored output
  • Edge Cases: IP addresses, standard/non-standard ports, template providers

Impact

  • Files Changed: 19 files (removed .gitignore and CUSTOM_PROVIDERS_PLAN.md as requested)
  • Lines Added: +961 / -233
  • Test Coverage: 1344 tests passing (0 failures)
  • Breaking Changes: None (fully backward compatible)

Verification

  1. Custom providers appear in /provider selection menu
  2. URL display shows ports correctly (non-standard only)
  3. Configuration and usage flow works end-to-end
  4. All existing functionality preserved

Ready for code review. Please test with custom provider configuration to verify the fix resolves issue #1816.

Co-Authored-By: ForgeCode [email protected]

@github-actions github-actions bot added the type: feature Brand new functionality, features, pages, workflows, endpoints, etc. label Nov 5, 2025
@dariuszkowalski-com dariuszkowalski-com changed the title from "feat: implement hybrid ProviderId system to support custom providers (#1816)" to "feat: implement hybrid ProviderId system to support custom providers" Nov 5, 2025
@dariuszkowalski-com
Contributor Author

🎨 Code Formatting Update

Applied code formatting fixes to ensure compliance with rustfmt standards:

  • Reordered imports in the affected files
  • Removed duplicate imports
  • Fixed line wrapping for long comments
  • All formatting checks now pass

Co-Authored-By: ForgeCode [email protected]

@amitksingh1490
Contributor

@dariuszkowalski-com #1875 can this solve the problem?

@dariuszkowalski-com dariuszkowalski-com force-pushed the fix/custom-providers-hybrid-id-system branch from 1fdfd07 to 3dbaafd Compare November 9, 2025 22:57
@dariuszkowalski-com dariuszkowalski-com marked this pull request as draft November 9, 2025 23:13
- Replace static ProviderId enum with flexible hybrid system supporting both built-in and custom providers
- Add BuiltInProviderId enum for type-safe built-in provider identification
- Add Custom(String) variant for runtime-defined custom providers
- Implement comprehensive helper methods for provider identification and creation
- Add custom Serialize/Deserialize implementations for backward compatibility
- Update all ProviderId usage across codebase to use helper methods
- Fix move/borrow checker errors by adding proper .clone() calls
- Add comprehensive test coverage for new ProviderId functionality
- Resolve issue antinomyhq#1816: custom providers now visible and selectable in UI

This change enables users to define custom providers in provider.json configuration
and have them appear in provider selection menu as fully functional options.

Co-Authored-By: ForgeCode <[email protected]>
@dariuszkowalski-com dariuszkowalski-com force-pushed the fix/custom-providers-hybrid-id-system branch from 3dbaafd to 54cf67e Compare November 9, 2025 23:15
@dariuszkowalski-com
Contributor Author

@dariuszkowalski-com #1875 can this solve the problem?

No, it does not resolve the issue.

Here is how Forge sees the problem:

Analysis of Custom Provider Issue #1816 and Proposed Solutions

Problem (#1816)

The issue is that custom providers are non-functional despite proper implementation of the loading mechanism. The root cause is a static ProviderId enum that only includes pre-defined providers.

Branch fix/provider-display-name - PR #1875

PR #1875 refactors ProviderId from enum to struct ProviderId(&'static str) with const constructors. However:

Does NOT solve issue #1816 because:

  1. Deserialization still validates only built-in providers (crates/forge_domain/src/provider.rs:145-148)
  2. get_valid_provider_names() returns only built-in providers (crates/forge_main/src/ui.rs:2424-2429)
  3. CLI still rejects custom providers with "Invalid provider" error

PR #1847 - Hybrid Solution

PR #1847 implements a hybrid system:

  • enum ProviderId { BuiltIn(BuiltInProviderId), Custom(String) }
  • Full support for custom providers in UI and CLI
  • Proper deserialization accepting custom providers

Recommendation

PR #1875 should be merged with PR #1847, because:

  1. The struct-based ProviderId concept from PR #1875 ("refactor: convert ProviderId from enum to struct with const constructors") is good: const constructors are more ergonomic than an enum
  2. The implementation from PR #1847 ("feat: implement hybrid ProviderId system to support custom providers") is necessary: it adds proper deserialization and validation of custom providers

@amitksingh1490
Contributor

The issue you mentioned, "Deserialization still validates only built-in providers (crates/forge_domain/src/provider.rs:145-148)", is solved in the current implementation; pull the latest changes and try it.

@dariuszkowalski-com
Contributor Author

The issue you mentioned, "Deserialization still validates only built-in providers (crates/forge_domain/src/provider.rs:145-148)", is solved in the current implementation; pull the latest changes and try it.

I have tried it, and I cannot add my custom provider to Forge.

I have this provider.json in ~/forge/:

```json
[
  {
    "id": "vllm_local",
    "api_key_vars": "VLLM_LOCAL_API_KEY",
    "url_param_vars": ["VLLM_LOCAL_URL"],
    "response_type": "OpenAI",
    "url": "{{VLLM_LOCAL_URL}}/v1/chat/completions",
    "models": "{{VLLM_LOCAL_URL}}/v1/models",
    "auth_methods": ["api_key"]
  }
]
```

It works with my implementation.

By the way, I have also added a different way of showing URLs: the :PORT_NUMBER suffix is shown for all non-standard ports. This matters for local LLMs, because it lets you see which server you are connected to.

- Replace problematic 'result.is_err() || true' assertion with proper error handling
- Test now verifies authentication fails with appropriate error (not verifier-related)
- Maintains original test intent while satisfying clippy requirements

Co-Authored-By: ForgeCode <[email protected]>
@dariuszkowalski-com
Contributor Author

dariuszkowalski-com commented Nov 10, 2025

OK, now it works.

Some issues:

  • The provider is shown as unavailable even though the LLM is working:
-   ✓ VllmLocal           [unavailable]

This is resolved in my PR.

  • During onboarding I entered a bad URL, http://27.0.0.1:8000. Forge could not load, and to repair it I had to run:
   forge provider logout  <- new provider
   forge provider login   <- new provider

This is not obvious to users.

  • When I remove vllm_local from provider.json while it is set as my main provider, it stops working and I get:
⏺ [23:47:57] ERROR: Failed to fetch models

Caused by:
    Provider LlamaCpp is not available via environment configuration    

Forge 1.3.0 shows this error:
⏺ [00:04:53] ERROR: Failed to fetch the models

Caused by:
    0: GET http://192.168.1.10:8000/v1/models
    1: GET http://192.168.1.10:8000/v1/models
    2: error sending request for url (http://192.168.1.10:8000/v1/models)
    3: client error (Connect)
    4: tcp connect error
    5: Connection refused (os error 111)   

In that situation, only recreating the configuration helped.
