Add Ollama provider for local, API-key-free code reviews #89
Conversation
- Add `ollama` as 6th provider to PROVIDER_CONFIGS (auth_type: none) and SUGGESTED_MODELS
- Introduce NO_AUTH_PROVIDERS set in auth.py; expand SUPPORTED_PROVIDERS to include ollama (see sketch below)
- Add get_ollama_base_url() to resolve OLLAMA_BASE_URL env var (default: http://localhost:11434)
- Update validate_api_key() to ping /api/tags for Ollama (no key required)
- Update _build_headers() to skip auth header when auth_type is "none"
- Add _extract_content() case for Ollama response format (message.content)
- Add _get_ollama_models() to auto-detect locally installed models via /api/tags
- Update review_code_with_model() to resolve Ollama base_url dynamically
- Update review_code() to auto-detect models and show friendly error when Ollama not running
- Update iara init wizard: ollama skips API key step, shows available models, install instructions
- Add Ollama section to docs/configuration.md (install, hardware requirements, troubleshooting)
- Update README.md provider list and privacy comparison table

OpenSpec change: openspec/changes/add-ollama-provider/

Closes #76

https://claude.ai/code/session_01Qq6RQNZHureRTZstKvBT8s
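A minimal sketch of what the auth.py additions above could look like. The `get_ollama_base_url()` helper, the `OLLAMA_BASE_URL` env var, and the `NO_AUTH_PROVIDERS` set are named in the commit; the `DEFAULT_OLLAMA_URL` constant, the standalone `check_ollama_connection()` helper, and the 5-second default timeout are illustrative assumptions (the PR folds the ping into `validate_api_key()`):

```python
import os
import urllib.request

# Providers that need no API key at all (the PR calls this NO_AUTH_PROVIDERS).
NO_AUTH_PROVIDERS = {"ollama"}

# Hypothetical constant name; the PR documents http://localhost:11434 as the default.
DEFAULT_OLLAMA_URL = "http://localhost:11434"


def get_ollama_base_url() -> str:
    """Resolve the Ollama endpoint from OLLAMA_BASE_URL, falling back to localhost."""
    return os.environ.get("OLLAMA_BASE_URL", DEFAULT_OLLAMA_URL).rstrip("/")


def check_ollama_connection(timeout: float = 5.0) -> bool:
    """Ping Ollama's /api/tags endpoint to confirm the server is reachable (no key needed)."""
    try:
        with urllib.request.urlopen(f"{get_ollama_base_url()}/api/tags", timeout=timeout):
            return True
    except OSError:
        return False
```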
- cli.py: skip API key guard for NO_AUTH_PROVIDERS (Ollama) (see sketch below)
- auth_status.py: show Ollama connectivity status instead of key info

https://claude.ai/code/session_01Qq6RQNZHureRTZstKvBT8s
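Roughly, the cli.py guard change amounts to an early exit along these lines; the function name `require_api_key` and the exact error text are illustrative, not taken from the PR:

```python
NO_AUTH_PROVIDERS = {"ollama"}  # imported from auth.py in the real code


def require_api_key(provider: str, api_key: str | None) -> None:
    """Hypothetical guard: only providers that actually need a key must have one."""
    if provider in NO_AUTH_PROVIDERS:
        return  # Ollama talks to a local server, so no credential check applies
    if not api_key:
        raise SystemExit(f"No API key configured for provider '{provider}'. Run `iara init` first.")
```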
From validate_api_key() in auth.py:

```python
# Ollama: validate by pinging /api/tags (no API key needed)
if provider == "ollama":
    base_url = get_ollama_base_url()
```
💡 Consider adding a timeout parameter to make it configurable. The hardcoded 5-second timeout might be too short for slow networks or too long for fast local connections.
| print(" Step 3: Ollama Setup (no API key required)") | ||
| base_url = get_ollama_base_url() | ||
| print(" Checking Ollama at %s..." % base_url, end=" ", flush=True) | ||
|
|
💡 The timeout in _step_ollama_setup() is hardcoded to 5 seconds (line 68). Consider extracting this as a constant shared with validate_api_key() in auth.py for consistency.
🧜‍♀️ Iara Code Review
✅ Iara Approved: Well-structured Ollama integration with proper error handling and no critical issues found.
Reviewed by Iara - AI Code Reviewer
…dling

- Extract OLLAMA_CONNECT_TIMEOUT = 5 constant in auth.py; use it in auth.py, init.py, and reviewer.py (resolves Iara review comments)
- Add URLError catch in review_code_with_model() for Ollama with friendly "ollama serve" message when connection refused (see sketch below)
- tests/test_auth.py: add TestOllamaAuth covering get_ollama_base_url, NO_AUTH_PROVIDERS, resolve_api_key, normalize_provider, validate_api_key for Ollama (running, not running, unexpected error)
- tests/test_reviewer.py: add Ollama tests for _build_headers (no auth), _extract_content (message.content format), _get_ollama_models (success, error, empty), review_code_with_model (success, connection refused, env URL), review_code (auto-detect, not running, preferred model)
- tests/test_init.py: add Ollama tests for _step_provider, _step_api_key, _step_ollama_setup (running with models, not running, no models), run_init full flow without API key

https://claude.ai/code/session_01Qq6RQNZHureRTZstKvBT8s
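A sketch of how the shared timeout constant and the connection-refused handling described above might fit together; the wrapper function, its signature, and the exact message wording are assumptions, not the PR's code:

```python
import json
import urllib.request
from urllib.error import URLError

OLLAMA_CONNECT_TIMEOUT = 5  # seconds; the PR shares this across auth.py, init.py, and reviewer.py


def post_to_ollama(url: str, payload: dict) -> dict:
    """Hypothetical wrapper: surface a friendly hint when the local server is down."""
    request = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(request, timeout=OLLAMA_CONNECT_TIMEOUT) as response:
            return json.load(response)
    except URLError as exc:
        raise RuntimeError(
            "Could not reach Ollama. Is it running? Start it with `ollama serve` "
            "or point OLLAMA_BASE_URL at a remote server."
        ) from exc
```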
🧜‍♀️ Iara Code Review
✅ Iara Approved: Ollama integration is well-implemented with proper error handling, security considerations, and comprehensive documentation. No critical issues found.
Reviewed by Iara - AI Code Reviewer
Adds Ollama local LLM provider support (issue #76). https://claude.ai/code/session_01Qq6RQNZHureRTZstKvBT8s
Summary
This PR adds Ollama as a 6th LLM provider, enabling users to run code reviews entirely locally without API keys or data leaving their infrastructure. Ollama is ideal for enterprises and regulated environments requiring on-premise inference.
Key Changes
Core Provider Support
- `ollama` entry to `PROVIDER_CONFIGS` with `auth_type: "none"` and suggested models (`qwen2.5-coder:7b`, `codellama:13b`, `llama3.1:8b`, `deepseek-coder:6.7b`) (see the sketch after this list)
- `NO_AUTH_PROVIDERS` constant for providers requiring no API key
- `get_ollama_base_url()` to resolve `OLLAMA_BASE_URL` env var (defaults to `http://localhost:11434`)
- `resolve_api_key()` to return `(None, "none")` for Ollama
- `/api/tags` endpoint ping (no API key needed)
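The new provider entry plausibly looks something like the following; only `auth_type: "none"`, the default URL, and the model names are stated in the PR, so the surrounding schema is an assumption:

```python
# Illustrative shape of the new entries (existing providers omitted).
PROVIDER_CONFIGS = {
    "ollama": {
        "auth_type": "none",                   # no Authorization header is sent
        "base_url": "http://localhost:11434",  # overridable via OLLAMA_BASE_URL
    },
}

SUGGESTED_MODELS = {
    "ollama": ["qwen2.5-coder:7b", "codellama:13b", "llama3.1:8b", "deepseek-coder:6.7b"],
}

NO_AUTH_PROVIDERS = {"ollama"}  # providers that skip the API key flow entirely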
Request/Response Handling

- `_build_headers()` to skip Authorization header for `auth_type: "none"`
- `_extract_content()` support for Ollama's `message.content` response format (vs OpenAI's `choices[0].message.content`) (see the sketch after this list)
- `_get_ollama_models()` helper to auto-discover locally installed models from `/api/tags`
- `review_code_with_model()` to dynamically resolve Ollama base URL
- `review_code()` to auto-detect available models and provide friendly error when Ollama is not running
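A rough illustration of the two helpers listed above; the PR's `_extract_content()` and `_get_ollama_models()` are private functions whose real signatures may differ from this sketch:

```python
import json
import urllib.request


def extract_content(provider: str, body: dict) -> str:
    """Ollama puts the text at message.content; OpenAI-style APIs nest it in choices[0]."""
    if provider == "ollama":
        return body["message"]["content"]
    return body["choices"][0]["message"]["content"]


def get_ollama_models(base_url: str = "http://localhost:11434", timeout: int = 5) -> list[str]:
    """Auto-discover locally installed models from Ollama's /api/tags endpoint."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as response:
        data = json.load(response)
    return [model["name"] for model in data.get("models", [])]
```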
"ollama"toPROVIDER_OPTIONS_step_ollama_setup()to check Ollama connectivity and list available models (no API key prompt)_step_api_key()to redirect Ollama to setup function_save_configs()to skip API key persistence for no-auth providersNonefor no-auth providersDocumentation & Specs
Documentation & Specs

- Ollama section in docs/configuration.md (install, hardware requirements, troubleshooting)
- README.md provider list and privacy comparison table updates
- OpenSpec change: openspec/changes/add-ollama-provider/

Implementation Details
- `message.content` directly, not nested in `choices[0]` (see the example below)
- `OLLAMA_BASE_URL` env var for remote Ollama servers

https://claude.ai/code/session_01Qq6RQNZHureRTZstKvBT8s
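For reference, abbreviated examples of the two response shapes mentioned above (real payloads carry additional fields such as timing and token counts):

```python
# Ollama chat response: the content lives directly under "message".
ollama_response = {
    "model": "qwen2.5-coder:7b",
    "message": {"role": "assistant", "content": "Review: ..."},
}

# OpenAI-style response: the content is nested under choices[0].message.
openai_response = {
    "choices": [{"message": {"role": "assistant", "content": "Review: ..."}}],
}
```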