
Add Ollama provider for local, API-key-free code reviews #89

Merged
felipefernandes merged 4 commits into main from claude/local-llm-support-KtZhS on Mar 28, 2026
Conversation

@felipefernandes (Owner)

Summary

This PR adds Ollama as a sixth LLM provider, enabling code reviews to run entirely locally: no API keys, and no data leaving your infrastructure. This makes Ollama well suited to enterprises and regulated environments that require on-premise inference.

Key Changes

Core Provider Support

  • iara/models.py: Added ollama entry to PROVIDER_CONFIGS with auth_type: "none" and suggested models (qwen2.5-coder:7b, codellama:13b, llama3.1:8b, deepseek-coder:6.7b)
  • iara/auth.py:
    • Introduced NO_AUTH_PROVIDERS constant for providers requiring no API key
    • Added get_ollama_base_url() to resolve OLLAMA_BASE_URL env var (defaults to http://localhost:11434)
    • Updated resolve_api_key() to return (None, "none") for Ollama
    • Implemented Ollama validation via /api/tags endpoint ping (no API key needed)
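The auth changes above can be sketched in a few lines. This is a minimal illustration, not the PR's actual code: `get_ollama_base_url` is named in the bullet list, but `validate_ollama` and its exact request handling are assumptions.

```python
import json
import os
from urllib.request import urlopen
from urllib.error import URLError

DEFAULT_OLLAMA_BASE_URL = "http://localhost:11434"

def get_ollama_base_url() -> str:
    """Resolve the Ollama endpoint from OLLAMA_BASE_URL, with a local default."""
    return os.environ.get("OLLAMA_BASE_URL", DEFAULT_OLLAMA_BASE_URL).rstrip("/")

def validate_ollama(timeout: float = 5.0) -> bool:
    """Ping /api/tags; a successful response means Ollama is reachable (no API key)."""
    try:
        with urlopen(f"{get_ollama_base_url()}/api/tags", timeout=timeout) as resp:
            json.load(resp)  # body is {"models": [...]}
            return resp.status == 200
    except URLError:
        return False
```

Trailing-slash stripping matters because the client later appends paths like `/api/tags` to the resolved base URL.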

Request/Response Handling

  • iara/reviewer.py:
    • Updated _build_headers() to skip Authorization header for auth_type: "none"
    • Added _extract_content() support for Ollama's message.content response format (vs OpenAI's choices[0].message.content)
    • Added _get_ollama_models() helper to auto-discover locally installed models from /api/tags
    • Updated review_code_with_model() to dynamically resolve Ollama base URL
    • Enhanced review_code() to auto-detect available models and provide friendly error when Ollama is not running

Setup & Configuration

  • iara/init.py:
    • Added "ollama" to PROVIDER_OPTIONS
    • Implemented _step_ollama_setup() to check Ollama connectivity and list available models (no API key prompt)
    • Updated _step_api_key() to redirect Ollama to setup function
    • Modified _save_configs() to skip API key persistence for no-auth providers
  • iara/cli.py: Updated API key validation to allow None for no-auth providers

Documentation & Specs

  • docs/configuration.md: Added comprehensive Ollama section with installation, model recommendations, hardware requirements, configuration examples, and troubleshooting
  • README.md: Updated provider list and setup instructions to mention Ollama
  • OpenSpec: Added proposal, tasks, and spec delta documenting Ollama requirements and implementation

Implementation Details

  • Ollama requests use OpenAI-compatible message format (same payload as non-Anthropic providers)
  • Response parsing differs: Ollama returns message.content directly, not nested in choices[0]
  • No Authorization header sent for Ollama (auth_type: "none")
  • Custom endpoint support via OLLAMA_BASE_URL env var for remote Ollama servers
  • Graceful error handling when Ollama is not running, with actionable setup instructions
  • All existing providers remain unchanged; no breaking changes

https://claude.ai/code/session_01Qq6RQNZHureRTZstKvBT8s

claude added 2 commits March 28, 2026 22:28
- Add `ollama` as 6th provider to PROVIDER_CONFIGS (auth_type: none) and SUGGESTED_MODELS
- Introduce NO_AUTH_PROVIDERS set in auth.py; expand SUPPORTED_PROVIDERS to include ollama
- Add get_ollama_base_url() to resolve OLLAMA_BASE_URL env var (default: http://localhost:11434)
- Update validate_api_key() to ping /api/tags for Ollama (no key required)
- Update _build_headers() to skip auth header when auth_type is "none"
- Add _extract_content() case for Ollama response format (message.content)
- Add _get_ollama_models() to auto-detect locally installed models via /api/tags
- Update review_code_with_model() to resolve Ollama base_url dynamically
- Update review_code() to auto-detect models and show friendly error when Ollama not running
- Update iara init wizard: ollama skips API key step, shows available models, install instructions
- Add Ollama section to docs/configuration.md (install, hardware requirements, troubleshooting)
- Update README.md provider list and privacy comparison table

OpenSpec change: openspec/changes/add-ollama-provider/
Closes #76

https://claude.ai/code/session_01Qq6RQNZHureRTZstKvBT8s
- cli.py: skip API key guard for NO_AUTH_PROVIDERS (Ollama)
- auth_status.py: show Ollama connectivity status instead of key info

https://claude.ai/code/session_01Qq6RQNZHureRTZstKvBT8s
@sentry

sentry Bot commented Mar 28, 2026

Codecov Report

❌ Patch coverage is 99.04762% with 3 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| iara/reviewer.py | 94.28% | 2 Missing ⚠️ |
| iara/auth.py | 95.23% | 1 Missing ⚠️ |


Comment thread iara/auth.py

# Ollama: validate by pinging /api/tags (no API key needed)
if provider == "ollama":
base_url = get_ollama_base_url()

💡 Consider adding a timeout parameter to make it configurable. The hardcoded 5-second timeout might be too short for slow networks or too long for fast local connections.

Comment thread iara/init.py
print(" Step 3: Ollama Setup (no API key required)")
base_url = get_ollama_base_url()
print(" Checking Ollama at %s..." % base_url, end=" ", flush=True)


💡 The timeout in _step_ollama_setup() is hardcoded to 5 seconds (line 68). Consider extracting this as a constant shared with validate_api_key() in auth.py for consistency.

@github-actions

🧜‍♀️ Iara Code Review

✅ Iara Approved: Well-structured Ollama integration with proper error handling and no critical issues found


Reviewed by Iara - AI Code Reviewer

…dling

- Extract OLLAMA_CONNECT_TIMEOUT = 5 constant in auth.py; use it in
  auth.py, init.py, and reviewer.py (resolves Iara review comments)
- Add URLError catch in review_code_with_model() for Ollama with
  friendly "ollama serve" message when connection refused
- tests/test_auth.py: add TestOllamaAuth covering get_ollama_base_url,
  NO_AUTH_PROVIDERS, resolve_api_key, normalize_provider, validate_api_key
  for Ollama (running, not running, unexpected error)
- tests/test_reviewer.py: add Ollama tests for _build_headers (no auth),
  _extract_content (message.content format), _get_ollama_models (success,
  error, empty), review_code_with_model (success, connection refused,
  env URL), review_code (auto-detect, not running, preferred model)
- tests/test_init.py: add Ollama tests for _step_provider, _step_api_key,
  _step_ollama_setup (running with models, not running, no models),
  run_init full flow without API key

https://claude.ai/code/session_01Qq6RQNZHureRTZstKvBT8s
@github-actions

🧜‍♀️ Iara Code Review

✅ Iara Approved: Ollama integration is well-implemented with proper error handling, security considerations, and comprehensive documentation. No critical issues found.


Reviewed by Iara - AI Code Reviewer

@felipefernandes merged commit 2abad9b into main on Mar 28, 2026 (3 checks passed)
@felipefernandes deleted the claude/local-llm-support-KtZhS branch on March 28, 2026 at 22:43
