ThinkReview now supports Ollama (AI code reviews for your PRs in your browser) #17
mshenawy22 announced in Announcements
Hey everyone! 👋
I’m excited to announce that ThinkReview v1.4.0 now officially supports Ollama!
This has been one of the most requested features, and for good reason. Many of us work on private codebases where sending diffs to external APIs (like OpenAI or Anthropic) isn't an option due to security policies or privacy concerns.
With this update, you can now run your AI code reviews 100% locally.
🌟 What’s New?
You can now connect ThinkReview to your local Ollama instance. This means you can use powerful open-source models directly inside your PRs/MRs without your code ever leaving your machine.
Supported models include:
- Qwen Coder (highly recommended for code tasks)
- Llama 3
- DeepSeek
- Codestral
- ...and any other model you have pulled into Ollama!
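Under the hood, talking to Ollama is just a local HTTP call. Here's a minimal sketch (not ThinkReview's actual internals) of what a review request against Ollama's `/api/chat` endpoint looks like; the `qwen2.5-coder` model name and the prompt are illustrative:

```ts
// Minimal sketch: ask a local Ollama model to review a diff.
// Assumes Ollama is running on its default port (11434) and that
// you've pulled qwen2.5-coder. The prompt wording is just an example.
async function reviewDiff(diff: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "qwen2.5-coder",
      messages: [
        { role: "system", content: "You are a code reviewer. Point out bugs and risky changes." },
        { role: "user", content: `Please review this diff:\n\n${diff}` },
      ],
      stream: false, // one JSON response instead of a token stream
    }),
  });
  const data = await res.json();
  return data.message.content; // the model's review text
}
```

With `stream: false` the whole review comes back as a single JSON object; the same endpoint also supports token-by-token streaming if you leave streaming on.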
🔒 Why use Local Mode?
- Privacy: Zero data egress. Your code stays on your device.
- Cost: Free. No API keys or subscription costs.
- Speed: Latency depends on your hardware, not your internet connection.
🛠️ How to set it up
1. Install Ollama from ollama.com.
2. Pull a model (e.g., `ollama pull qwen2.5-coder`).
3. Open the ThinkReview extension settings.
4. Select Ollama as your provider.
5. Start reviewing!
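Between steps 2 and 5, it can help to confirm that Ollama is actually running and has the model you pulled. A quick sketch you could run from a browser console or a small script, using Ollama's `/api/tags` endpoint (which lists every pulled model); the model name follows the step 2 example, and `:latest` is the default tag:

```ts
// Sanity check: is Ollama up, and is the model from step 2 available?
const res = await fetch("http://localhost:11434/api/tags");
const { models } = await res.json();
const names = models.map((m: { name: string }) => m.name);
console.log(
  names.includes("qwen2.5-coder:latest")
    ? "qwen2.5-coder is ready"
    : `Model not found — pulled models: ${names.join(", ")}`
);
```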
You can grab the latest version of the extension here: [Chrome Web Store Link]
🗣️ Feedback Needed
I’d love to hear which local models you find perform best for code reviews. If you run into any issues connecting to your local instance (CORS issues are common with default Ollama setups; check the README for help), please let me know in the comments below or open an issue.
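On the CORS point: if ThinkReview can't reach Ollama even though the server is running, a quick probe like the sketch below (run from the browser) helps tell a CORS block apart from a dead server. The `OLLAMA_ORIGINS` environment variable in the comment is the commonly documented workaround; the exact origin value for your setup is an assumption.

```ts
// Probe the local Ollama server from the browser. If this throws while
// `curl http://localhost:11434/api/version` works from a terminal, you are
// most likely hitting CORS, not a server problem. The commonly documented
// fix is to allow your origin when starting Ollama, e.g. (permissive;
// tighten for real use):
//   OLLAMA_ORIGINS="*" ollama serve
try {
  const res = await fetch("http://localhost:11434/api/version");
  console.log("Ollama is reachable, version:", (await res.json()).version);
} catch (err) {
  console.error("Could not reach Ollama (server down or CORS):", err);
}
```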
Happy reviewing! 🚀

Jay