
@inugr8 inugr8 commented Sep 23, 2025

Summary by CodeRabbit

  • New Features

    • Introduced an Insights API that generates concise takeaways from SQL results, toggleable via INSIGHTS_ENABLED.
    • Supports OpenAI and local Ollama models with easy switching between them.
    • Added wren-ai-service with a health check and routes.
    • Provided a docker-compose file to run the engine, ibis, AI service, and UI together.
  • Bug Fixes

    • Improved startup resilience: missing SQLite tables no longer crash the server; affected flows return empty results and continue.
  • Documentation

    • Added “README_START_HERE” covering end-to-end setup, environment variables, Ollama, patches, and validation steps.
  • Chores

    • Switched submodule URL to HTTPS.
    • Added/updated Dockerfiles and .dockerignore for wren-ai-service.
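The INSIGHTS_ENABLED toggle described above could be sketched as a small pure handler. This is a hypothetical illustration, not the PR's actual code: the real route lives in `wren-ai-service/src/routes/insights.ts`, and the request/response field names here are assumptions based on the summary.

```typescript
// Hypothetical sketch of feature-flag gating for the insights endpoint.
// Field names (sql, rows, columns, insights, disabled) are assumptions.
type InsightsRequest = { sql: string; rows: unknown[]; columns: string[] };
type InsightsResponse = { insights: string[]; disabled?: boolean };

function handleInsights(
  req: InsightsRequest,
  generate: (prompt: string) => string[],
  env: Record<string, string | undefined>,
): InsightsResponse {
  // With the flag off, respond successfully with an empty payload so the
  // UI can degrade gracefully instead of surfacing an error.
  if (env.INSIGHTS_ENABLED !== "true") {
    return { insights: [], disabled: true };
  }
  // Build a prompt from the SQL, the column names, and a sample of rows.
  const prompt =
    `SQL: ${req.sql}\n` +
    `Columns: ${req.columns.join(", ")}\n` +
    `Sample rows: ${JSON.stringify(req.rows.slice(0, 5))}`;
  return { insights: generate(prompt) };
}
```

In real use, `env` would be `process.env` and `generate` would delegate to the configured LLM provider.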


coderabbitai bot commented Sep 23, 2025

Caution

Review failed

The pull request is closed.

Walkthrough

Adds docker-compose orchestration and a new Node/Express-based wren-ai-service with LLM-backed insights (OpenAI/Ollama) and health endpoint. Updates UI server to handle missing-table SQLite errors gracefully. Introduces setup README, adjusts Docker-related files, and switches a submodule URL from SSH to HTTPS.
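The four-service topology might look roughly like the fragment below. This is only a sketch: the service names match the walkthrough, but the build paths, port, and environment variables are assumptions; the PR's `docker-compose.yml` is authoritative.

```yaml
# Hypothetical outline of the compose topology; paths/ports are assumptions.
services:
  java-engine:
    build: ./wren-engine
  ibis-server:
    build: ./ibis-server
    depends_on: [java-engine]
  wren-ai-service:
    build: ./wren-ai-service
    environment:
      INSIGHTS_ENABLED: "true"
    ports: ["7000:7000"]
  wren-ui:
    build: ./wren-ui
    environment:
      WREN_API_BASE: http://wren-ai-service:7000
    depends_on: [wren-ai-service, ibis-server]
```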

Changes

| Cohort / File(s) | Summary |
| --- | --- |
| **Submodule config**<br>`.gitmodules` | Change the wren-engine submodule URL from SSH to HTTPS. |
| **Docs**<br>`README_START_HERE.md` | New setup guide for WrenAI Starter: environment, Ollama usage, patch steps, and validation. |
| **Orchestration**<br>`docker-compose.yml` | New compose file defining four services (java-engine, ibis-server, wren-ai-service, wren-ui) with ports, dependencies, and env. |
| **AI service Docker setup**<br>`wren-ai-service/.dockerignore`, `wren-ai-service/Dockerfile`, `wren-ai-service/docker/Dockerfile` | Simplify the dockerignore; add a Node 20 Alpine Dockerfile; replace the Python/Poetry Dockerfile with a Node-based build/run. |
| **AI service app**<br>`wren-ai-service/src/index.ts`, `wren-ai-service/src/routes/index.ts`, `wren-ai-service/src/routes/insights.ts`, `wren-ai-service/src/services/insights.ts`, `wren-ai-service/src/providers/llm/index.ts`, `wren-ai-service/src/providers/llm/openai.ts`, `wren-ai-service/src/providers/llm/ollama.ts` | New Express server, conditional `/insights` route, insights service backed by an LLM, and an LLM provider factory supporting OpenAI/Ollama. |
| **UI server robustness**<br>`wren-ui/src/apollo/server/repositories/baseRepository.ts`, `wren-ui/src/apollo/server/services/askingService.ts` | Replace `findAllBy` with `findAll` variants; handle missing-table SQLITE errors by returning `[]` with a warning instead of throwing. |
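The missing-table tolerance in the UI server could be sketched like this. A hypothetical illustration only: the real logic lives in `baseRepository.ts`/`askingService.ts`, and the helper names here (`isMissingTableError`, `findAllSafe`) are invented for the sketch.

```typescript
// Hypothetical sketch: treat "no such table" as an empty result rather
// than a fatal error, so first-boot flows continue before migrations run.
function isMissingTableError(err: unknown): boolean {
  // SQLite reports a missing relation as "SQLITE_ERROR: no such table: <name>".
  return err instanceof Error && /no such table/i.test(err.message);
}

async function findAllSafe<T>(query: () => Promise<T[]>): Promise<T[]> {
  try {
    return await query();
  } catch (err) {
    if (isMissingTableError(err)) {
      // Warn and continue with an empty result instead of throwing.
      console.warn("table missing, returning []:", (err as Error).message);
      return [];
    }
    throw err; // unrelated errors still surface
  }
}
```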

Sequence Diagram(s)

```mermaid
sequenceDiagram
  autonumber
  participant U as User
  participant UI as wren-ui (Frontend)
  participant S as wren-ai-service
  participant R as Routes (/insights)
  participant SV as Insights Service
  participant L as LLM Provider
  note over UI,S: WREN_API_BASE -> http://wren-ai-service:7000

  U->>UI: Request insights
  UI->>S: POST /insights { sql, rows, columns }
  S->>R: Route dispatch
  alt INSIGHTS_ENABLED = true
    R->>SV: createInsights(input)
    SV->>SV: Build prompt (sample rows, columns, SQL)
    opt LLM kind selection
      SV->>L: getInsightsLlm() → OpenAI or Ollama
    end
    SV->>L: generate(prompt)
    L-->>SV: insights text
    SV-->>R: { insights }
    R-->>S: 200 OK JSON
    S-->>UI: insights payload
  else disabled
    R-->>S: 200 OK { insights: [], disabled: true }
    S-->>UI: disabled response
  end
```
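The `getInsightsLlm()` selection step in the diagram could be sketched as a small factory. Hypothetical only: the real factory is in `wren-ai-service/src/providers/llm/index.ts`, and the env var names used here (`LLM_PROVIDER`, `OLLAMA_URL`, `OPENAI_API_KEY`) are assumptions; the `generate` bodies are stubs standing in for real API calls.

```typescript
// Hypothetical sketch of an LLM provider factory choosing OpenAI or Ollama.
interface InsightsLlm {
  generate(prompt: string): Promise<string>;
}

class OpenAiLlm implements InsightsLlm {
  constructor(private apiKey: string) {}
  async generate(prompt: string): Promise<string> {
    // A real implementation would call the OpenAI chat completions API.
    return `openai:${prompt}`;
  }
}

class OllamaLlm implements InsightsLlm {
  constructor(private baseUrl: string) {}
  async generate(prompt: string): Promise<string> {
    // A real implementation would POST to a local Ollama server.
    return `ollama:${prompt}`;
  }
}

function getInsightsLlm(
  env: Record<string, string | undefined>,
): InsightsLlm {
  if (env.LLM_PROVIDER === "ollama") {
    return new OllamaLlm(env.OLLAMA_URL ?? "http://localhost:11434");
  }
  return new OpenAiLlm(env.OPENAI_API_KEY ?? "");
}
```

Switching backends is then a matter of flipping one environment variable, which matches the "easy switching" described in the summary.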

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Suggested labels

wren-ui, module/ai-service, ci/ai-service, wren-ai-service, deployment, ai-env-changed

Suggested reviewers

  • imAsterSun
  • yichieh-lu

Poem

A bunny boots containers with a hop,
New routes nibble insights, nonstop.
OpenAI or Ollama, pick your chew,
If tables vanish—no panic, phew!
Compose the stacks, ports align—
Carrots compiled, everything’s fine. 🥕🐇

📜 Recent review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 069bd25 and 7049add.

⛔ Files ignored due to path filters (1)
  • wren-ui/.wren/dev.db is excluded by !**/*.db
📒 Files selected for processing (15)
  • .gitmodules (1 hunks)
  • README_START_HERE.md (1 hunks)
  • docker-compose.yml (1 hunks)
  • wren-ai-service/.dockerignore (1 hunks)
  • wren-ai-service/Dockerfile (1 hunks)
  • wren-ai-service/docker/Dockerfile (1 hunks)
  • wren-ai-service/src/index.ts (1 hunks)
  • wren-ai-service/src/providers/llm/index.ts (1 hunks)
  • wren-ai-service/src/providers/llm/ollama.ts (1 hunks)
  • wren-ai-service/src/providers/llm/openai.ts (1 hunks)
  • wren-ai-service/src/routes/index.ts (1 hunks)
  • wren-ai-service/src/routes/insights.ts (1 hunks)
  • wren-ai-service/src/services/insights.ts (1 hunks)
  • wren-ui/src/apollo/server/repositories/baseRepository.ts (1 hunks)
  • wren-ui/src/apollo/server/services/askingService.ts (1 hunks)


@inugr8 inugr8 closed this Sep 23, 2025
@inugr8 inugr8 deleted the feat/local-ollama-insights branch September 23, 2025 06:34