A modern Next.js application leveraging MongoDB Atlas Vector Search and LangGraph AI agents for intelligent loan portfolio insights and risk analysis.
- 🏗️ Architecture Overview
- 🔧 Prerequisites
- 🗄️ MongoDB Atlas Setup
- 🔐 Environment Configuration
- ⚡ Quick Start
- 🚀 Deployment
- 🛠️ Development
- 📊 Features
- 🔍 API Documentation
- 🎯 Usage Examples
- 🆘 Troubleshooting
- 🤝 Contributing
## 🏗️ Architecture Overview

This application combines the power of MongoDB Atlas with modern AI technologies to deliver intelligent loan portfolio analysis:
```
Frontend      │ Next.js 15 + React 19 + TypeScript + Mantine UI
AI/ML         │ LangGraph + LangChain + OpenAI/Anthropic
Database      │ MongoDB Atlas + Vector Search
Visualization │ Recharts + Plotly.js
Styling       │ TailwindCSS
```
### Key Features

- 🤖 AI-Powered Analysis: LangGraph workflow engine with 7 domain-specific tools
- 🔍 Hybrid Search: Vector similarity + metadata filtering
- 📊 Interactive Dashboard: Real-time loan portfolio visualizations
- ⚡ Vector Search: 1536-dimension embeddings with semantic search
- 🛡️ Risk Assessment: Advanced ML-based risk scoring
- 📈 Portfolio Analytics: Comprehensive performance metrics
## 🔧 Prerequisites

Before setting up the application, ensure you have:
- Node.js 18.17+ and npm 8+
- MongoDB Atlas account (free tier available)
- OpenAI API key (for embeddings and chat)
- Anthropic API key (for Claude models) [Optional]
- Git for version control
## 🗄️ MongoDB Atlas Setup

### Create an Account and Project

- Sign Up: Visit MongoDB Atlas and create a free account
- Verify Email: Check your inbox and verify your account
- Create Organization (if this is your first time):
  - Click "Create Organization"
  - Name it (e.g., "Loan Portfolio Analytics")
  - Add members if needed
- Create Project:
  - Click "New Project"
  - Name: `loan-portfolio-insights`
  - Click "Create Project"
### Deploy a Cluster

- Click "Create Deployment"
- Choose the M0 FREE tier
- Cloud Provider: AWS (recommended)
- Region: choose the one closest to your location
- Cluster Name: `loan-portfolio-cluster`
- Click "Create Deployment"
### Create a Database User

- Go to Database Access in the left sidebar
- Click "Add New Database User"
- Authentication Method: Password
- Username: `loan-admin`
- Password: generate a secure password (save it!)
- Database User Privileges: select "Read and write to any database"
- Click "Add User"
### Configure Network Access

- Go to Network Access in the left sidebar
- Click "Add IP Address"
- For development: click "Allow Access from Anywhere" (0.0.0.0/0) with the description "Development Access"
- For production: add your specific IP addresses only
- Click "Confirm"
### Get the Connection String

- Go to Database in the left sidebar
- Click "Connect" on your cluster
- Choose "Connect your application"
- Driver: Node.js, version 4.1 or later
- Copy the connection string:

  ```
  mongodb+srv://loan-admin:<password>@loan-portfolio-cluster.xxxxx.mongodb.net/?retryWrites=true&w=majority
  ```

- Replace `<password>` with your actual password
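Before wiring the string into the app, it can help to sanity-check its shape, since a leftover `<password>` placeholder is a common cause of connection failures. A tiny illustrative helper (the `isValidAtlasUri` name and checks are a sketch, not part of the codebase):

```typescript
// Check that a connection string looks like a well-formed Atlas SRV URI
// and that the <password> placeholder has actually been replaced.
function isValidAtlasUri(uri: string): boolean {
  return (
    uri.startsWith('mongodb+srv://') &&
    !uri.includes('<password>') &&        // placeholder must be replaced
    /@[\w.-]+\.mongodb\.net/.test(uri)    // Atlas host suffix
  );
}
```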
### Create the Vector Search Index

- Go to Database → Search → Create Search Index
- Configuration Method: Visual Editor
- Database: `loan_portfolio`
- Collection: `loan_applications_demo`
- Index Name: `vector_index`
- Field Configuration:

  ```json
  {
    "fields": [
      {
        "type": "vector",
        "path": "embedding",
        "numDimensions": 1536,
        "similarity": "cosine"
      }
    ]
  }
  ```

- Click "Create Search Index"
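Once the index reports "Active", it can be queried with the `$vectorSearch` aggregation stage. A minimal TypeScript sketch (the `buildVectorSearchStage` helper and its defaults are illustrative; the index name and field path match the configuration above):

```typescript
// Build a $vectorSearch stage for the index configured above.
// Kept pure so the pipeline shape is easy to unit-test.
interface VectorSearchOptions {
  queryVector: number[];   // 1536-dimension embedding of the query text
  limit?: number;          // number of results to return
  numCandidates?: number;  // candidates to consider (tune for recall)
}

function buildVectorSearchStage(opts: VectorSearchOptions) {
  return {
    $vectorSearch: {
      index: 'vector_index',
      path: 'embedding',
      queryVector: opts.queryVector,
      numCandidates: opts.numCandidates ?? 150,
      limit: opts.limit ?? 10,
    },
  };
}

// Usage sketch against the collection above (assumes a connected client):
// const results = await client
//   .db('loan_portfolio')
//   .collection('loan_applications_demo')
//   .aggregate([
//     buildVectorSearchStage({ queryVector }),
//     { $project: { embedding: 0 } }, // drop the large vector from results
//   ])
//   .toArray();
```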
## 🔐 Environment Configuration

Create `.env.local` in your project root:

```bash
touch .env.local
```

Add the following to `.env.local`:

```bash
# 🗄️ MongoDB Atlas Configuration
MONGODB_URI=mongodb+srv://loan-admin:<password>@loan-portfolio-cluster.xxxxx.mongodb.net/loan_portfolio?retryWrites=true&w=majority

# 🤖 AI Model API Keys
OPENAI_API_KEY=sk-your-openai-api-key-here
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key-here

# 🛠️ Development Configuration
NODE_ENV=development
NEXT_PUBLIC_APP_ENV=development

# 📊 Analytics (Optional)
NEXT_PUBLIC_ANALYTICS_ID=your-analytics-id
```

### OpenAI API Key

- Visit the OpenAI Platform
- Sign in or create an account
- Go to the API Keys section
- Click "Create new secret key"
- Name: `loan-portfolio-app`
- Copy the key and add it to `.env.local`
### Anthropic API Key (Optional)

- Visit the Anthropic Console
- Sign in or create an account
- Go to API Keys
- Click "Create Key"
- Copy the key and add it to `.env.local`
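A missing variable typically surfaces only as a runtime error deep inside a request, so it can help to fail fast at startup. A small sketch (the `checkRequiredEnv` helper is illustrative, not part of the codebase; `ANTHROPIC_API_KEY` is optional and therefore not checked):

```typescript
// Return the names of required environment variables that are unset.
// Pass `process.env` (or any record) so the check is easy to test.
function checkRequiredEnv(env: Record<string, string | undefined>): string[] {
  const required = ['MONGODB_URI', 'OPENAI_API_KEY'];
  return required.filter((name) => !env[name]);
}

// Usage sketch, e.g. near the top of src/app/lib/mongodb.ts:
// const missing = checkRequiredEnv(process.env);
// if (missing.length > 0) {
//   throw new Error(`Missing environment variables: ${missing.join(', ')}`);
// }
```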
## ⚡ Quick Start

```bash
# Clone the repository
git clone <repository-url>
cd loan-portfolio-insights

# Install dependencies
npm install

# Set up environment variables
cp .env.example .env.local
# Edit .env.local with your MongoDB URI and API keys
```

Seed the database by running `seed-data.ts` from the `lib` folder, customising the sample loan data and related embeddings as needed. Then start the dev server:

```bash
npm run dev
```

🎉 Success! Open http://localhost:3000 to view the application.
## 🚀 Deployment

### Production Build

```bash
# Build for production
npm run build

# Start production server
npm start
```

Update `.env.local` for production:

```bash
NODE_ENV=production
MONGODB_URI=your-production-mongodb-uri
# Add production API keys
```

### Deploy to Vercel

- Push the code to GitHub
- Connect the repository to Vercel
- Add environment variables in the Vercel dashboard
- Deploy automatically
### Docker Deployment

```dockerfile
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
# Install all dependencies; dev dependencies are needed for `npm run build`
RUN npm ci
COPY . .
RUN npm run build
EXPOSE 3000
CMD ["npm", "start"]
```

## 🛠️ Development

### Project Structure

```
src/
├── app/
│   ├── components/          # React components
│   │   ├── LoanList.tsx     # Loan listings
│   │   ├── Dashboard.tsx    # Main dashboard
│   │   └── Charts/          # Visualization components
│   ├── lib/                 # Business logic
│   │   ├── agent.ts         # LangGraph AI agent
│   │   ├── mongodb.ts       # Database connection
│   │   ├── types.ts         # TypeScript definitions
│   │   └── seed-data.ts     # Data generation
│   ├── api/                 # API routes
│   │   ├── analyze/         # AI analysis endpoint
│   │   └── loans/           # Loan data endpoints
│   └── globals.css          # Global styles
```
### Development Commands

```bash
# Development with hot reload
npm run dev

# Type checking
npm run type-check

# Linting
npm run lint

# Production build
npm run build

# Start production server
npm start
```

### Data Schema

The application uses a unified loan schema defined in `src/app/lib/types.ts`:

```typescript
interface LoanApplication {
  _id: ObjectId;
  applicant_details: ApplicantDetails;
  loan_details: LoanDetails;
  risk_metrics: RiskMetrics;
  performance_data: PerformanceData;
  communication_logs: CommunicationLog[];
  embedding: number[]; // 1536-dimension vector
}
```

## 📊 Features

### AI-Powered Analysis

- Portfolio Overview: Real-time metrics and KPIs
- Risk Analysis: Advanced risk scoring and assessment
- Applicant Search: Semantic search across loan applications
- Performance Tracking: Loan performance analytics
- Trend Analysis: Historical pattern recognition
- Compliance Monitoring: Regulatory compliance checks
- Predictive Analytics: ML-based forecasting
### Search Capabilities

- Semantic Search: Natural language queries
- Hybrid Search: Vector + metadata filtering
- Faceted Search: Multi-dimensional filtering
- Real-time Results: Instant search responses
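Semantic and hybrid search both depend on the 1536-dimension `embedding` stored on each document. A sketch of how a loan might be flattened into text for an embeddings model (the field names and `toEmbeddingText` helper are illustrative; OpenAI's `text-embedding-3-small` returns 1536-dimension vectors by default):

```typescript
// Flatten the searchable parts of a loan document into a single string
// suitable for an embeddings API.
interface LoanTextFields {
  occupation: string;   // illustrative field names; the real schema
  loanPurpose: string;  // lives in src/app/lib/types.ts
  riskNotes: string;
}

function toEmbeddingText(loan: LoanTextFields): string {
  return [
    `Occupation: ${loan.occupation}`,
    `Purpose: ${loan.loanPurpose}`,
    `Risk notes: ${loan.riskNotes}`,
  ].join('\n');
}

// Usage sketch (requires an OpenAI client and API key):
// const res = await openai.embeddings.create({
//   model: 'text-embedding-3-small',
//   input: toEmbeddingText(loan),
// });
// doc.embedding = res.data[0].embedding; // number[] of length 1536
```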
### Visualization

- Interactive Dashboards: Real-time data visualization
- Risk Heat Maps: Visual risk distribution
- Performance Charts: Trend analysis charts
- Portfolio Metrics: KPI dashboards
## 🔍 API Documentation

### POST `/api/analyze`

Analyze the loan portfolio using the AI agent.

```bash
curl -X POST http://localhost:3000/api/analyze \
  -H "Content-Type: application/json" \
  -d '{"query": "Show me high-risk loans approved in the last month"}'
```

Response:

```json
{
  "response": {
    "type": "analysis",
    "data": [...],
    "summary": "Found 15 high-risk loans approved recently..."
  }
}
```

### GET `/api/loans`

Retrieve loan applications.

```bash
curl "http://localhost:3000/api/loans?limit=10&status=approved"
```

## 🎯 Usage Examples

### Risk Analysis

```typescript
const response = await fetch('/api/analyze', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    query: "What are the top risk factors in our current portfolio?"
  })
});
```

### Portfolio Trends

```typescript
const analysis = await fetch('/api/analyze', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    query: "Show portfolio performance trends for the last 6 months"
  })
});
```

### Semantic Applicant Search

```typescript
const search = await fetch('/api/analyze', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    query: "Find software engineers with good credit history"
  })
});
```

## 🆘 Troubleshooting

### MongoDB Connection Issues

Error: `Please add your MongoDB URI to .env.local`
Solution:
- Verify that `.env.local` exists
- Check the MongoDB URI format
- Ensure your IP address is whitelisted
- Verify the database user credentials
### OpenAI API Key Issues

Error: `OpenAI API key not found`

Solution:

- Check `.env.local` for `OPENAI_API_KEY`
- Verify the API key is valid
- Check the API key's permissions
- Restart the development server
### Vector Search Issues

Solution:
- Verify Atlas Search index is created
- Check index configuration matches schema
- Ensure embeddings are generated
- Wait for index to be ready (can take a few minutes)
### Performance Optimization

- Connection Pooling: MongoDB connections are pooled automatically
- Caching: Implement Redis caching for frequent queries
- Indexing: Ensure proper database indexes
- Batch Processing: Process large datasets in batches
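In a Next.js dev server, connection pooling works best when the `MongoClient` promise is cached at global scope, so hot reloads don't open new pools. A common sketch of the pattern (the generic `getCached` helper is illustrative; the real `src/app/lib/mongodb.ts` may differ):

```typescript
// Cache a value on globalThis under a key, so module re-evaluation during
// hot reload reuses the existing instance instead of creating a new one.
// The factory is injected, which makes the pattern testable without a
// live database.
function getCached<T>(key: string, create: () => T): T {
  const g = globalThis as unknown as Record<string, T>;
  if (!(key in g)) {
    g[key] = create();
  }
  return g[key];
}

// Usage sketch in src/app/lib/mongodb.ts:
// const clientPromise = getCached('_mongoClientPromise', () =>
//   new MongoClient(process.env.MONGODB_URI!).connect()
// );
```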
## 🤝 Contributing

We welcome contributions! Please follow these steps:

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
### Development Guidelines

- Follow TypeScript best practices
- Write tests for new features
- Update documentation
- Follow MongoDB naming conventions
- Use semantic commit messages
### MongoDB Atlas Capabilities

- Document Model: Flexible schema for loan data
- Vector Search: Semantic search capabilities
- Aggregation Pipeline: Complex analytics queries
- Atlas Search: Full-text search with faceting
- Change Streams: Real-time data updates
- Charts: Built-in data visualization
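The aggregation pipeline, for example, can roll the portfolio up by status in a single query. A sketch (the `loan_details.status` and `loan_details.amount` field paths are assumptions based on the schema above):

```typescript
// Pipeline that counts loans and sums amounts per status, largest first.
// Kept pure so the pipeline shape is easy to unit-test.
function portfolioByStatusPipeline() {
  return [
    {
      $group: {
        _id: '$loan_details.status',               // assumed field path
        count: { $sum: 1 },
        totalAmount: { $sum: '$loan_details.amount' }, // assumed field path
      },
    },
    { $sort: { totalAmount: -1 } },
  ];
}

// Usage sketch (assumes a connected client):
// const summary = await client
//   .db('loan_portfolio')
//   .collection('loan_applications_demo')
//   .aggregate(portfolioByStatusPipeline())
//   .toArray();
```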
Built with ❤️ using MongoDB Atlas
MongoDB Atlas • Next.js • LangChain • Mantine