This project is an AI-powered mental wellness assessment system that integrates voice conversation analysis with real-time gaze tracking and emotion recognition to provide a comprehensive mental health assessment report.
- Conversational AI Assessment: Engages users in empathetic conversations to evaluate mental well-being.
- Real-time Gaze Tracking: Analyzes eye movements to infer attention, focus, and cognitive states.
- Real-time Emotion Recognition: Detects and interprets facial expressions to understand emotional states.
- Comprehensive Assessment Reports: Generates detailed reports synthesizing data from voice, gaze, and emotion analysis, including personalized recommendations.
- Supportive and Empathetic AI: Designed to be a compassionate mental health assistant, avoiding clinical diagnoses.
- Gaze Tracking Service: Handles real-time eye tracking analysis (e.g., http://127.0.0.1:8001).
- Emotion Recognition Service: Processes facial expressions for emotion detection (e.g., http://127.0.0.1:8000).
- Voice Chat/TTS Service: Manages AI conversation and text-to-speech (e.g., http://127.0.0.1:8002).
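The main backend (the FastAPI app started later on port 8003) talks to these services over HTTP. The snippet below is a minimal sketch of that interaction, using the endpoint paths listed in the environment configuration further down; the request payloads and response shapes are illustrative assumptions, not the services' actual contracts.

```python
# orchestration_sketch.py -- illustrative only; payloads and response fields are assumptions.
import requests

GAZE_CAPTURE_URL = "http://127.0.0.1:8001/capture-eye-tracking"
EMOTION_API_URL = "http://127.0.0.1:8000/analyze-live-emotion"
CHAT_API_URL = "http://127.0.0.1:8002/chat"

def collect_snapshot(user_message: str) -> dict:
    """Fan out one round of data collection to the three microservices."""
    # Ask the chat service for the assistant's next reply (payload shape assumed).
    chat = requests.post(CHAT_API_URL, json={"message": user_message}, timeout=30)

    # Trigger a gaze capture and an emotion analysis for the same moment.
    gaze = requests.post(GAZE_CAPTURE_URL, timeout=30)
    emotion = requests.post(EMOTION_API_URL, timeout=30)

    return {
        "chat": chat.json(),
        "gaze": gaze.json(),
        "emotion": emotion.json(),
    }

if __name__ == "__main__":
    print(collect_snapshot("Hi, I'm feeling a bit stressed today."))
```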
- Node.js (with npm or yarn)
- Python 3.9+
- Pip (Python package installer)
git clone <repository_url>
cd ai-mental-health-therapist-app

Navigate to the backend directory:

cd backend

Install Python dependencies:

pip install -r requirements.txt

Create a .env file in the backend/app directory with the following:
GROQ_API_KEY="your_groq_api_key_here"
DEEPGRAM_API_KEY="your_deepgram_api_key_here"
PINECONE_API_KEY="your_pinecone_api_key_here"
PINECONE_INDEX_NAME=ai-agent
COHERE_API_KEY="your_cohere_api_key_here"
GAZE_CAPTURE_URL=http://127.0.0.1:8001/capture-eye-tracking
GAZE_REPORT_URL=http://127.0.0.1:8001/generate-eye-tracking-report
EMOTION_API_URL=http://127.0.0.1:8000/analyze-live-emotion
CHAT_API_URL=http://127.0.0.1:8002/chat
TRANSCRIPT_GET_URL=http://127.0.0.1:8002/transcript
SESSION_API_URL=http://127.0.0.1:8003/start-session
REPORT_GET_URL=http://127.0.0.1:8003/get-report
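A common way to expose these values to the FastAPI app is python-dotenv plus os.getenv. The sketch below shows that pattern with the variable names from the example above; it is not necessarily how this project's config module is actually written.

```python
# config_sketch.py -- one way to load the values from backend/app/.env.
import os
from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # loads variables from the nearest .env file into the environment

GROQ_API_KEY = os.getenv("GROQ_API_KEY")
DEEPGRAM_API_KEY = os.getenv("DEEPGRAM_API_KEY")
PINECONE_API_KEY = os.getenv("PINECONE_API_KEY")
PINECONE_INDEX_NAME = os.getenv("PINECONE_INDEX_NAME", "ai-agent")
COHERE_API_KEY = os.getenv("COHERE_API_KEY")

# Microservice endpoints fall back to the defaults listed above.
GAZE_CAPTURE_URL = os.getenv("GAZE_CAPTURE_URL", "http://127.0.0.1:8001/capture-eye-tracking")
EMOTION_API_URL = os.getenv("EMOTION_API_URL", "http://127.0.0.1:8000/analyze-live-emotion")
CHAT_API_URL = os.getenv("CHAT_API_URL", "http://127.0.0.1:8002/chat")

if not GROQ_API_KEY:
    raise RuntimeError("GROQ_API_KEY is missing; check backend/app/.env")
```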
If the microservices are not running on the default hosts and ports shown above, update the corresponding URL values in the .env file.
Run the FastAPI backend:
uvicorn app.main:app --reload --port 8003

(Ensure the gaze tracking, emotion recognition, and voice chat/TTS microservices are also running on their respective ports, e.g., 8000, 8001, and 8002.)
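Before moving on to the frontend, you can quickly confirm that all four services are reachable. The check below assumes each one is a FastAPI app that still serves its default /docs page; adjust the paths if that page is disabled.

```python
# check_services.py -- quick readiness check before opening the frontend.
import requests

SERVICES = {
    "emotion recognition": "http://127.0.0.1:8000/docs",
    "gaze tracking": "http://127.0.0.1:8001/docs",
    "voice chat / TTS": "http://127.0.0.1:8002/docs",
    "main backend": "http://127.0.0.1:8003/docs",
}

for name, url in SERVICES.items():
    try:
        status = requests.get(url, timeout=3).status_code
        print(f"{name:>22}: HTTP {status}")
    except requests.ConnectionError:
        print(f"{name:>22}: not reachable at {url}")
```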
Open a new terminal and navigate to the frontend directory:

cd ../frontend

Install Node.js dependencies:

npm install
# or yarn install

Start the React development server:

npm start
# or yarn start

The frontend application should open in your browser at http://localhost:3000.
- Ensure all backend services (main API, gaze, emotion, voice chat) are running.
- Open the frontend application in your browser.
- Click "Start Assessment" to begin the AI-guided conversation.
- Allow webcam and microphone access when prompted.
- Engage in conversation with the AI assistant.
- Click "End Session" to generate the comprehensive report.
For any questions or issues, please open an issue in the repository.