A Retrieval-Augmented Generation (RAG) system that lets you ask questions about Tim Walz and receive concise fact-checking answers with sources. It uses a FAISS index to retrieve relevant context from pre-collected statements and a locally saved language model (via Hugging Face Transformers) to generate responses. The user interface is built with Streamlit.
- FAISS Retrieval: Efficiently retrieves the most relevant context statements using vector similarity search.
- Local LLM: Generates fact-checking answers with a locally saved Hugging Face Transformers model.
- Streamlit Chat Interface: Provides an interactive, chat-style UI for asking questions and viewing responses.
- Concise Output: Processes the generated answer to display only the final YES/NO decision along with one key supporting statement.
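These pieces fit together roughly as in the sketch below. The file names (`faiss_index.bin`, `statements.pkl`, `my_local_model/`) follow the repository layout; the helper name `fact_check`, the embedding model, and the prompt wording are illustrative assumptions, not the project's exact implementation.

```python
# Minimal sketch of the retrieval + generation flow (assumed details noted above).
import pickle

import faiss
from sentence_transformers import SentenceTransformer
from transformers import pipeline

index = faiss.read_index("faiss_index.bin")
with open("statements.pkl", "rb") as f:
    statements = pickle.load(f)

embedder = SentenceTransformer("all-MiniLM-L6-v2")   # assumed embedding model
generator = pipeline("text-generation", model="my_local_model")

def fact_check(question: str, k: int = 3) -> str:
    # Embed the question and retrieve the k most similar statements.
    query_vec = embedder.encode([question])
    _, ids = index.search(query_vec, k)
    context = "\n".join(statements[i] for i in ids[0])

    # Ask the local model for a YES/NO verdict grounded in the retrieved context.
    prompt = (
        f"Context:\n{context}\n\n"
        f"Question: {question}\n"
        "Answer YES or NO and cite one supporting statement."
    )
    result = generator(prompt, max_new_tokens=80, return_full_text=False)
    return result[0]["generated_text"].strip()
```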
- Clone the repository
git clone https://github.com/yourusername/fact-checker.git
- Install dependencies
pip install -r requirements.txt
- Run `build_index.py`, the script that creates and saves an efficient similarity-search index using FAISS:
python build_index.py
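Under the hood, `build_index.py` does roughly the following. This is a sketch: the input file name (`statements.txt`) and the embedding model are assumptions, but the outputs match the files the app expects.

```python
# Sketch: embed the collected statements and persist a FAISS index plus the raw texts.
import pickle

import faiss
from sentence_transformers import SentenceTransformer

# Assumed input: one statement per line.
with open("statements.txt", encoding="utf-8") as f:
    statements = [line.strip() for line in f if line.strip()]

embedder = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = embedder.encode(statements, convert_to_numpy=True)

# A flat L2 index is the simplest exact-search option in FAISS.
index = faiss.IndexFlatL2(embeddings.shape[1])
index.add(embeddings)

faiss.write_index(index, "faiss_index.bin")
with open("statements.pkl", "wb") as f:
    pickle.dump(statements, f)
```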
- Run `save_model.py`, which downloads a model and tokenizer from Hugging Face and stores them in a local directory (`my_local_model/`):
python save_model.py
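A sketch of what `save_model.py` does; the model ID shown here is an assumption, so substitute the one the project actually uses.

```python
# Sketch: download a model and tokenizer from the Hugging Face Hub and save them locally.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "distilgpt2"        # assumed; replace with the project's model
LOCAL_DIR = "my_local_model"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

tokenizer.save_pretrained(LOCAL_DIR)
model.save_pretrained(LOCAL_DIR)
```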
- Launch the app (`app.py`, which uses `fact_checker.py`) with Streamlit from the project directory:

streamlit run app.py
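The chat layer in `app.py` might look roughly like the sketch below, assuming `fact_checker.py` exposes a `fact_check(question)` function like the one sketched earlier; both names are assumptions about the project's interface.

```python
# Sketch of a Streamlit chat UI that replays history and answers new questions.
import streamlit as st

from fact_checker import fact_check  # assumed interface

st.title("Tim Walz Fact Checker")

if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

if question := st.chat_input("Ask a question about Tim Walz"):
    st.session_state.messages.append({"role": "user", "content": question})
    with st.chat_message("user"):
        st.write(question)

    answer = fact_check(question)
    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.write(answer)
```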
Confirm that the project directory contains the necessary files (`faiss_index.bin`, `statements.pkl`, `my_local_model/`). If not, follow the instructions above to generate or download these files.
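An optional, illustrative way to check for these files before launching the app:

```python
# Quick check that the required artifacts are present in the project directory.
from pathlib import Path

for required in ("faiss_index.bin", "statements.pkl", "my_local_model"):
    status = "ok" if Path(required).exists() else "MISSING"
    print(f"{required}: {status}")
```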
Contributions are welcome!