---
title: PromptPandemic Dynamic Form Builder
emoji: 🧙♀️
colorFrom: pink
colorTo: purple
sdk: streamlit
sdk_version: 1.36.0
app_file: app.py
pinned: false
---

# PromptPandemic Dynamic Form Builder
Build. Bond. Breakthrough. (Vibe Coding Hackathon Entry)
This project is an AI-first application built for the PromptPandemic Vibe Coding Hackathon. It demonstrates the ability to translate unstructured natural language requests into a strictly structured, functional web form instantly.
The core innovation is connecting a local Large Language Model (Llama 3) to a dynamic UI generator (Streamlit) using Pydantic for schema enforcement. This allows the system to:
- Generate Forms: Convert descriptions like "A club sign-up form with name, email, and t-shirt size options" into a working web form.
- Handle Contradictions: Detect logical flaws in the user's request (e.g., "anonymous, but collect a phone number") and politely ask for clarification instead of failing.
- Dynamic Validation: Infer validation rules (required, min_length, email_format) from the prompt and enforce them client-side (see the schema sketch after this list).
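The structured contract between the LLM and the UI could look roughly like the Pydantic models below. This is a minimal sketch: the class and field names (`FormSchema`, `FormField`, `needs_clarification`) are illustrative assumptions, not the project's actual schema.

```python
# Illustrative sketch only: names and fields are assumptions, not the app's real schema.
from typing import List, Optional
from pydantic import BaseModel, Field


class FormField(BaseModel):
    """One input in the generated form."""
    label: str                           # e.g. "Email Address"
    field_type: str = Field(description="text, email, number, select, ...")
    required: bool = False
    min_length: Optional[int] = None     # validation rules inferred from the prompt
    options: Optional[List[str]] = None  # choices for select-style fields


class FormSchema(BaseModel):
    """Structured output the LLM must conform to."""
    title: str
    fields: List[FormField] = []
    needs_clarification: bool = False    # set when the request contradicts itself
    clarification_question: Optional[str] = None
```

Because the LLM's output is validated against a model of this kind, a contradictory request can surface as `needs_clarification=True` plus a follow-up question rather than a malformed form.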
This application is designed to run completely locally using your machine's GPU for inference (Apple Silicon M-series recommended).
You must have the following installed:
- Python 3.9+
- Ollama: The easiest way to run local LLMs.
  - Installation: Download the installer for macOS or Linux from the Ollama website.
- Clone the Repository

  ```bash
  git clone https://github.com/YourUsername/PromptPandemic-DynamicFormBuilder.git
  cd PromptPandemic-DynamicFormBuilder
  ```

- Install Python Dependencies

  Create and activate a virtual environment, then install the necessary Python packages:

  ```bash
  python3 -m venv .venv
  source .venv/bin/activate
  pip install -r requirements.txt
  ```
- Download the Llama 3 Model (CRITICAL)

  This project requires the Llama 3 model to be running locally.

  - Start the Ollama Server: Open a separate terminal window and run this command. Keep this window open for the entire time you are using the app.

    ```bash
    ollama serve
    ```

  - Pull the Model: In another terminal window, download the model:

    ```bash
    ollama pull llama3
    ```

- Run the Application

  Once the model pull is complete and the Ollama server is running, you can launch the app:

  ```bash
  streamlit run app.py
  ```
Your browser will automatically open the application at http://localhost:8501.
- Natural Language Form Generation: Describe the form you need, and AI creates it
- Dynamic Form Rendering: Automatically renders forms with appropriate input fields (see the sketch after this list)
- Data Collection & Storage: Captures and stores form submissions
- Interactive Admin Dashboard: Visualize submission data with charts and graphs
- AI-Powered Insights: Get intelligent analysis of your collected data
- Modern UI/UX: Clean, responsive dark-themed interface
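To make the dynamic rendering idea above concrete, here is a rough sketch of how a generated schema could be walked and mapped onto Streamlit widgets. The dictionary shape, field names, and widget mapping are assumptions for illustration; the app's internal representation may differ.

```python
import streamlit as st

# Hypothetical shape of a generated schema; the real app's structure may differ.
schema = {
    "title": "Club Sign-Up",
    "fields": [
        {"label": "Name", "field_type": "text", "required": True},
        {"label": "Email", "field_type": "email", "required": True},
        {"label": "T-Shirt Size", "field_type": "select", "options": ["S", "M", "L", "XL"]},
    ],
}

with st.form("generated_form"):
    st.subheader(schema["title"])
    responses = {}
    for field in schema["fields"]:
        label = field["label"]
        if field["field_type"] == "select":
            responses[label] = st.selectbox(label, field.get("options", []))
        else:  # text, email, etc. fall back to a text input
            responses[label] = st.text_input(label)
    submitted = st.form_submit_button("Submit")

if submitted:
    # Client-side checks inferred from the schema, e.g. required fields
    missing = [f["label"] for f in schema["fields"]
               if f.get("required") and not responses.get(f["label"])]
    if missing:
        st.error(f"Please fill in: {', '.join(missing)}")
    else:
        st.success("Submission captured!")
```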
- Frontend: Streamlit with custom CSS styling
- AI Integration: LangChain with Ollama (Llama 3); see the wiring sketch after this list
- Data Visualization: Plotly Express
- Data Handling: Pandas with CSV storage
- Authentication: Simple password protection for admin area
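To give a sense of how the LangChain and Ollama layer might be wired, here is a minimal sketch using the community `Ollama` wrapper and a plain prompt template. The prompt wording, and the idea of asking for JSON that is then validated against the Pydantic schema, are assumptions about the approach rather than the project's exact code.

```python
# Minimal sketch of the LangChain + Ollama wiring; prompt wording is an assumption.
from langchain_community.llms import Ollama
from langchain_core.prompts import PromptTemplate

# Assumes `ollama serve` is running locally and the llama3 model has been pulled.
llm = Ollama(model="llama3", temperature=0)

prompt = PromptTemplate.from_template(
    "You are a form designer. Return ONLY a JSON object describing the form "
    "for this request, with a title and a list of fields:\n\n{request}"
)

chain = prompt | llm  # LCEL: render the prompt, then call the model

raw = chain.invoke(
    {"request": "A club sign-up form with name, email, and t-shirt size options"}
)
print(raw)  # JSON text that can then be validated against the Pydantic schema
```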
- Python 3.8+
- Ollama with Llama 3 model installed
- Required Python packages (see Installation)
- Clone the repository:

  ```bash
  git clone https://github.com/yourusername/formforge.git
  cd formforge
  ```

- Install required packages:

  ```bash
  pip install streamlit pandas plotly langchain-community langchain-core
  ```

- Install Ollama and pull the Llama 3 model:

  ```bash
  # Follow instructions at https://ollama.com to install Ollama
  ollama pull llama3
  ```
- Enter a natural language description of your desired form
- Click "Generate Form" to create your form
- The AI will analyze your request and generate appropriate form fields
- Navigate to the Admin Dashboard from the navigation menu
- Enter the password (default: "hackathon2025")
- View submission data, charts, and AI-generated insights
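The dashboard's charts are built with Pandas and Plotly Express on top of the CSV store. As a rough sketch of how that might look (the file name `submissions.csv` and the column names are assumptions, not necessarily the app's real ones):

```python
import pandas as pd
import plotly.express as px
import streamlit as st

# Hypothetical storage path and column names; the real app may differ.
df = pd.read_csv("submissions.csv")

st.metric("Total submissions", len(df))

# Example: distribution of one categorical answer, if that column exists
if "T-Shirt Size" in df.columns:
    counts = df["T-Shirt Size"].value_counts().reset_index()
    counts.columns = ["size", "count"]
    fig = px.bar(counts, x="size", y="count", title="T-Shirt Size Distribution")
    st.plotly_chart(fig, use_container_width=True)
```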
This application uses a simple password mechanism for the admin area. For production use, implement proper authentication and secure your data appropriately.
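A slightly safer pattern than a hard-coded string is reading the admin password from Streamlit's secrets store. This is a sketch, assuming a hypothetical `admin_password` key defined in `.streamlit/secrets.toml`:

```python
import streamlit as st

# Hypothetical secret key; define it in .streamlit/secrets.toml, e.g.
#   admin_password = "change-me"
entered = st.text_input("Admin password", type="password")

if entered and entered == st.secrets.get("admin_password", ""):
    st.success("Access granted")
    # ... render the admin dashboard here ...
else:
    st.stop()  # halt the page for anyone who has not authenticated
```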
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.