---
title: "GSoC’25 Week 09 Update by Om Santosh Suneri"
excerpt: "AI-powered Debugger for Music Blocks"
category: "DEVELOPER NEWS"
date: "2025-08-04"
slug: "2025-08-04-gsoc-25-omsuneri-week09"
author: "@/constants/MarkdownFiles/authors/om-santosh-suneri.md"
tags: "gsoc25,sugarlabs,week09,Debugger,AI,Music Blocks"
image: "assets/Images/GSOC.png"
---

<!-- markdownlint-disable -->
# Week 09 Progress Report by Om Santosh Suneri

**Project:** [AI-powered Debugger for Music Blocks](https://github.com/omsuneri/AI-powered-Debugger-for-Music-Blocks)
**Mentors:** [Walter Bender](https://github.com/walterbender/), [Sumit Srivastava](https://github.com/sum2it)
**Assisting Mentor:** [Devin Ulibarri](https://github.com/pikurasa/)
**Reporting Period:** 2025-07-28 - 2025-08-03
---

## Goal for This Week

The primary goal for this week was to **embed the AI-powered Debugger directly into the Music Blocks UI** so that users, especially kids, could debug their projects *without leaving the app*. This meant replacing the previously external Streamlit interface with a fully functional, session-aware UI widget inside the Music Blocks canvas, connected to the FastAPI backend.
---

## This Week’s Achievements

### Introduction

Until now, users had to export their Music Blocks project and visit an external site to use the AI Debugger. This week marks a key turning point: **the debugger is now available as a draggable widget inside the Music Blocks interface**. With a fully integrated chat UI, a live connection to the backend, and context-rich messaging, the experience has become smoother and more educational.

This change not only improves usability but also opens the door to better debugging workflows, especially for children and educators.

### What I Did

Here’s a breakdown of what was implemented and how:
#### 1. **Custom Music Blocks Widget**

* Developed a new **UI widget** inside Music Blocks.
* The widget supports a **session-aware chat interface**: users can ask the AI for help, and the assistant retains context across messages.
* Integrated controls like:
  * Reset conversation
  * Download session history
  * Minimize/maximize
  * Re-analyze current code
#### 2. **Connected to FastAPI Backend**

* The widget sends the current project’s JSON representation to the FastAPI backend.
* The backend runs a custom module (`convert_music_blocks`) that:
  * Extracts block types, structure, and key events from the Music Blocks JSON.
  * Converts that into a **natural-language program summary**.
* Context (code + prompt history) is passed to Gemini via a modular prompt manager.
* Gemini responds in a **teacher-like tone** based on prompt count (see the sketch after this list):
  * First prompt: friendly, curious, Socratic
  * Later prompts: more directive and explicit
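
To make this flow concrete, here is a minimal, illustrative sketch of what such an `/analyze` endpoint could look like. The request fields, the simplified `convert_music_blocks` stub, and the `build_prompt` helper are assumptions for the example, not the project’s actual implementation.

```python
# Minimal sketch of an /analyze endpoint (illustrative; helper and field
# names such as convert_music_blocks and build_prompt are assumptions).
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class AnalyzeRequest(BaseModel):
    project_json: dict          # exported Music Blocks project
    prompt: str                 # the user's question
    prompt_number: int = 1      # used to adapt the assistant's tone
    history: list[dict] = []    # prior messages (speaker, text, timestamp)

def convert_music_blocks(project_json: dict) -> str:
    """Placeholder: turn the block JSON into a natural-language summary."""
    block_count = len(project_json.get("blocks", []))
    return f"A project with {block_count} top-level blocks."

def build_prompt(summary: str, req: AnalyzeRequest) -> str:
    # Early prompts get a Socratic, question-first tone; later ones are direct.
    tone = ("Ask guiding questions first." if req.prompt_number <= 1
            else "Give direct, explicit suggestions.")
    return f"{tone}\n\nProgram summary:\n{summary}\n\nStudent asks: {req.prompt}"

@app.post("/analyze")
def analyze(req: AnalyzeRequest):
    summary = convert_music_blocks(req.project_json)
    full_prompt = build_prompt(summary, req)
    # The real backend forwards full_prompt (plus history) to Gemini here.
    return {"summary": summary, "prompt_sent": full_prompt}
```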
#### 3. **Conversation Session Management**

* Sessions are now **stateless on the frontend but tracked via a conversation ID**.
* Every prompt includes metadata (a sketch of this payload follows the list):
  * `prompt_number`
  * `history` (with speaker names and timestamps)
  * System prompt fingerprint
* This ensures that users get progressively more targeted help and avoids repeated suggestions.
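
For illustration, the per-prompt metadata might be assembled roughly like this; the field names and the fingerprinting scheme are assumptions, not the project’s exact schema.

```python
# Illustrative sketch of per-prompt session metadata (field names assumed).
import hashlib
from datetime import datetime, timezone

SYSTEM_PROMPT = "You are a friendly Music Blocks debugging helper for kids."

def build_metadata(conversation_id: str, prompt_number: int,
                   history: list[dict], user_prompt: str) -> dict:
    now = datetime.now(timezone.utc).isoformat()
    return {
        "conversation_id": conversation_id,
        "prompt_number": prompt_number,
        # Each history entry records who spoke and when, so the backend can
        # reconstruct the conversation without server-side session state.
        "history": history + [
            {"speaker": "user", "text": user_prompt, "timestamp": now}
        ],
        # A short fingerprint lets the backend detect system-prompt changes.
        "system_prompt_fingerprint": hashlib.sha256(
            SYSTEM_PROMPT.encode()
        ).hexdigest()[:12],
    }
```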
#### 4. **CORS + JSON Fixes + Robust API Calls**

* Added full CORS support for local and production deployments (see the configuration sketch after this list).
* Built fallback messages and UI alerts for:
  * Invalid JSON exports
  * Backend timeouts
  * Gemini failures
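
Below is a hedged sketch of how the CORS setup and the invalid-JSON fallback could look on the FastAPI side; the allowed origins and error messages are examples only.

```python
# Sketch of the CORS setup plus a defensive error path (origins are examples).
from fastapi import FastAPI, HTTPException, Request
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

app.add_middleware(
    CORSMiddleware,
    allow_origins=[
        "http://localhost:3000",              # local dev server (example)
        "https://musicblocks.sugarlabs.org",  # production (example)
    ],
    allow_methods=["POST"],
    allow_headers=["*"],
)

@app.post("/analyze")
async def analyze(request: Request):
    try:
        payload = await request.json()
    except ValueError:
        # Mirrors the "invalid JSON export" fallback surfaced in the widget.
        raise HTTPException(status_code=400,
                            detail="Invalid Music Blocks JSON export.")
    # ... convert the project, prompt Gemini, and return the reply here ...
    return {"status": "ok"}
```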
---

### Preview

<a href=""><img src="https://i.ibb.co/VYhCQzjL/Screenshot-2025-07-29-at-9-45-19-PM.png" alt="Music Blocks Debugger"/></a>

Here’s what the new AI Debugger UI inside Music Blocks includes:

* A draggable debugger block in the block palette
* A chat interface with:
  * Conversation bubbles
  * Reset options
* One-click analysis of the current project (auto-export + backend call)
* A system prompt designed for kids: friendly and context-aware
Here’s a sample flow:

1. The user opens the AI Debugger widget.
2. The debugger auto-loads the current project code.
3. The debugger types out a friendly greeting and a first look at the project.
4. The user types: “Why is the melody not playing after this repeat block?”
5. Gemini (via FastAPI) responds: “It seems you’ve placed the melody inside a block that never runs. Try moving it outside the repeat loop.”

And it all happens **without leaving Music Blocks**.

---
### Why It Matters

This integration transforms the debugging experience:

* **No context switching** → learners stay inside Music Blocks.
* **Seamless feedback loop** → real-time guidance as they build music.
* **Age-appropriate design** → kids get hints, not just fixes.
* **Educator-friendly** → it can be used in classrooms without external tools.

From an architectural point of view:

* The modular backend and frontend now follow a **clean interface contract** (the `/analyze` API; a small client-side usage sketch follows this list).
* The design is future-proof: it can support multiple LLMs, save sessions to the cloud, or run offline with local models.
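
For example, any client could exercise the same `/analyze` contract with a plain HTTP call; the URL, port, and payload fields below are illustrative assumptions.

```python
# Illustrative client-side call against the /analyze contract
# (URL and field names are assumptions for the example).
import requests

payload = {
    "project_json": {"blocks": []},   # exported Music Blocks project
    "prompt": "Why is my melody silent after the repeat block?",
    "prompt_number": 1,
    "history": [],
}

resp = requests.post("http://localhost:8000/analyze", json=payload, timeout=30)
resp.raise_for_status()
print(resp.json())
```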
---
### Final Thoughts

This week’s milestone is a big leap towards **making Music Blocks truly AI-augmented**. Embedding the debugger inside the platform not only enhances the UX but also lays the groundwork for advanced features like intelligent code suggestions.

As a developer, this week helped me deeply understand:

* How to write modular UI widgets in Music Blocks
* How to structure backend APIs for interactive apps
* How to balance UX for kids with technical depth in LLM prompts

---
## Next Week’s Roadmap

**Refining the debugger widget based on feedback from the community and mentors, and deploying the backend for seamless integration.**
## Resources & References

- **Repository:** [JSON to Text representation](https://github.com/omsuneri/JSON-to-Text-representation)
- **Repository:** [AI-powered Debugger for Music Blocks](https://github.com/omsuneri/AI-powered-Debugger-for-Music-Blocks)
- **Debugger Streamlit App:** [Music Blocks Debugger](https://debuggmb.streamlit.app/)
- **Directory for Projects:** [Embedding Project Set](https://github.com/omsuneri/AI-powered-Debugger-for-Music-Blocks/tree/main/data/docs)
## Acknowledgments

Grateful as always to my mentors and the Sugar Labs community for their thoughtful feedback, patience, and encouragement as I shape this into a usable tool for learners.

---
