---
title: "GSoC '25 Week 05 Update by Shubham Singh"
excerpt: "Building and testing out the Image to video player"
category: "DEVELOPER NEWS"
date: "2025-07-05"
slug: "2025-07-05-gsoc-25-firepheonix-week05"
author: "@/constants/MarkdownFiles/authors/shubham-singh.md"
tags:
  - gsoc25
  - sugarlabs
  - week05
  - firepheonix
image: "assets/Images/GSOC.png"
---

<!-- markdownlint-disable -->

# Week 5 Progress Report by Shubham Singh

**Project:** [Color Sensor for Music Blocks](https://github.com/sugarlabs/musicblocks/issues/4537)
**Mentors:** [Devin Ulibarri](https://github.com/pikurasa), [Walter Bender](https://github.com/walterbender)
**Reporting Period:** 2025-07-01 – 2025-07-07

---

## Goals for This Week

- Fix more of the scanning issues I've noticed.
- Implement dynamic input of notes, as in the phrase maker.

---

## This Week's Achievements

1. **Implemented dynamic input of notes, as in the phrase maker.**
   - The great thing about Music Blocks is that if you want to implement something, it's most likely already there. Haha. I scanned through the entire phrase maker code and found the part responsible for how its input is handled. Since I had already extended the LegoBricksBlock class from the StackClampBlock class, the input already worked; the print block I had wired in earlier was only a placeholder for the future.
   - It turns out everything fell right into place: I figured out not just the input but also how the phrase maker produces its output, and I implemented the same approach in my own LegoBrick.js. I now completely understand how the output works.
| 38 | +  |
| 39 | + |
   <iframe width="800" height="405" src="https://www.youtube.com/embed/ObNYq29QHZw?si=uoYPchoQUhbEnmY4" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>


2. **Implemented the complete phrase maker pitch functionality by adding PitchBlocks.js.**
   - I figured out how to configure the notes, but struggled with the hardcoded octave value, which defaults to 4.
   - So I had to manually go over the entire pitch-blocks.js and a few other files. After making adjustments inside some of them, I was finally able to get the octave as well. (A small illustrative sketch of this note-and-octave mapping appears after this list.)

| 47 | +  |

   <iframe width="800" height="405" src="https://www.youtube.com/embed/XpgFTjimPyc?si=fi9HxuN5o8BOP26J" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>

3. **Finally: testing out Devin's CMK'24 project.**
   - I finally did what I've always wanted to do. It was Devin's experiments that inspired me to take up this project in core JavaScript for Music Blocks. So, here's a quick summary, haha:

   - During the CMK workshop, Devin undertook a hands-on project titled “Lego Musical Notation for the Blind,” which originated as one of his own suggested ideas. The goal of the project was to create an accessible, tactile music notation system using Lego bricks that could help blind or visually impaired individuals read and compose music through touch. Devin collaborated with Jamie Chelel from MIT’s K12 Maker Lab, and together they quickly developed a physical prototype. The system used a large Lego baseplate as a grid, where the vertical axis represented pitch and the horizontal axis represented time. Lego bricks of different lengths were used to symbolize note durations, while their positions on the grid indicated pitch. They even designed a clef-like marker block to anchor reference pitches and help users navigate the register, similar to how a treble or bass clef works in traditional notation.

| 56 | +  |

   - After building the basic physical model, Devin shifted focus toward digitizing the system. He aimed to convert the tactile notation into a format that could be played back, edited, or exported to digital music platforms like MIDI or Music Blocks. To accomplish this, Devin explored using Scratch, despite only having limited prior experience with the platform. He investigated ways to use Scratch’s pixel color detection to scan an image of the Lego grid and extract musical data. After several attempts with image avatars and backgrounds, Devin found success using a live webcam feed, allowing Scratch’s sprite (the "glyph") to visually scan the Lego structure in real-time. He wrote scripts for the sprite to detect notes first vertically (to determine pitch) and then horizontally (to determine timing), and made the system flexible enough to accommodate any number of pitches or scales. Though Devin managed to create a functional prototype of this scanning system, he encountered challenges when trying to assign precise, meaningful pitches within Scratch’s limitations. Despite not fully resolving that technical hurdle, his work represented a creative and technically ambitious attempt to bridge physical, accessible music tools with computational music platforms.

   <iframe title="vimeo-player" src="https://player.vimeo.com/video/983707992?h=0b765ba25a" width="800" height="405" frameborder="0" allowfullscreen></iframe>

   - Now, as you can read in my previous week 04 blog, I was able to completely overcome the technical hurdle of not being able to detect colors with very high accuracy from static images.

   - And finally, here's what the scanning result looks like.

| 66 | +  |
| 67 | + |
| 68 | +  |

   - I've realized that the output is coming out one row lower than expected; the print functionality is also automatically shifting the rows down by one. I'm still left with some adjustments before the result shows up correctly across all the blocks.

   - This still won't cut it; it has to be more accurate than this. I'll keep optimizing, but starting next week I'll move on to the next phase, which is music generation.

| 74 | +  |
| 75 | + |
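To make the ideas above concrete, here is a minimal, self-contained sketch of how a scanned Lego grid could be turned into note-and-octave pairs, in the spirit of the phrase maker: rows map to pitches, columns map to time steps, and the octave is a parameter instead of a hardcoded 4. This is an illustration only; names like `ROW_TO_NOTE`, `DEFAULT_OCTAVE`, and `gridToNotes` are mine and not part of the Music Blocks codebase.

```javascript
// Hypothetical sketch: convert a scanned Lego grid into (note, octave) pairs.
// grid[row][col] is true where the color scan detected a brick.
// ROW_TO_NOTE and DEFAULT_OCTAVE are illustrative names, not Music Blocks APIs.

const ROW_TO_NOTE = ["do", "re", "mi", "fa", "sol", "la", "ti"]; // bottom row = "do"
const DEFAULT_OCTAVE = 4; // the default value I had to dig out of pitch-blocks.js

function gridToNotes(grid, baseOctave = DEFAULT_OCTAVE) {
    const notes = [];
    const numRows = grid.length;
    for (let col = 0; col < grid[0].length; col++) {
        for (let row = 0; row < numRows; row++) {
            if (!grid[row][col]) continue;
            // Count rows from the bottom so lower rows give lower pitches,
            // wrapping into the next octave every seven rows.
            const fromBottom = numRows - 1 - row;
            const note = ROW_TO_NOTE[fromBottom % ROW_TO_NOTE.length];
            const octave = baseOctave + Math.floor(fromBottom / ROW_TO_NOTE.length);
            notes.push({ col, note, octave });
        }
    }
    return notes;
}

// Example: a 3x4 grid with two bricks detected.
const scanned = [
    [false, false, true, false],
    [true, false, false, false],
    [false, false, false, false],
];
console.log(gridToNotes(scanned));
// -> [ { col: 0, note: 're', octave: 4 }, { col: 2, note: 'mi', octave: 4 } ]
```

A row-from-the-bottom calculation like this is exactly where an off-by-one, such as the one-row shift I mentioned above, can sneak in, so this mapping is the first place I check when the printed output looks displaced.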
---

## Challenges & How I Overcame Them

- **Challenge:** Reading through multiple interconnected files.
  **Solution:** No shortcut here; I just did it the old-fashioned, long way, and read some documentation along the way.
- **Challenge:** A lot of college-related work coming up in the next week.
  **Solution:** Stayed up for two or three sleepless nights to get more time? Yeah, that's what I did.

---

## Key Learnings

- If you're building something, say a full-stack website with a frontend and a backend, you shouldn't vibe-code the whole thing with AI and THEN run into errors. Create a basic frontend, send POST requests with some tool, then connect the backend. The same applied here: I built the notes-input functionality first, then the pitch. Build in steps and plan it out better; even better, use an LLM to plan it out for you step by step.
- I tried to learn a LOT of stuff this week. I'm learning how core JavaScript itself works, and it's an amazing opportunity. I never knew about browser storage concepts, or, for that matter, practical use cases of time complexity, beforehand. I'm learning in real depth, and it's crazy good.

---

## Next Week's Roadmap

- Now we're getting to the main part: producing musical sounds from the printed output. I still have to figure out an approach, and alongside that I also have college-related work to get done.
- Figuring out when music production should START. My mentor Walter suggested that the music should start playing the moment the algorithm bumps into the FIRST color change away from green; that's the START point. A rough sketch of this idea follows this list.
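
As a rough sketch of Walter's suggestion, and assuming the scan produces one dominant color per column, finding that start point could look something like the following. The `columnColors` input and the `"green"` baseline are assumptions for illustration, not the actual scanning code.

```javascript
// Hypothetical sketch: find the column where playback should start, i.e. the
// first column whose scanned color differs from the green baseplate.

const BASEPLATE_COLOR = "green";

function findStartColumn(columnColors) {
    // columnColors[i] is the dominant color detected in column i of the scan.
    // Returns -1 if the whole grid is still bare baseplate.
    return columnColors.findIndex((color) => color !== BASEPLATE_COLOR);
}

// Example: the first three columns are empty baseplate, so music starts at column 3.
console.log(findStartColumn(["green", "green", "green", "red", "yellow"])); // -> 3
```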

---

## Resources & References

- **Nothing much; the Music Blocks documentation sufficed.**

---

## Acknowledgments

Thank you to my mentors [Walter Bender](https://github.com/walterbender) and [Devin Ulibarri](https://github.com/pikurasa) for their invaluable guidance throughout this development phase. I like how Devin actually reads everyone's blogs every single week. He's an inspiring person.
P.S. If you're reading this blog, Devin, I hope you're enjoying the details.

---