For my senior capstone project, I worked with my classmates Josh Rapp and Shayne Hubbard to design a low-latency, non-invasive system that converts an upright acoustic piano into a MIDI controller.
This was the third iteration of the Steinway project. This year we focused on the system's front end: transforming the pianist's playing into MIDI data while preserving the nuances of the performance. By placing force sensors at the hammers themselves, we dramatically improved the system's dynamic range while keeping the total latency on the order of one millisecond. Check out our project in action:
My role in the project was developing the analog hardware that conditions the raw sensor signals so the microcontroller can interpret them cleanly. The FSR circuit outputs are read by the PIC32's analog pins to determine the timing and velocity of each triggered note, and the piezo signals are read by the digital pins to indicate the note-off. A second microcontroller translates the data from the first into MIDI messages, which can drive a synthesizer or even notate sheet music in real time.
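To make that data path a little more concrete, here's a minimal sketch in plain C (not our actual PIC32 firmware) of how a peak FSR reading might be scaled to a MIDI velocity and packed into standard note-on/note-off messages. The 10-bit ADC range, the linear scaling curve, and the function names are illustrative assumptions; only the 3-byte MIDI message layout is standard.

```c
/* Hypothetical sketch: mapping a hammer-sensor (FSR) reading to a MIDI note-on,
 * and a piezo release event to a MIDI note-off. The ADC resolution, scaling,
 * and helper names are illustrative assumptions, not the project's firmware. */
#include <stdio.h>
#include <stdint.h>

#define ADC_MAX      1023u   /* assume a 10-bit ADC reading from the FSR circuit */
#define MIDI_CHANNEL 0u      /* MIDI channel 1 (channels are 0-indexed on the wire) */

/* Scale the peak FSR reading linearly onto the 1..127 MIDI velocity range. */
static uint8_t fsr_to_velocity(uint16_t adc_peak)
{
    uint32_t v = ((uint32_t)adc_peak * 126u) / ADC_MAX + 1u;
    return (v > 127u) ? 127u : (uint8_t)v;
}

/* Pack a 3-byte MIDI note-on message: status, note number, velocity. */
static void midi_note_on(uint8_t note, uint8_t velocity, uint8_t msg[3])
{
    msg[0] = 0x90 | MIDI_CHANNEL;
    msg[1] = note & 0x7F;
    msg[2] = velocity & 0x7F;
}

/* Pack a 3-byte MIDI note-off message (release velocity left at zero). */
static void midi_note_off(uint8_t note, uint8_t msg[3])
{
    msg[0] = 0x80 | MIDI_CHANNEL;
    msg[1] = note & 0x7F;
    msg[2] = 0;
}

int main(void)
{
    uint8_t on[3], off[3];

    /* A hard strike on middle C (MIDI note 60) with a near-full-scale FSR peak. */
    midi_note_on(60, fsr_to_velocity(980), on);
    /* The piezo on the same key later signals the release. */
    midi_note_off(60, off);

    printf("note-on:  %02X %02X %02X\n", on[0], on[1], on[2]);
    printf("note-off: %02X %02X %02X\n", off[0], off[1], off[2]);
    return 0;
}
```

In the real system the velocity mapping would be tuned per key rather than a single linear scale, but the idea is the same: the analog front end hands the microcontroller a clean peak value, and everything after that is just packing bytes.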
Here's a close-up of our system: