Here is a fun project I would like to share with the community. It is a basic three-oscillator synth plus an input-follower stage: you can plug in a microphone and convert the dynamics and notes of an acoustic instrument into frequency and envelope, which then trigger the synth's oscillators and effects.
This was made possible thanks to the noteFreq object developed by Duff. This object uses the YIN algorithm to estimate the note's frequency (almost) in real time. Dynamics are analysed with the rms object.
Filtering the input signal is important: it improves the frequency analysis, and it also limits feedback when playing on stage.
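For anyone curious what the analysis chain does under the hood, here is a minimal Python sketch of the three stages (high-pass filtering, YIN pitch estimation, RMS level). This is a plain NumPy approximation, not the actual noteFreq or rms objects; the function names, cutoff, and threshold values are my own assumptions.

```python
import numpy as np

def highpass(x, sr, cutoff_hz):
    """First-order RC high-pass: tames rumble and stage feedback
    before the signal reaches the pitch tracker."""
    rc = 1.0 / (2.0 * np.pi * cutoff_hz)
    dt = 1.0 / sr
    alpha = rc / (rc + dt)
    y = np.zeros(len(x))
    y[0] = x[0]
    for i in range(1, len(x)):
        y[i] = alpha * (y[i - 1] + x[i] - x[i - 1])
    return y

def yin_pitch(x, sr, fmin=80.0, fmax=1500.0, threshold=0.1):
    """Estimate f0 with the YIN cumulative mean normalized
    difference function over one analysis window."""
    tau_min = int(sr / fmax)
    tau_max = int(sr / fmin)
    n = len(x) - tau_max
    # squared-difference function d(tau)
    d = np.array([np.sum((x[:n] - x[tau:tau + n]) ** 2)
                  for tau in range(tau_max + 1)])
    # cumulative mean normalized difference d'(tau)
    cmnd = np.ones_like(d)
    running = np.cumsum(d[1:])
    cmnd[1:] = d[1:] * np.arange(1, len(d)) / np.maximum(running, 1e-12)
    for tau in range(tau_min, tau_max):
        if cmnd[tau] < threshold:
            # walk down to the local minimum of the dip
            while tau + 1 < tau_max and cmnd[tau + 1] < cmnd[tau]:
                tau += 1
            return sr / tau
    return None  # no clear pitch in this window

def rms(x):
    """Signal level, used as the envelope/dynamics control."""
    return float(np.sqrt(np.mean(np.asarray(x, dtype=float) ** 2)))
```

One window of microphone input would then give `f0 = yin_pitch(highpass(window, sr, 100.0), sr)` for the oscillator frequency and `rms(window)` for the envelope.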

The main limitation is latency. It is possible to reduce latency with high-pitched instruments (thank you, Duff, for your guidance). I doubt we can get good results with low-pitched instruments like the cello, double bass, or tenor and baritone sax, but it works pretty well with treble-clef instruments like the flute, the violin, the trumpet...
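For a rough sense of why low instruments are harder: a pitch detector like YIN needs at least a couple of full periods of the waveform in its analysis window before it can report anything, so the latency floor scales with the period of the lowest note. A back-of-envelope helper (the two-period figure is an assumption, not a documented property of noteFreq):

```python
def min_window_ms(f0_hz, periods=2):
    """Rough analysis-window length needed to see `periods`
    full cycles of a note at f0_hz, in milliseconds."""
    return periods * 1000.0 / f0_hz

# Double bass low E (~41 Hz) needs roughly 49 ms of audio per window,
# while a flute note near 1 kHz needs only ~2 ms -- and that is
# before any processing or audio-driver latency is added on top.
```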

I am still trying to reduce latency. If anyone has suggestions, I would be very happy to exchange ideas.
Here are a few pics and a video.

Emmanuel (France)

[Image: front.jpg]

[Image: inside.jpg]