-
Work in progress....Swarming Emotional Pianos
Hi -
Just wanted to share this project I'm completing in the next 6 months or so, powered by...Teensy 3! (My new favorite)
http://youtu.be/q10fDu95IZw
The robot is an iRobot Create, pretty fun to program. A Teensy 3 microcontroller gets wireless messages from an XBee connected to a Max/MSP patch, and in turn controls 3 high-speed servos sourced from http://www.servocity.com (which seemed to be the only place that carried servos fast enough to make music with).
The Teensy 3, with its abundance of hardware serial ports, is a really good microcontroller for talking to the iCreate robot and to the XBee simultaneously.
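This isn't the author's code, but here is a minimal sketch of how that two-serial-port setup might look on a Teensy 3, assuming the XBee sits on Serial1 at its 9600 baud default, the Create's Open Interface is on Serial2 at 57600 baud, and the Max/MSP patch sends one byte per note (0-2 picking a servo). The pin numbers, strike angles, and byte protocol are illustrative assumptions only:

// Illustrative sketch (not the project's actual firmware): Teensy 3 reading
// XBee bytes on one hardware serial port while driving the iRobot Create's
// Open Interface on another, plus three piano-striking servos.
#include <Servo.h>

Servo hammer[3];                     // three high-speed servos (assumed pins)
const int servoPin[3] = {3, 4, 5};

void setup() {
  Serial1.begin(9600);               // XBee (default 9600 baud) <- Max/MSP patch
  Serial2.begin(57600);              // iRobot Create Open Interface, 57600 baud
  for (int i = 0; i < 3; i++) hammer[i].attach(servoPin[i]);

  Serial2.write(128);                // OI opcode 128: Start
  Serial2.write(132);                // OI opcode 132: Full mode
}

void loop() {
  // Assumed protocol: Max/MSP sends one byte per note, 0-2 selects a servo.
  if (Serial1.available()) {
    byte note = Serial1.read();
    if (note < 3) {
      hammer[note].write(30);        // strike
      delay(60);
      hammer[note].write(90);        // return to rest
    }
  }

  // Occasionally send an OI Drive command (opcode 137) so the robot rolls
  // while playing: velocity 200 mm/s, radius 0x8000 = drive straight.
  static unsigned long last = 0;
  if (millis() - last > 1000) {
    byte drive[] = {137, 0x00, 0xC8, 0x80, 0x00};
    Serial2.write(drive, sizeof(drive));
    last = millis();
  }
}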
Even though the video doesn't show very much, the robots are indeed capable of rolling around and making music at the same time. I'll have to post an update video of them making more music while rolling, sometime in the future.
If anyone will be in or around Montreal at the end of May, the show debuts at the Elektra festival then.
Thanks!
Erin
-
This is outstanding work! Intriguing at any level, but particularly interesting to me is the very interdisciplinary nature of the project. Please do report back!
-
Very neat stuff! I wonder to what degree the representational aspects are preserved by the many stages of translation occurring here? You start with the emotion (anxiety) and you look at the neural spike recordings, and then you extract some features of that data and use those to drive the robots; do the features you've extracted still bear a meaningful relation to the emotion? I guess what I'm really wondering is how the robots respond if you're not anxious, and whether someone observing the robots could detect meaningful patterns that relate back to your emotional state.
It's also interesting to think about the psychoacoustic aspect of the performance, and whether it would be possible to extend this work so that the sounds are composed to create an intuitive sense in the listener that corresponds with your internal emotional state. It seems like this sort of setup would be perfect for examining that kind of psychoacoustic property, in that achieving such a resonance would result in a positive feedback loop: you'd feel an emotion, music would be produced, the music would accentuate the same emotion and thereby intensify the music, and so on. Perhaps you could set up the robots with a learning system that trains them to alter their own performance so as to intensify the feedback they get from your system?
In any case, extremely cool work! Hope you'll continue to share here.
-
Nice project! We should organize a Montreal Teensy user meetup.