Greetings everyone,
I am looking at putting together a drone synth using a Teensy 3.5. I am using 6 pots, 2 buttons, 1 rotary encoder and a 16x2 LCD to control the parameters of the various audio objects, and I will eventually be adding preset storage and recall. I have put together the hardware for this and it is working well. I am also comfortable writing the software for this admittedly gargantuan task, and will be putting the code and schematics up on GitHub at https://github.com/mbryne/msynth and sharing progress as I go.
The synth will find itself at home amongst a few other keyboard synths, Ableton Live and the NDLR super arpeggiator. The NDLR plays a drone note on every downbeat, which is what prompted me to put together a drone synth for low, ambient audio textures.
Alongside the above features and basic MIDI input, I am also hoping to put together a rudimentary pattern editor based on something like the Adafruit FifteenStep sequencer. It seems, though, that FifteenStep and most other sequencers are focused on triggering external instruments rather than an internal synth.
I was actually hoping for an initial bit of guidance about reconciling two seemingly at-odds use cases:
----
Use Case 1: Receive MIDI On/Off Notes from external USB MIDI
This seems relatively straightforward; there are lots and lots of examples. This will let the drone synth sit alongside Ableton Live and the NDLR with grace and ease.
Use Case 2: Receive MIDI Clock from external USB MIDI and internal Sequencer
This seems more involved. I feel like receiving the external MIDI clock and hooking it into the internal sequencer should be fine; the sequencer could then trigger notes using the code from Use Case 1, etc.
----
Assuming I have Use Case 1 up and running, I am curious about how Use Case 2 would best be handled:
- Would the sequencer send MIDI notes from Use Case 2 to itself, using the existing MIDI triggers from Use Case 1, on its own channel?
- Would I disable the sequencer when not in use?
- Would I even need to take in a MIDI clock in Use Case 1?
- Would I always have the sequencer running, receive MIDI notes and trigger Use Case 1? (this seems inefficient)
I'm sure these questions are all variations on the same chicken-and-egg situation going on in my head. Can anyone shed some light on the best direction to head before I start putting together the code for either approach?
Any input is greatly appreciated and if I can provide further clarification on anything please let me know.
Many Thanks,
Michael