TeensyMix Synth - DIY 8 Voice Synthesizer

phjanderson

Member
Hi guys,

I finally managed to finish and release my DIY synthesizer project, hope it is of use or inspiration to anyone!

TeensyMix Synth is an 8-voice polyphonic synthesizer based on the Teensy 4.1 development board. It is designed to be an easy build, with just an LCD display and a digital-to-analog converter. An Akai MIDImix is used to control the synthesizer. The MIDImix is a compact and affordable MIDI-based mixing controller with 24 dials, 9 sliders and 19 buttons with LEDs.

All sound parameters of the synthesizer can be controlled directly using these dials, sliders and buttons. No complicated menus and pages of parameters to scroll through!

See and listen to what it looks and sounds like!

Source code and schematics are available on GitHub.

Features:
  • 8 voices with sawtooth, square, sine, triangle, sample & hold noise, and several sampled waveforms
  • 2 oscillators per voice, one with waveform shape modulation, one with frequency modulation (FM)
  • Oscillator 1 supports unison detune with a total of 7 stereo panned oscillators with adjustable detune and mix levels
  • 2 × 12 dB filters, individually controllable, one with adjustable resonance, allowing various combinations of low-pass and high-pass
  • 2 envelope generators, one for amplitude, one for modulating the filter, shape, etc
  • 1 LFO for modulating the frequency, filter, shape, etc
  • Per voice wavefolder for manipulating the waveform
  • Per voice waveshaper for distortion
  • Stereo ensemble chorus
  • Stereo audio using a 24-bit digital-to-analog converter
  • MIDI over USB
  • Optional: (5 pin DIN) serial MIDI
Special thanks go out to the guys at PJRC for making the Teensy 4.1 development board and Audio Library. It is one of the most powerful development boards on the market. I hope they release an even faster board in the future so I can add more voices and features ;)
 
Nice project, I have a question, why are the waveform tables sized [257] while the audio library oscillators expect [256]?
 
Nice project, I have a question, why are the waveform tables sized [257] while the audio library oscillators expect [256]?
I took these waveforms from AKWF-WaveForms. His README.md mentions:
All samples are resampled From D2+2 (which equals 600 samples) to 256+1 Samples for use on the Teensy 3.1. These audio waveforms have a period of 256 points, plus a 257th point that is a duplicate of the first point. This duplicate is needed because the waveform generator uses linear interpolation between each point and the next point in the waveform.

But indeed the documentation of the Teensy Audio Library mentions:
Configure the waveform to be used with WAVEFORM_ARBITRARY. Array must be an array of 256 samples.

Not sure if the guy behind AKWF-WaveForms used these with something other than the Teensy Audio Library that had this requirement. A quick glance makes me believe that the (current) Teensy Audio Library does not need this duplicate 257th point, but it won't cause any problem either. In that case I wasted 2 bytes per waveform ;)

There are many more waveforms in that GitHub repo, I just picked a few that seemed interesting.
 
I am myself currently developing a very similar project, which uses an Akai MPK Mini Plus with eight encoders as a controller, an SGTL5000 audio board and an ILI9488 screen.
 
I am myself currently developing a very similar project, which uses an Akai MPK Mini Plus with eight encoders as a controller, an SGTL5000 audio board and an ILI9488 screen.
I think those kinds of controllers make an interesting combination with Teensies and such. They save a lot of work compared to building your own controls. Particularly good for lazy people like me ;)

The nice thing about the MIDImix is that you get quite a lot of controls in a reasonably small panel. I reused the buttons for controlling things like patch loading and saving, which made the synth build itself completely button free.

During development I used a second-hand Nektar SE25, connected to the Teensy USB host port through a hub together with the MIDImix, as I didn't have a real keyboard handy near that computer at the time. Connecting multiple controller devices could open even more opportunities, but it does seem to be somewhat tricky to determine which USB device is which controller. I couldn't figure out a way to do that and I saw someone else post a question about this as well. It's currently not really an issue for the TeensyMix Synth, as using a keyboard connected to the USB host connector is not an intended use case anyway. The USB connection to a PC (with a DAW) or the 5-pin DIN MIDI should be used for this purpose instead.
 
I am myself currently developing a very similar project, which uses an Akai MPK Mini Plus with eight encoders as a controller, an SGTL5000 audio board and an ILI9488 screen.

To query many encoders, I use a small I2C interface for each encoder. Up to 256 encoders can be queried. It saves valuable computing time on the Teensy and is easy to query.

My video

Link: https://github.com/wagiminator/ATtiny412-I2C-Rotary-Encoder
 
I think those kinds of controllers make an interesting combination with Teensies and such. They save a lot of work compared to building your own controls. Particularly good for lazy people like me ;)

The nice thing about the MIDImix is that you get quite a lot of controls in a reasonably small panel. I reused the buttons for controlling things like patch loading and saving, which made the synth build itself completely button free.

During development I used a second-hand Nektar SE25, connected to the Teensy USB host port through a hub together with the MIDImix, as I didn't have a real keyboard handy near that computer at the time. Connecting multiple controller devices could open even more opportunities, but it does seem to be somewhat tricky to determine which USB device is which controller. I couldn't figure out a way to do that and I saw someone else post a question about this as well. It's currently not really an issue for the TeensyMix Synth, as using a keyboard connected to the USB host connector is not an intended use case anyway. The USB connection to a PC (with a DAW) or the 5-pin DIN MIDI should be used for this purpose instead.

For info, there are primitives:
midihost.idVendor()
midihost.idProduct()

with for example:
idVendor Akai = 0x9E8
idProduct Midi Mix = 0x31
idProduct MPK Mini Plus = 0x54

idVendor BEHRINGER = 0x1397
idProduct X TOUCH MINI = 0xB3

idVendor fighter_twister = 0x2580
idProduct fighter_twister = 0x7

It works well with this type of code:

void USB_HOST_DETECT()
// Display information about the connected devices
// (assumes midihost is an array of MIDIDevice objects on the USB host port)
{
  for (byte i = 0; i < 4; i++)
  {
    if (midihost[i].idVendor() != 0) {
      uint16_t vendorID = midihost[i].idVendor();   // manufacturer ID
      uint16_t productID = midihost[i].idProduct(); // product ID
      const char* productName = midihost[i].product();
      const char* manufacturerName = midihost[i].manufacturer();

      // Assign a USB host port index to each known device,
      // so it can be managed regardless of its place on the hub
      if (vendorID == 0x9E8 && productID == 0x54) mpk_mini_plus = i;
      if (vendorID == 0x1397 && productID == 0xB3) x_touch_mini = i;
      if (vendorID == 0x9E8 && productID == 0x31) midi_mix = i;
      if (vendorID == 0x2580 && productID == 0x7) fighter_twister = i;
    }
  }
}
 
Nice project :) The next step for your synth would be manual operation without a midi controller.

My Teensy Synth "Jeannie"
Video 1:
Video 2:

Teensy project: https://forum.pjrc.com/index.php?threads/new-teensy-4-1-diy-synthesizer.63255/
Thanks!

Your projects are truly a marvel! I looked at several of them in the past. Thanks for sharing them!

The main goal of building it with the MIDImix is to have many controls and have every aspect of the voice directly accessible without any menus. I originally wrote code for a much more complex synth, but decided to remove features until it fit on the number of dials and buttons I had on the MIDImix.

I used to have a Juno-60 and used several other synths from that era. I remember that the ones I liked most were those with dials and buttons for every control, without menus. I also wanted it to be an easy build, as not everyone has the time or expertise to make something with a lot of custom hardware.

I have to admit though that a full custom build does look nicer and you can make a button / dial layout that is more intuitive.

To query many encoders, I use a small I2C interface for each encoder. Up to 256 encoders can be queried. It saves valuable computing time on the Teensy and is easy to query.

My video

Link: https://github.com/wagiminator/ATtiny412-I2C-Rotary-Encoder

Actually, I started my experiments with I2C using those encoder interfaces.
I initially had some difficulties due to bugs in the Teensy Wire.h implementation, but that was solved eventually:
https://forum.pjrc.com/index.php?threads/teensy-4-1-i2c-issues-when-using-audio.72793/

I found that scanning the I2C controllers cost way too much processor time, though. I did some time measurements with these controllers and concluded that extending this to, say, 40-60 controllers would eat a significant chunk of the CPU time that I really needed for the actual sound generation. I considered using a separate controller (an Arduino perhaps?) to offload the I2C scanning and turn it into a simple "event on change" for the main Teensy. But then I ran into the MIDImix and decided to continue down that route instead.

It wouldn't be difficult for others to adapt my code to use other types of controls however. Everything is a "parameter" in the code and parameters are linked to Control Changes on the "external MIDI" and Control Changes or Notes (for up/down increment buttons) on the "controller MIDI" side.

Anyway, I hope my little project can inspire others, just like your marvelous project inspired me and many others!

Greetings from the Netherlands!
 
For info, there are primitives:
midihost.idVendor()
midihost.idProduct()

Indeed! I see they inherit these from the USBDriver parent class.

I guess what I was hoping to find at the time was a way to influence which USB device is assigned to which variable:
C++:
static MIDIDevice_BigBuffer usbHostMidi1(usbHost);
static MIDIDevice_BigBuffer usbHostMidi2(usbHost);
Perhaps by being able to specify the vendor and product IDs of the device that you wish to bind. Currently it just assigns devices more or less at random.

But it would be easy to just assign all USB MIDI devices to several variables and then check each of those for their vendor/product IDs and assign them to new (lists of) variables based on the specific device types.

Thanks for the tip!

Perhaps I will adjust the code later on to only have the MIDImix be handled as "controller MIDI" and handle other USB MIDI devices as "external MIDI".
 
I thought this looked cool and easy enough to build for my kid.
I built it and followed all the instructions, making sure to do the necessary audio library modifications as well.

Everything seems to work

EXCEPT there is no sound?

I did the DAC guitar strum test sketch, it works. The DAC is working and producing audio.

The LCD is working and responding to everything I do on the MIDImix. The program and hardware all seem to work, but I don’t know how you’re supposed to get it to produce any sound whatsoever. I’ve done patch selection, patch save, etc.

Is there a “play” mode I’m not aware of? Is there something I’ve missed? Please help!
 
Just had a quick look at your repo, and I'm slightly curious about one thing. You say you need to make the audio blocks just 16 samples to reduce latency, which involves a bit of editing in cores. No major problem with that, but it's a reasonably unusual requirement for a synth engine. On looking closer, I can see you've apparently defined objects in Synth.h in nearly the worst possible order, from last in the signal chain through to first. The update() sequence is set at construction time, with each new object linking itself in at the end of the list. For each "reversed link" in this chain you will add one block time of latency - I think there's at least 3 layers of reversed links, there could be more.

See the advice on this page: "Audio objects should be created in the order data is processed..." etc. Note that the order of creation of AudioConnection objects is irrelevant.

You might want to test what you already have, using standard 128-sample blocks, then do a bit of re-ordering and see if you can get a latency reduction big enough to remove the need for changing the block size. You will also reduce the number of memory blocks needed, as every reversed link that's transmitting audio will hold on to an allocated audio block from one update to the next.
 
I thought this looked cool and easy enough to build for my kid.
I built it and followed all the instructions, making sure to do the necessary audio library modifications as well.

Everything seems to work

EXCEPT there is no sound?

I did the DAC guitar strum test sketch, it works. The DAC is working and producing audio.

The LCD is working and responding to everything I do on the MIDImix. The program and hardware all seem to work, but I don’t know how you’re supposed to get it to produce any sound whatsoever. I’ve done patch selection, patch save, etc.

Is there a “play” mode I’m not aware of? Is there something I’ve missed? Please help!
What are you using to play notes? It responds to notes coming in from the hardware MIDI input (from a MIDI keyboard for example) or the USB MIDI connection to a PC (running a DAW for example). I should add links to the example code to test these as well. Will do that later as I do not have access to a computer right now.

It also responds to notes coming in from a USB MIDI keyboard connected to the USB host connector (used for the MIDI mix), though that has some limitations at this moment.
 
Just had a quick look at your repo, and I'm slightly curious about one thing. You say you need to make the audio blocks just 16 samples to reduce latency, which involves a bit of editing in cores. No major problem with that, but it's a reasonably unusual requirement for a synth engine. On looking closer, I can see you've apparently defined objects in Synth.h in nearly the worst possible order, from last in the signal chain through to first. The update() sequence is set at construction time, with each new object linking itself in at the end of the list. For each "reversed link" in this chain you will add one block time of latency - I think there's at least 3 layers of reversed links, there could be more.

See the advice on this page: "Audio objects should be created in the order data is processed..." etc. Note that the order of creation of AudioConnection objects is irrelevant.

You might want to test what you already have, using standard 128-sample blocks, then do a bit of re-ordering and see if you can get a latency reduction big enough to remove the need for changing the block size. You will also reduce the number of memory blocks needed, as every reversed link that's transmitting audio will hold on to an allocated audio block from one update to the next.
Thanks for the tips! Will look into it later on. No access to a computer and the teensy at this moment.
 
What are you using to play notes? It responds to notes coming in from the hardware MIDI input (from a MIDI keyboard for example) or the USB MIDI connection to a PC (running a DAW for example). I should add links to the example code to test these as well. Will do that later as I do not have access to a computer right now.

It also responds to notes coming in from a USB MIDI keyboard connected to the USB host connector (used for the MIDI mix), though that has some limitations at this moment.
Thanks for the response! I was under the impression that this somehow played notes through button presses on the MIDImix. I do have MIDI keyboards and other devices, but they are all USB.

I haven’t done this type of stuff for quite some time, and while I do have some experience, it’s mostly limited to following instructions like yours and assembling something with specific directions telling me what to do 😅

So, to send notes to the synth... it sounds like my options are either to add a MIDI interface (can it be USB, not 5-pin?) to the Teensy circuit, or to add a USB hub at the host connector? I’m going to try the USB hub; I’ll report back.

Again, thank you for your reply, and more importantly, thank you for this project!
 
What are you using to play notes? It responds to notes coming in from the hardware MIDI input (from a MIDI keyboard for example) or the USB MIDI connection to a PC (running a DAW for example). I should add links to the example code to test these as well. Will do that later as I do not have access to a computer right now.

It also responds to notes coming in from a USB MIDI keyboard connected to the USB host connector (used for the MIDI mix), though that has some limitations at this moment.
Okay, this worked. I had to improvise a bunch, and I’ll need to get some additional equipment to tidy it all up, but now we’ve got sound and it’s working as intended! Thank you!
 