This may be a fundamental or "it depends" type question, but I haven't found any guidelines or benchmarking for the topic.
I am thinking about using a graphic display in my next project, something like a 128x32 graphic OLED display.
Basic use of the display is no problem, but I wonder how much overhead driving it adds when a single MCU (Teensy 3.6) also handles the main code execution.
Do most designs with displays use two MCUs? One for main functionality (timing critical, etc), and one just for the display?
Which usage patterns have the most overhead: animations, refresh rate, resolution?
As an example, say I want to sample an incoming analog signal and display an oscilloscope-style waveform on the OLED in real time...
I'm just looking for tips or places to begin doing some research on the topic.
JK