I'm trying my hand at a spectrum analyzer. I started with the SpectrumAnalyzer sample code (https://github.com/PaulStoffregen/Oc...ectrumAnalyzer), so a lot of the original code is still in it.

Currently I'm working on getting the LEDs to "slide" to each new height rather than simply flashing.
MY CODE: https://github.com/moomerator/Audio-...zer_Slide2.ino
MY CIRCUIT: the pins specified for the OctoWS2811 go to the corresponding LED strips, with 100 Ω resistors in series on each pin/signal line. A Teensy 3.2 controls the LEDs, which are powered from a 5 V 20 A power supply.
I can draw up a circuit diagram if that would help, but I figure it's basic enough that it's probably unnecessary in this case.

I'm running into two dilemmas:
1) With both the original code (the only modification being USB input) and my current code, I occasionally get 'off' pixels between the 'base' and 'peak' pixels in random columns. Which pixels stay dark seems entirely random: for example, sometimes pixels 1, 2, 4, 5, and 6 light with 3 dark, and sometimes 1, 2, 3, and 6 light with 4 and 5 dark in the same row. It happens sporadically across all of the rows. The analyzer still looks cool, but it happens frequently enough to be distracting.

2) The sliding effect I'd like requires some sort of delay between steps (similar to how BasicTest [https://github.com/PaulStoffregen/Oc...les/BasicTest] slides), but I was hoping to use millis() to keep everything running smoothly, and I'm not sure how, or whether, I can incorporate it in this case. I realize this may be a more general coding question, so sorry if it's simple, but any help would be very appreciated.
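To show the kind of pattern I mean, here's a minimal non-blocking sketch of the slide step. Everything in it is hypothetical and not from the SpectrumAnalyzer example: NUM_COLUMNS, currentHeight[], targetHeight[], and the interval value are all placeholder names/numbers, and the time is passed in as a parameter so you'd call it from loop() as updateSlide(millis()).

```cpp
#include <cstdint>

// Hypothetical names, not from the original example code.
const int NUM_COLUMNS = 16;
int currentHeight[NUM_COLUMNS] = {0}; // height each column currently shows
int targetHeight[NUM_COLUMNS]  = {0}; // latest height computed from the FFT
uint32_t lastSlide = 0;               // time of the last one-pixel step
const uint32_t SLIDE_INTERVAL = 30;   // ms per step; tune for slide speed

// Call every pass through loop() as updateSlide(millis()).
// Unlike delay(), this only acts when enough time has elapsed,
// so audio processing keeps running between steps.
void updateSlide(uint32_t now) {
    if (now - lastSlide < SLIDE_INTERVAL) return; // not time for a step yet
    lastSlide = now;
    for (int i = 0; i < NUM_COLUMNS; i++) {
        if (currentHeight[i] < targetHeight[i]) currentHeight[i]++;      // rise one pixel
        else if (currentHeight[i] > targetHeight[i]) currentHeight[i]--; // fall one pixel
    }
    // ...then redraw each column up to currentHeight[i] and call show()
}
```

The FFT update would keep writing into targetHeight[] as fast as it likes, while the displayed currentHeight[] creeps toward it one pixel per interval.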

Most of my coding background is data analysis in Python/Matlab, so sorry in advance if my questions come across as ignorant or if my code is particularly ugly.