Loop timing problem


SteveSFX

Well-known member
Hi

Still working on my DMX project, but I thought a fresh thread for this question would be appropriate.

My main loop runs, samples sensors, and reads and writes DMX commands. All going fine.

However, I would now like to add the facility to ramp my DMX values up or down over a set period of time. Say for example 0 to 200 over 10 seconds.

My main loop, however, can vary in length depending on which sensors are being read and whether any buttons are being pressed.

I am currently issuing my updated DMX value every 100 ms using a millis()-based timer, and I work out the value to send by simply dividing the target into the time required to get there (does that make sense?).
It worked fine until too much stuff got put into the main loop and it started missing the 100 ms spacing.
I don't want to make the spacing too large, as the changes need to appear as smooth as possible.
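
Roughly what I am doing at the moment (simplified; the names are just examples):

Code:
// Simplified version of my current approach: every 100 ms, add a fixed
// step so the value reaches the target after rampTimeMs. If an update is
// late, the ramp simply falls behind.
const unsigned long updateInterval = 100;    // ms between DMX updates
const unsigned long rampTimeMs     = 10000;  // total ramp time (10 s)
const int targetValue              = 200;    // DMX value to reach

unsigned long lastUpdate = 0;
float dmxValue = 0;

void setup() {
  // sensor / DMX initialisation here
}

void loop() {
  // ... sensor reads, button handling, DMX read/write ...

  if (millis() - lastUpdate >= updateInterval) {
    lastUpdate = millis();
    dmxValue += (float)targetValue / (rampTimeMs / updateInterval);  // fixed step per update
    if (dmxValue > targetValue) dmxValue = targetValue;
    // write (uint8_t)dmxValue to the DMX channel here
  }
}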

I would like (if possible) to sample the current speed of the main loop in, say, micros(), and adjust my math so that I approximately hit my target value over the correct time span.

Do you think this approach is feasible?
 
One possibility: define a variable [ start_millis = millis() ] to store the millis() value at the start of your ramp period. After each 100 ms update, calculate how many millis have actually passed [ millis() - start_millis ] and use that actual elapsed time to calculate your DMX value.

For example, if your elapsed-millis calculation shows that 102 ms actually transpired during your first 100 ms interval, then make your adjustment using 102 divided by the total ramp time. If the next update shows 201 ms have transpired, then use 201 divided by the total ramp time for your DMX value calculation. Do this after each 100 ms update until the full ramp period has elapsed.
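
As a rough (untested) sketch of the idea, with the names just as examples:

Code:
// Derive the DMX value from how much time has actually elapsed since the
// ramp started, so a late update doesn't make the ramp fall behind.
const unsigned long rampTimeMs = 10000;   // total ramp time (10 s)
const int targetValue          = 200;     // DMX value to reach at the end

unsigned long startMillis = 0;
bool ramping = false;

void startRamp() {
  startMillis = millis();   // remember when the ramp began
  ramping = true;
}

void updateRamp() {
  if (!ramping) return;
  unsigned long elapsed = millis() - startMillis;   // e.g. 102, 201, ...
  if (elapsed >= rampTimeMs) {
    elapsed = rampTimeMs;   // clamp once the full ramp period has passed
    ramping = false;
  }
  // value = target * (fraction of the ramp time that has actually elapsed)
  uint8_t dmxValue = (uint8_t)(((unsigned long)targetValue * elapsed) / rampTimeMs);
  // write dmxValue to the DMX channel here
}

Call updateRamp() from your main loop (on your 100 ms timer, or even on every pass); because the value is derived from the actual elapsed time, it doesn't matter if the spacing between updates wobbles.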

Hope that helps . . .

Mark J Culross
KD5RXT
 