Dynamic Clock Speed of Teensy 4.0

Status
Not open for further replies.
Yes, it prints a message, and switches off after that :)

Thought I saw that noted ... so writing to RAM2 first would allow it to show on a warm restart - but a full power-down loses RAM2? Maybe 'abuse' the RTC RAM bytes, in case those are preserved?
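A minimal sketch of that "abuse the RTC RAM bytes" idea. On the i.MX RT1062 the SNVS low-power general-purpose register is a candidate spot, but the register name and address below are assumptions - check imxrt.h and the reference manual, and whether it survives a full power-off depends on VBAT. A host-side stub stands in for the register so the logic can run anywhere:

```cpp
#include <cstdint>

#ifndef ARDUINO
static uint32_t fake_lpgpr = 0;        // host stand-in for the battery-backed register
#define SNVS_LPGPR_REG fake_lpgpr
#else
// ASSUMPTION: SNVS LPGPR at SNVS base 0x400D4000 + 0x68 - verify against imxrt.h
#define SNVS_LPGPR_REG (*(volatile uint32_t *)0x400D4068)
#endif

constexpr uint32_t MAGIC = 0xC0FFEE00;  // tag so stale garbage isn't trusted

void stash_byte(uint8_t v) { SNVS_LPGPR_REG = MAGIC | v; }

// Returns false if the register was never written (or was not preserved).
bool recall_byte(uint8_t *v) {
  uint32_t r = SNVS_LPGPR_REG;
  if ((r & 0xFFFFFF00u) != MAGIC) return false;
  *v = (uint8_t)(r & 0xFFu);
  return true;
}
```

After a restart, a successful recall_byte() would tell you the stash survived; a failed one tells you that power state wiped it.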
 
I could check with the scope, but does anyone know how long set_arm_clock() takes to execute? I see that it includes wait loops. It also changes the voltage, which might have side effects.

Right now I'm running with switching between 240 MHz and 600 MHz - this works OK and reduces the temperature by about 4 °C.
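For reference, the switching logic can be as simple as a hysteresis band so the clock doesn't flap. On the Teensy the actual hooks would be tempmonGetTemp() and set_arm_clock() from the core; the thresholds here are assumptions to tune for your board:

```cpp
#include <cstdint>

constexpr float HOT_C  = 70.0f;  // throttle down at or above this die temp
constexpr float COOL_C = 60.0f;  // speed back up at or below this die temp

// Returns the clock to run at next, given the current clock and die temp.
uint32_t throttle_step(uint32_t hz, float temp_c) {
  if (hz == 600000000u && temp_c >= HOT_C) return 240000000u;
  if (hz == 240000000u && temp_c <= COOL_C) return 600000000u;
  return hz;  // inside the hysteresis band: leave the clock alone
}
```

On the device this might be driven from loop() as set_arm_clock(throttle_step(F_CPU_ACTUAL, tempmonGetTemp())), called at a low rate since the switch itself is not free.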
 
It takes some measurable time to make the change in clock speed. I checked it once some time back, but forget the numbers. It has embedded while( wait for CPU to make change ); loops that are not finite and perhaps depend on the nature of the change to voltage or clocks, as the CPU keeps things running across the changes.

Adding a pin toggle or timing monitor to FranksB's posted code would show it on a scope or SerMon. CYCCNT could be used, but that changes across the transition - and micros() is no better there, since it uses the same CYCCNT against F_CPU_ACTUAL - so timing a toggled pin with the scope would be best.
 
Looks like it doesn't disable interrupts while waiting - seems like a potential source of problems.
 
If I remember our testing with temp and tempMon() correctly, it would hang before the PANIC temp was reached, at least when we tested with the display attached. It was as if some other part of the chip failed before the PANIC temp. I forget where we did that testing.
 
Disabling interrupts around the call made it much worse. My working assumption: changing speeds simply takes too long to be useful in my use case.
 
Quite possible.

I saw now that the ADC uses a timer that is connected to the Core-Clock:
Code:
const int comp1 = ((float)F_BUS_ACTUAL) / (AUDIO_SAMPLE_RATE_EXACT * 4.0f) / 2.0f + 0.5f;
So, this will result in problems when changing the clock.
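To see the size of the problem, the divider can be evaluated at two bus clocks. The formula is the one above, parameterized for illustration; the bus frequencies are assumptions (150 MHz bus at 600 MHz core, 120 MHz bus at 240 MHz core), but they show that a comp1 computed at one clock is badly wrong after switching:

```cpp
constexpr float AUDIO_SAMPLE_RATE_EXACT = 44100.0f;  // Teensy Audio default

// Same expression as the Audio library line above, with F_BUS_ACTUAL
// passed in so both clock settings can be compared.
int comp1_for_bus(float f_bus_actual) {
  return (int)(f_bus_actual / (AUDIO_SAMPLE_RATE_EXACT * 4.0f) / 2.0f + 0.5f);
}
```

At the assumed 150 MHz bus this gives 425; at 120 MHz it gives 340 - so after a clock change the ADC timer would run at the wrong sample rate unless the divider is recomputed.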

It's possible that SPDIF3 uses the IPG clock, which is connected to the core clock too. I did not look at the source code just now.
Can you just try another SPDIF? The emulation uses I2S (which in turn uses the Audio-PLL)
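A sketch of that suggestion, assuming the standard Teensy Audio library class names (AudioOutputSPDIF is the I2S-based emulation, AudioOutputSPDIF3 the hardware peripheral) - the swap is a one-line change in the audio graph:

```cpp
#include <Audio.h>

AudioSynthWaveformSine sine1;     // any source, just for illustration
// AudioOutputSPDIF3   spdifOut;  // hardware SPDIF3 (possibly core-clock dependent)
AudioOutputSPDIF       spdifOut;  // emulated SPDIF over I2S (runs off the audio PLL)
AudioConnection        patch1(sine1, 0, spdifOut, 0);
AudioConnection        patch2(sine1, 0, spdifOut, 1);

void setup() {
  AudioMemory(12);
  sine1.frequency(440);
}

void loop() {}
```

If the emulated output stays clean across clock changes while the hardware one glitches, that would support the IPG-clock theory.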

Good Night,
Frank
 