mborgerson
Well-known member
I have been working on a data logging program that operates a Teensy 3.6 at two different clock rates:
48MHz when connected to USB: The USB Serial port is used for the PC host interface and allows me to upload collected data at about 1MByte/second, which is sufficient for test files of a few MBytes. At this clock speed the logging program uses about 16mA when logging data and about 14mA during other operations.
8MHz when disconnected from USB: When USB is disconnected, the host user interface is redirected to UART0, which runs at 57600 Baud. When logging 4 channels at 1.0KHz, the program draws about 5mA from the power supply.
The program is compiled for a system running at 48MHz. It can be switched to 8MHz by command, or automatically by monitoring the USB 5V power (through a voltage divider). The switch requires changing some system clock registers and adjusting the UART0 baud rate registers. Fortunately, SdFat 2.0b does not seem to object to the clock change, but it does take longer to write 32KB buffers to the SD card.
The major issue that crops up after the clock change is that interval timers and other timing functions no longer produce the proper delays, because their code is compiled to use defined constants like F_BUS and F_CPU. As a result, an interval timer set up for 1.0KHz collection at 48MHz will interrupt at 166.666Hz when the clock and bus run at 8MHz.
I see two possible ways to overcome this problem with IntervalTimers:
1. Move the IntervalTimer library code to my local libraries folder so that my modified code will be loaded instead of the code distributed with Teensyduino. The modified code will add a member function like IntervalTimer::setClock(uint32_t clockspeed). That function will set an internal clock-speed variable that will be used in the calculation of the cycles value that goes to the hardware timer. The variable will be initialized to F_BUS when the interval timer is instantiated, so no changes are required when the T3.6 stays at the speed specified by F_CPU or F_BUS.
2. Change all my calls to interval timers into calls to a shell function that adjusts the number of microseconds passed to the IntervalTimer begin() function. This method means I don't have to mess with the Teensyduino-defined interval timer code. However, it also means that any library functions that depend on interval timers are going to run 6 times more slowly than desired.
The ideal solution, from the point of view of a designer of low-power loggers, would be for the Teensy libraries to become less dependent on the compile-time definitions of F_CPU and F_BUS. I realize that this would require either setClock() member functions for many objects or the rewriting of many functions to use a system variable initialized to the value of F_BUS or F_CPU. Setting a new clock speed in your program without reinitializing clock-speed-dependent objects could lead to unexpected behavior (to say the least!). Given the possible problems and the likely small number of users who really need this change, I don't expect it to become a high-priority issue for Paul.
I will put together and post some code illustrating my solutions to the dynamic clocking issues sometime next week. I will need my oscilloscope to verify timing-related issues and I'm away from home this week.
You may ask, "Why are you so worried about a 10mA reduction in operating current?" If you are building a data logger to collect oceanographic data for one year while on a mooring in the equatorial Pacific, that reduction in current could increase your logger duration by many months, or you could reduce the number of lithium primary cells--and either reduce the pressure case volume or switch batteries to the analog supply that powers the sensors. (The cost savings resulting from using fewer batteries is not a major issue. The cruises that deploy and retrieve the loggers cost hundreds of thousands of dollars. Thankfully, those costs are shared among many projects and the cost to deploy a logger may only be a few thousand dollars.)