Teensy 3.1 High Speed (>100kS/s AO+AI) Data Acquisition System over USB?


aspence

Hullo all!

I run a bioengineering research group in animal locomotion and we're having loads of fun building kit with the Teensy 3.1 -- mostly using it to drive high speed video camera frame grabs and synchronize other digital triggers. We're at www.spencelab.com

But I'm increasingly frustrated with having to use LabVIEW on Windows for data acquisition: I would love a data acquisition solution I can run under Ubuntu. And cheap is always good. No GUI, or a simple Tk interface with Matplotlib graphics (e.g. written in Python), would be great.

I'm wondering, could you make a data acquisition system with the Teensy 3.1 that does, at minimum, one analog input at at least 100 kS/s and one analog output at at least 100 kS/s, over USB? It would be great if it could be done on the Teensy alone, but an external ADC and DAC are certainly an option if needed, as would be FIFO memory. At some point, though, if you are creeping up towards $300 or $500, it could be time to just get a comedilib-compatible NI PCI data acquisition board, or another vendor's Linux-compatible board.

But the fast USB and power of the teensy made me think it might not be crazy to think about using one to do high speed DAQ.

Thanks for your time and input!
Andrew Spence
 
Details matter, but it's probably feasible.

Your idea is likely possible with a bit of planning and a willingness to use the chip's resources efficiently. I am currently using a T3.1 to drive a 100 kHz signal for a semi-fancy application-specific test signal generator. The whole thing has been done without difficulty and with essentially no code optimization. I haven't tried timed ADC but hope to soon, as the product we're modeling will depend on both capabilities. The K20 manual implies that the ADC and DAC can be triggered by the same timer, preserving the phase relationship between input and output.

My project uses the DAC's 16-entry output buffer which is reloaded from a memory buffer by a simple interrupt service routine. The DAC is triggered by the PDB running at 100 kHz. The primary focus of my little project is test pattern generation, all of which is computed onboard based on configuration information provided by the supervising host. Nearly all of the sophistication (such as it is :) ) is focused on the details of the test pattern generation.
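
For anyone curious what the timer-driven buffer approach looks like in outline, here is a deliberately simplified sketch. It is not my PDB/hardware-FIFO code; it uses Teensyduino's IntervalTimer and analogWrite() instead, and the buffer name and ramp pattern are just placeholders:

Code:
// Minimal sketch of timer-driven DAC output on a Teensy 3.1.
// (Simplified illustration only -- the real project drives the DAC's
// hardware buffer from the PDB; here an IntervalTimer does the work.)
#define BUF_LEN 256
static uint16_t waveform[BUF_LEN];   // 12-bit samples, filled in setup()
static volatile uint32_t idx = 0;

IntervalTimer dacTimer;

void dacUpdate() {
  analogWrite(A14, waveform[idx]);   // A14 is the DAC pin on the Teensy 3.1
  idx = (idx + 1) % BUF_LEN;
}

void setup() {
  analogWriteResolution(12);          // use the DAC's full 12-bit range
  for (int i = 0; i < BUF_LEN; i++)   // example pattern: a simple ramp
    waveform[i] = (i * 4096) / BUF_LEN;
  dacTimer.begin(dacUpdate, 10);      // 10 us period = 100 kHz update rate
}

void loop() {
  // pattern generation / host communication would go here
}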

The entire project is under 1000 lines of C code, and the DAC driver is under 100 lines. Total development time is probably about 32 hours spread over the last month, incrementally improving the pattern generator as needed. This little system has improved our ability to characterize our industrial data acquisition product with greater precision and resolution than ever before.

I think the T3.1 is an ideal platform for DAQ, likely to do the job well.
 
The T3.1 has a somewhat high noise floor on the ADC. I don't doubt you could get 100 kS/s out of a Teensy 3.1, though, if the noise isn't a huge concern.

If you're considering inexpensive non-conventional DAQ solutions, you might want to take a look at the Saleae Logic series as well (the Logic 4 is $99). https://www.saleae.com/?gclid=CJL5iYDxpMECFW4F7AodRBAAuw#PricingTile Saleae's linux support is great, and even the cheapest model is capable of 6 MS/s on 1 analog channel.



Food for thought, anyway.
 
If I recall correctly, the ADC can achieve 100 kS/s.
Some/most MCUs can be put to "sleep" during the ADC cycle (100 µs) to reduce noise.
 
Wow! That was perhaps the fastest and best feedback I've had on the interwebs in some time. Thank you Len and all!

We'll give it a shot, and try to phase-lock the ADC and DAC as you suggest. Great.

And thanks for the Saleae suggestion. Is that what all the screenshots I keep seeing on the forum of digital and analog waveforms are from? Looks better than BitScope, and cheaper!

Somewhere I think I found that the T3.1 ADC is limited to 320 kS/s. We'll see.

Thanks all!
Andrew
 
Hi Aspence,

I prefer using make && a command-line downloader, and implemented almost all of the project as a small set of C-language files.

This is, from the perspective of our organization's management, a side project which is supportive rather than direct software/firmware production, so I'm limited in the amount of time I can spend setting up a proper directory & makefile hierarchy. Paul's libraries and the Arduino infrastructure are so convenient that I cannot justify going independent for such a small project.

That being said, here is what I have done to allow quick and easy startup while preparing for a more classical embedded style of development:

  • Implement a simple outer wrapper in C++ with the classical Arduino setup() and loop() which invokes "extern "C"" implementations of the functional application logic (see the sketch after this list)
  • Implement much of the application logic using a classical state-machine / async style
  • (If more comfortable with C for embedded work) Implement the application in portable C
  • Prefer the parts of the libraries (sniprintf, etc.) that are implemented in C over their C++ counterparts if most of the app is C coded
  • Use the Arduino option to use external editor
  • The Teensyduino work that Paul did is surprisingly convenient for quick start
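
To make the first bullet concrete, a bare-bones version of the wrapper might look like this (the function names are just placeholders; the real logic would live in a separate app.c compiled alongside the sketch):

Code:
// sketch.ino -- thin C++ wrapper around portable C application logic
extern "C" {
  void app_init(void);   // implemented in a separate app.c
  void app_poll(void);   // runs one non-blocking state-machine step
}

void setup() { app_init(); }
void loop()  { app_poll(); }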

The libraries make it really tempting to get an early start, which probably makes sense when deciding whether the chip's capabilities match your project's needs. I have found that the inevitable dilemmas are easier to resolve if one hasn't put too many eggs in a particular basket.

I think some of the other comments in this thread are worth noting too, particularly with respect to the noise floor in the K20 ADC. It is a reasonably good 12-bit A/D if its input is low impedance (mitigating but not eliminating the effects of on-chip noise), which matches the 12-bit resolution of the DAC. Limiting the number of bits in the ADC conversion is a good way to keep it up to speed too.
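
The speed-related knobs mentioned above are exposed by stock Teensyduino calls; a minimal illustration (the exact settings are just examples, not a recommendation):

Code:
// Keeping the K20 ADC fast: fewer bits and less hardware averaging
// trade resolution for conversion speed (values here are illustrative).
void setup() {
  analogReadResolution(12);   // 12 bits to match the DAC; 16-bit mode is slower
  analogReadAveraging(1);     // disable hardware averaging for maximum rate
}

void loop() {
  uint16_t sample = analogRead(A0);  // low-impedance source recommended
  // ... stream or buffer the sample ...
}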

Forgot to mention earlier, which should be an important bullet point: I did implement the application using bare-metal driving of the DAC, PDB, and interrupt service, mostly to allow portability if necessary. The background and IRQ levels are connected together with C structures defining DAC buffer controls, including a few strategically defined callbacks. Perhaps I might find a way to post a few snippets of some of the less ugly prototype-quality DAC code. It would be better to look over the likely higher-quality audio library code that's already available.
 
My recollection from past work with the ADC is that 140+ kS/s is entirely possible using standard speeds for the ADC (i.e. the 6 MHz clock). My recommendation, though, would be to go for a dedicated ADC that is specialized for the application, if possible, because the ADC inside the Teensy has been shown to be susceptible to a number of influences that may or may not be hard to overcome.

For example, it is good practice to have a source impedance below about 2 kOhm, a 0.01 µF cap from the analog input to AGND, and some rather large caps around the AREF supply to help stabilize the ADC power supply. Even USB transactions can cause a stock Teensy 3 to experience a supply voltage dip significant enough to influence 3.3 V-based ADC measurements. High quality measurements likely require an external reference shunt to produce a stable AREF input voltage somewhere south of 3 VDC. You may even need an op-amp as a buffer, especially if the signal is bipolar, etc.

So I would certainly experiment and see how well the Teensy does by itself, but I would also pursue a known-good path in parallel with a recognized ADC for which there may already be libraries, since high-speed, high-quality measurements usually require external components for a reason. The Teensy ADC can certainly act as a fantastic proof of concept: if it's sufficient, awesome, it makes your life easier and reduces the implementation cost. However, developing an external ADC option is a good insurance policy. Many ADCs are available on eBay and other sites in breakout form, making it that much easier to experiment.
 
Hi Len and Constantin,

Thanks again for incredibly time saving direction. Much appreciated.

I have a related follow-up: is PC-to-microcontroller clock synchronization basically a solved problem? I've seen example library code that makes clock sync calls on the micro using time data sent over USB from the PC.

After doing this, can you time stamp things or start timers on the micro (e.g. ADC samples) to within, say, a microsecond in the PC's time base?

I ask because we have multiple PCs running cameras and other devices all capturing the same action, with clocks synchronized by PTPd over Ethernet, and we could use a common time base on the micro. I'm guessing you might need code like that developed by Andrew Straw for his camtrig device, part of the motmot tools, to get this level of sync: e.g. sending and receiving a timestamp request and assuming equal transmit/receive lag. But once you have it, you could specify the start time of ADC + DAQ in PC time and the micro sets its timers to match.

Thanks again!
Andrew
 
There are a number of examples out there of Arduino and Teensy programs executing code based on inputs from the user. For example, one of the DS3231 programs I saw in the past allowed the user to program the time using the Serial port (aka USB on the Teensy 3). Your PC could send a single character to initiate the conversion process, though I wonder how consistent USB is in this regard... i.e., can you rely on very good timing in a multithreaded environment?

If I were you, I'd consider setting up your own NTP server that uses a GPS receiver as a reference. Or use a GPS receiver and multiplex its PPS signal as a means of synchronizing all clocks once a second. That is, everyone restarts their internal clocks once they detect the rising or falling SQW trigger.
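
As a rough illustration of that restart-on-PPS idea (the pin number and wiring are assumptions), something along these lines could keep a seconds + microseconds-into-second timestamp on each Teensy:

Code:
// Sketch of PPS-based resync (pin number and wiring are assumptions).
const int PPS_PIN = 2;                 // GPS 1 Hz pulse wired here
volatile uint32_t lastPPSMicros = 0;   // micros() at the most recent pulse
volatile uint32_t ppsCount = 0;        // whole seconds since start

void ppsISR() {
  lastPPSMicros = micros();
  ppsCount++;
}

void setup() {
  pinMode(PPS_PIN, INPUT);
  attachInterrupt(PPS_PIN, ppsISR, RISING);
}

// Timestamp an event as (seconds, microseconds into the current second)
void stampEvent(uint32_t &sec, uint32_t &usec) {
  noInterrupts();
  sec  = ppsCount;
  usec = micros() - lastPPSMicros;
  interrupts();
}

void loop() { /* acquisition code */ }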
 
Saleae's linux support is great

Really? I just looked at their website and it seems they _still_ have the terribly buggy 1.1.15 version as the only non-beta download for Linux.

I bought the Logic16 years ago. I'm sure it works better on Windows, but I've been deeply disappointed by Saleae's Linux support.
 
Microsecond sync between devices would be quite challenging.

Over USB, I don't want to say "impossible", but perhaps "impractical" would be the word to use. Using ordinary data, it's tough to get more accurate than about the 1ms USB frame time. The USB start-of-frame packet is the one thing in USB that does have high accuracy, timing-wise. On the Teensy side, if you hack the USB code, you can get pretty low-latency response to SOF. I have no idea what mechanism is available on Linux to sync timing to the SOF packet.

If you're writing the PC-side code, another simple option (on the Teensy side) that might be worth considering could involve capturing Teensy's time in response to a pulse. Using attachInterrupt() or similar could let you do this very easily. If all your devices can capture their system time with microsecond precision, maybe the PC-side software could work out the timing of each stream "after the fact", perhaps using linear interpolation if devices were found to be sampling out of phase with each other?
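
A rough sketch of that pulse-capture idea, assuming a shared sync line on some digital pin and a made-up "SYNC <micros>" message back to the host:

Code:
// Capture the Teensy's own clock when a shared sync pulse arrives and
// report it to the PC (the pin and message format are made up).
const int SYNC_PIN = 3;
volatile uint32_t pulseMicros = 0;
volatile bool pulseSeen = false;

void syncISR() {
  pulseMicros = micros();   // microsecond timestamp in the Teensy's time base
  pulseSeen = true;
}

void setup() {
  Serial.begin(115200);     // USB serial; the baud rate is ignored on Teensy
  pinMode(SYNC_PIN, INPUT);
  attachInterrupt(SYNC_PIN, syncISR, RISING);
}

void loop() {
  if (pulseSeen) {
    pulseSeen = false;
    Serial.print("SYNC ");
    Serial.println(pulseMicros);   // host software matches this to its own clock
  }
}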
 
I'm with Paul on this one. The project I'm working on will allow us to use a "root controller" (T3) to serve as the center of a star-topology network for a medium-size set of sensor controllers. I'm guessing that's the only really predictable way to achieve our own goal of sub-microsecond time synchronization. We are not planning for the host to be synchronized with the sensor network.

Our application can tolerate thinking of the host as being loosely connected to the sensor network, such that events (or streams of events) that it receives are timestamped from the perspective of the root controller. Similarly, any commands / controls sent by the host that contain trigger times express those timestamps also from the perspective of the root controller.

I've been meditating on how best to achieve reliable, tight synchronization of the elements of the sensor network with the root controller. For now, I'm hoping that a simple "time-sync" broadcast from the root controller containing its time could be used by the networked sensor controllers. From there it's possible to derive a time-reporting protocol that is both accurate enough for our needs and doesn't overload the network with time-computation duties.
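
Just to sketch what I'm imagining (the packet layout, the network driver hook, and all the names here are hypothetical, and transmission delay is ignored):

Code:
// Rough sketch of the "time-sync broadcast" idea.
struct TimeSyncMsg {
  uint8_t  type;        // e.g. 0x01 = time broadcast
  uint32_t rootMicros;  // root controller's clock at transmission
};

volatile int32_t offsetToRoot = 0;  // localMicros + offsetToRoot ~= rootMicros

// Called by the (hypothetical) network driver when a broadcast arrives.
void onTimeSync(const TimeSyncMsg &msg, uint32_t localRxMicros) {
  // Ignores transmission delay; a fixed or measured delay could be added.
  offsetToRoot = (int32_t)(msg.rootMicros - localRxMicros);
}

uint32_t rootTimeNow() {
  return micros() + offsetToRoot;
}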
 
Hey Paul and Len!

Re: sync, have a look at this:
http://code.astraw.com/projects/motmot/camtrig/OVERVIEW.html

I think it is basically what Paul suggests, but with the Teensy time queried over USB from the PC. Only short-latency round trip replies are kept (see the Python code at https://github.com/astraw/motmot-camtrig), and the assumption is made that transmit and receive latencies are identical (roughly the interpolation you suggest).

Once you have a gain and offset model of the clock difference, assuming no very weird behavior of either clock, both sides should be able to trigger and time stamp things in each other's time base quite accurately, I think.
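
For what it's worth, the gain-and-offset fit itself is just ordinary least squares over the kept (short round-trip) samples. Here is a generic host-side sketch, not the motmot-camtrig code, with made-up names:

Code:
// Illustrative host-side fit of device_time ~= gain * host_time + offset.
#include <vector>

struct Sample { double hostTime; double deviceTime; };  // seconds

// Ordinary least squares over the kept (short round-trip) samples.
void fitClockModel(const std::vector<Sample> &s, double &gain, double &offset) {
  double sx = 0, sy = 0, sxx = 0, sxy = 0;
  const double n = (double)s.size();
  for (const Sample &p : s) {
    sx  += p.hostTime;
    sy  += p.deviceTime;
    sxx += p.hostTime * p.hostTime;
    sxy += p.hostTime * p.deviceTime;
  }
  gain   = (n * sxy - sx * sy) / (n * sxx - sx * sx);
  offset = (sy - gain * sx) / n;
}

// deviceTime = gain * hostTime + offset, so a PC-time trigger converts as:
double hostToDevice(double hostTime, double gain, double offset) {
  return gain * hostTime + offset;
}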

The plan is to blink an LED at a known master PC clock time, drive all cameras from the same Teensy's trigger pulse train, and then record with several cameras across a few PCs that are saving data from the cameras being driven by the central Teensy. If the PC clocks are close to sync thanks to PTPd and the cameras are driven by the trigger strobe, then all PCs should stamp the LED as coming on at roughly the unified clock time, barring any dropped frames.

Probably going to work with USB3 cameras from Ximea. They have a 640x480 mono one that does 500 fps streaming! Demo'd it and it works.

Cheers
A
 
Really? I just looked at their website and it seems they _still_ have the terribly buggy 1.1.15 version as the only non-beta download for Linux.

I bought the Logic16 years ago. I'm sure it works better on Windows, but I've been deeply disappointed by Saleae's Linux support.

I guess I don't really count the "beta" status of the linux builds against them... when beta 1.1.18 came out they promoted it over 1.1.15 GA as the preferred version for anyone on linux (or at least anyone on ubuntu 12.04+). I didn't have any problems with 1.1.18. As far as I can tell, they ARE testing the linux builds - they've fixed a few minor distro-related issues and permission issues since 1.1.18. They're just being transparent about the level of QA validation they're willing to put against Linux vs Mac/Win by leaving the "beta" label on there.

Compare this to, say, DipTrace who claim to support linux and then point you at Wine. Or eagle, whose linux support is "it seems to have compiled OK - try it out for us!" (I've had eagle crash xorg in completely unique and exciting ways that no other application ever has).
 