High Speed Interrupting


blittled

Member
I have been entertaining the thought of using a Teensy 3.5 as a replacement for the custom ULA IC found in the ZX81 computer. The ZX81 used the control lines of the Z80 to do the video generation rather than using a dedicated video chip. This requires constant monitoring of these lines.

I feel the best approach is to use an interrupt on each clock pulse, capture the state of the control lines and address bus, and save them to act on in a main processing loop. With a 3.25 MHz Z80 clock my concern is locking up the Teensy by spending most of its time in the interrupt routine. I would probably also need to use assembly code to optimize the routines for time.

Is this feasible or is there another approach that is fast enough to handle it? Thanks.
 
If you overclock the 3.5 to 168 MHz, you'll have ~50 Teensy CPU cycles per ZX81 clock cycle (GREAT computer, by the way.. was my first!).
If you have good programming skills and code the interrupt as tightly as possible, it might work... but it'll be a challenge.
Do you want to replace the display?
Can you tell more about your project? What does the custom ULA IC do, besides video generation?
Perhaps it's easier+faster to emulate the whole thing (including cpu) on the Teensy 3.5 ...

edit: Would be cool with the original keyboard.
 
I haven't given it much thought since I've been dealing with family matters recently. I may go with a Z80 emulator on-chip; that would be easier. I am a programmer with an Electrical Engineering degree and have programmed in Z80 assembly, Microchip PIC, C, and C#, so I'd be up to coding it as a ULA. I'm also a member of a ZX81 forum, and they are very concerned that the 5V CPLDs they use as ULA replacements are being replaced by 3V CPLDs. So I was thinking of using the 3.5 as a replacement. I do have about 12 Z80 chips lying around, a replica keyboard, and 3 ZX81s :)
 
On the topic of interrupt service and CPU performance... The K64 in the Teensy 3.5 is an amazing performer. I've been using FRDM K64F devboards (with 10/100 Ethernet and a little IP stack) to support crazy fast communication: UART0 and UART1 are set for maximum RX FIFO size, and bulk data transfer is managed by the eDMA controller. Data flow with a host uses the Ethernet. It uses the Rx idle interrupt to detect packet boundaries and switch Rx buffers as quickly as possible... Maintaining packet boundaries this way simplifies lots of other comms layering issues. (We have debated whether this approach is optimal, versus just using circular buffering...)

Running both UARTs flat-out, pushing data to the host using UDP/IP, we saw the strangest behavior: on occasion, we would see packet errors. For way too long, I blamed the Ethernet driver for the strange packet behavior. We eventually discovered the actual failure pattern: one packet would be a byte short and the next packet would be a byte long. Way subtle to diagnose and shockingly straightforward to fix: I had to place a two-line loop in the Rx idle interrupt service function to wait for the RX FIFO to be emptied by the DMA controller before switching packet buffers. The K64F entered and began to run the interrupt service function faster than the DMA controller could clear the RX FIFO. We suspect that the Ethernet, with its own DMA, consumes enough memory bandwidth in a bursty pattern to hold off the Kinetis eDMA block.

I've been doing this sort of work for 40+ years and have NEVER seen a CPU outrun a DMA controller!

(This, BTW, is why I wish I had time to set up a proper Ethernet PHY peripheral for the T3.5: the Teensy would be a far better match to our physical form factor requirements than the FRDM boards, but we have other fish to fry...)

There's a good chance the K64F could do the ULA emulation job, though the code will likely need to be very carefully optimized.
 