SPI Library Issue w/ ILI9341 TFT & PN532 NFC Module on Teensy 3.2

Yes! It would be even more useful with the coming Teensy, which has more flash. I found very interesting code that could be used, and it is compatible with gzip.
 
@Defragster: wouldn't that be an interesting challenge for you? :)

My thoughts EXACTLY . . . an interesting challenge for you (FrankB)! I'm not bored anymore :)

I glanced at the code with my preconceived concern in mind: Where is the Buffer RAM coming from?

It could only work on files that fit in a RAM buffer. As written, it seems to open and unpack the decode trees, then jump into the compressed data and return/exit once it has all landed in the buffer.

Is this limited to small files that fit completely in a provided buffer? Streaming would mean the class would have to HOLD the buffer and return it in pieces?

It says it is small - 2K of code - which is a good thing, and it is all "C" code. All the Stream implementations were written for things called "computers with infinite RAM", and they used a system-provided decompress library?

Making it re-entrant for large/streamed files would take a full decode restart on each call, and maybe decoding into a circular buffer that just decodes and counts bytes up to a START position, then stops when the buffer tail meets the head after that?
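
A minimal sketch of that circular-buffer idea, just to make the START-position counting concrete - next_uncompressed_byte() here is a hypothetical stand-in for whatever the unzip code would produce one output byte at a time, not a function from the linked library:

Code:
#include <stdint.h>
#include <stddef.h>

// Hypothetical one-byte-at-a-time source; returns 0 at end of stream.
typedef int (*byte_source_t)(uint8_t *out);

// Decode from the start of the compressed stream, discard everything
// before 'start', then deliver up to 'len' bytes into 'dst'.
// Returns the number of bytes actually delivered.
size_t read_window(byte_source_t next_uncompressed_byte,
                   uint32_t start, uint8_t *dst, size_t len)
{
  uint8_t b;
  uint32_t pos = 0;
  size_t filled = 0;

  while (next_uncompressed_byte(&b)) {
    if (pos >= start) {
      dst[filled++] = b;      // inside the requested window
      if (filled == len)
        break;                // "tail meets head" - buffer is full
    }
    pos++;                    // still just counting toward START
  }
  return filled;
}

Every call restarts the decode from byte zero, which is exactly the cost being discussed here.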

<edit>:: Meant to make this note:
This code would work WELL on the ESP8266 where a ZIP file could be decompressed to a SPIFFS file (temp or permanent) and then read from there as a normal file.
 
That lib suggested 2K in code and also 1K in a RAM buffer (it seemed to be for the decode trees) - the RAM I was worried about was for the whole of the decompressed file. You can't index into a compressed file 'randomly', as you have no idea how many blocks will 'explode' into how much RAM. In theory an open file could track that - but you would still have to re-enter to pick up where you left off, either saving or re-creating the state.

It would be a bit of a messy addition to allow sequential reads with partial returns, rather than just maintaining a simple open-file pointer offset.
 
I was wondering what the files that needed to be 'zip'd were going to be. JPGs only compress a further 3-4%, which is not worth the code and RAM overhead.
 
Well, compressing an already-compressed file again is of course pointless.
JPG and GIF each use a different type of compression, and PNG a third. That means you would have to write three different programs, and they still could not be used for other files.
Just one example: the current SID player plays *.dmg files that are fairly well compressible. I could include a much longer file in the "Demo Sauce".
If one instead writes a standard (de)compression, it only has to be done once - and it can be used for any type of data.
For images, you just "zip" a 565 RAW file, and you're done.
The image used above can be reduced by about 30% ("Deflate").
This can always be done if sequential access is OK, for example for pictures or the above-mentioned *.dmg files. I can imagine many other uses as well.

In addition, all this can be made quite simple if you, for example, only allow one file per container.

The 1 KB (?) of RAM usage is only needed during decompression, on the stack.
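
To make the "565 RAW plus Deflate" idea concrete, here is a rough sketch of what the drawing side could look like - assuming the small tinf inflate library (tinf_init / tinf_uncompress), an image small enough to unpack fully into RAM, and placeholder names for the compressed array and its length:

Code:
#include <Arduino.h>
#include <ILI9341_t3.h>
#include "tinf.h"   // assumed: the small tinf inflate library

#define TFT_CS 10
#define TFT_DC  9
ILI9341_t3 tft = ILI9341_t3(TFT_CS, TFT_DC);

const uint16_t IMG_W = 64, IMG_H = 48;              // placeholder size
extern const uint8_t image_deflated[] PROGMEM;      // raw Deflate stream (placeholder)
extern const unsigned int image_deflated_len;

static uint16_t pixels[IMG_W * IMG_H];              // 6 KB of RAM once unpacked

void setup() {
  tft.begin();
  tinf_init();

  unsigned int outLen = sizeof(pixels);
  int res = tinf_uncompress(pixels, &outLen,
                            image_deflated, image_deflated_len);
  if (res == TINF_OK && outLen == sizeof(pixels)) {
    tft.writeRect(0, 0, IMG_W, IMG_H, pixels);      // one-shot blit of the 565 data
  }
}

void loop() {}

A full 320x240 screen would not fit in Teensy 3.2 RAM after unpacking, so anything bigger than a small asset would need the sequential/re-entrant approach discussed above.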
 
Isn't a GIF just a stacked series of JPG's? So a JPG would just be a GIF special case?

Oh yeah - I also intended to mention that as I scanned the linked unzip code, it seems to be built for a 'single file' - as I read it, the code just jumped over any file-name bytes and other header data to get to the compressed data, and didn't seem to index to a named file.

FrankB - I knew there were 'some' compressible files, as you had noted recently. If my math is right, a full 2-byte-color screen is 150KB (320 x 240 x 2 = 153,600 bytes), so a 30% reduction would net about 45KB of savings?
Assuming again from my code scan - where I saw the note 'pass a buffer to save 1K of stack growth' - a structure for all the local data could be made, allowing the unzip 'in progress' to become 're-entrant': a new 'handle' for the open file that would hold the previously loaded decode trees, the other block-progress indicators, and the new byte offset. I don't know how big the 'blocks' are, or whether the trees really only use 1K of RAM, but if the file could be decompressed BLOCK by BLOCK and the trees are limited in size, it might get you a net gain in the end.
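
Purely as a sketch of that 'handle' idea - nothing here is taken from the actual unzip code, and the field sizes are guesses built around the 1K figure and the Deflate format:

Code:
#include <stdint.h>

struct InflateHandle {
  // Huffman decode trees for the current Deflate block - roughly the
  // 1K the library reportedly wants as a buffer / on the stack.
  uint8_t  treeState[1024];

  // Position in the compressed input (byte index plus bit offset),
  // so the next call can resume mid-block.
  uint32_t inOffset;
  uint8_t  bitPos;

  // Uncompressed bytes produced so far; a caller asking for byte N
  // either resumes from here or restarts from zero.
  uint32_t outCount;

  // Deflate back-references can reach up to 32K behind the current
  // output position, so a truly resumable decoder also has to keep
  // that window around - the real RAM cost on a Teensy 3.2.
  uint8_t  window[32768];
};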
 
No, GIF uses a different compression. And it is lossless, unlike JPG, which is lossy. But GIF supports 256 colors max.
 
@defragster:

One thing worth noting (as it relates to the 30% size-reduction example I posted) is that the reduction does not seem to move linearly with initial file size (or type, for that matter).

In other words, I have taken a JPG that started at ~20kb in size and compressed it to a ~6kb data array.

Again, I don't bring much to the conversation on specific compression algorithms, but in terms of practical use case, I have been able to create various GUIs as well as an assortment of interesting effects (at least for an MCU-driven TFT), with all assets stored in PROGMEM, that wouldn't have been possible without compression. And with no visible loss in quality.

@Frank: I see this discussion has shifted to the broader possibilities of incorporating (de)compression algorithms into the Teensy libraries, which is a great thing.

But I wonder if you had any thoughts as to why I was able to get the compressed bitmap method working in the ILI9341_t3 library, but not the text printing methods for antialiased fonts. I'm sure enough hours of hacking at it will eventually get it there, but I figured you might have a shortcut for me...
 
@plsco, I'm sorry, I can't help in this case. Currently I'm only reading this forum from time to time - I'm too busy with other, non-hobby-related things.
 
I very much agree with this. I've written a high-level method that handles animated GIFs (you need to manually break them up into their constituent frames first, so it's not terribly user-friendly), but it lags noticeably once the frame size exceeds roughly 40 pixels square.

Personally, I'd love to see "object" memory support added to ILI9341_t3. The difference between an amateurish and a professional-looking display can be as simple as the momentary flicker you see when you update a region by first calling fillRect(region, BLACK) and then redrawing - as opposed to overwriting the region in one shot, with each pixel going straight to its new color value.
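
As a rough sketch of that "one shot" update - this is not an ILI9341_t3 feature, just composing the region off-screen and pushing it with the existing writeRect(); the region size and the bar-graph content are placeholders:

Code:
#include <Arduino.h>
#include <ILI9341_t3.h>

#define TFT_CS 10
#define TFT_DC  9
ILI9341_t3 tft = ILI9341_t3(TFT_CS, TFT_DC);

const int16_t REG_X = 20, REG_Y = 30, REG_W = 80, REG_H = 40;
static uint16_t regionBuf[REG_W * REG_H];   // 6.4 KB of RAM for this region

void updateRegion(uint16_t value) {
  // 1. Clear the buffer in RAM (this replaces the visible fillRect).
  for (int i = 0; i < REG_W * REG_H; i++) regionBuf[i] = ILI9341_BLACK;

  // 2. Draw the new content into the buffer (placeholder: a bar graph).
  int16_t barW = map(value, 0, 1023, 0, REG_W);
  for (int16_t y = 0; y < REG_H; y++)
    for (int16_t x = 0; x < barW; x++)
      regionBuf[y * REG_W + x] = ILI9341_GREEN;

  // 3. One transfer to the panel: every pixel changes exactly once,
  //    so there is no black flash between "erase" and "redraw".
  tft.writeRect(REG_X, REG_Y, REG_W, REG_H, regionBuf);
}

void setup() {
  tft.begin();
  tft.fillScreen(ILI9341_BLACK);
}

void loop() {
  updateRegion(analogRead(A0));
  delay(50);
}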
 
Finally got the library working - thanks for all the feedback!
 
Another method of converting the images into the right format is using GIMP, a free cross-platform image editor - I made a post about that yesterday: 35575-Export-for-ILI9341_t3-with-GIMP. Additionally, the output contains the necessary width and height of the image, making it quite convenient to work with. I've created a pull request to add an explanation of how to use GIMP for exporting to the example.
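
For what it's worth, a sketch of how an exported image could then be drawn - assuming the GIMP workflow from that thread ends up giving you a width, a height and an RGB565 pixel array in a header; the names picture_width / picture_height / picture[] below are placeholders, not the literal output of the export:

Code:
#include <Arduino.h>
#include <ILI9341_t3.h>
#include "picture.h"   // hypothetical header produced by the GIMP export

#define TFT_CS 10
#define TFT_DC  9
ILI9341_t3 tft = ILI9341_t3(TFT_CS, TFT_DC);

void setup() {
  tft.begin();
  tft.fillScreen(ILI9341_BLACK);
  // Having the width and height shipped with the data makes the blit a one-liner.
  tft.writeRect(0, 0, picture_width, picture_height, picture);
}

void loop() {}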
 