To level-set: I'm aware of the T4 thread on security, but rather than hijack that deep discussion, I had a few questions regarding a low-volume (in the dozens), limited-run, closed-source application for industrial use.

Assuming T4 code security is not yet implemented:

1. If I ship a device with my firmware installed, would a bad actor be able to extract it and install it on a fresh device? Assume they don't remove any chips from the T4 to connect a JTAG probe.

2. If I email a .hex file for the Teensy 4, would a bad actor be able to disassemble (or decompile) it easily?

2a. Is it possible to generate a .exe file with the .hex file embedded?

3. If code security gets implemented on the T4, at that point would I be able to freely distribute .hex files for firmware updates? Or is it likely that the T4 bootloader chip would need to be replaced, thus requiring a new Teensy?

4. If I add an authentication check at startup to verify the existence of "something special" on my hardware, would that be hard to bypass by modifying the code in the .hex file?

5. If that "something special" should be encrypted, is there a crypto library in the works that uses the i.MX's Cryptographic Acceleration? Or what would be a suitable library for generating random numbers and computing a hash?

I'm trying to understand the existing security risks for IP protection and how I should support a potential product based on the Teensy 4. I'm thinking that I shouldn't allow .hex files to be publicly available, so firmware updates (if ever needed) are done through an RMA process, which nobody likes.

If I can do a 'license key' check in hardware that isn't defeatable by modifying the .hex file, then I could consider letting .hex files out into the wild.