#### mjs513

##### Senior Member+

Been curious about neural networks for a while (still don't have a clue how they work yet, but will one of these days) and I came across the CMSIS-NN library that is under CMSIS-5. According to the hype:

CMSIS-NN library consists of two parts: NNFunctions and NNSupportFunctions. NNFunctions include the functions that implement popular neural network layer types, such as convolution, depthwise separable convolution, fully-connected (i.e. inner-product), pooling and activation. These functions are used by the application code to implement the neural network inference applications. The kernel APIs are also kept simple, so that it can be easily retargeted for any machine learning framework. NNSupportFunctions include different utility functions, such as data conversion and activation function tables, which are used in NNFunctions. These utility functions can also be used by the application code to construct more complex NN modules, e.g. Long Short-Term Memory (LSTM) or Gated Recurrent Unit (GRU).

So I got curious and decided to see if I could get it working on a Teensy 3.6. Bottom line up front: I did manage to get it working on the Teensy 3.6 with one of the provided examples. It does produce output, but I now have to figure out whether it's the correct output, and I can't find what it should be.

Anyway, the first thing you have to do is update the DSP math library to the latest CMSIS version, which I did following the instructions on the forum. Then I created a CMSIS-NN library for the Teensy with the include files and source files in one spot. Then I modified the examples only slightly and it worked.

If anyone is interested in the CMSIS-NN library, I can put together consolidated instructions for updating the DSP library to CMSIS-DSP v5 and post the library and a working example on GitHub. I'll probably do this tomorrow and will post back when completed.

v/R

Mike

UPDATE: Here is the link to GitHub: https://github.com/mjs513/CMSIS-NN--Neural-Network--for-Teensy-3.6. I'll update the readme shortly.