@@ -14,18 +14,24 @@ particularly well suited for low-complexity and low-power classification tasks.
 It can be combined with feature preprocessing, including neural networks to address more complex tasks.
 
 ## Status
-**Minimally useful**
+**Minimally useful, on some MicroPython ports**
 
-- Has been tested on `armv6m` (RP2040), `xtensawin` (ESP32) and `x64` (Unix port)
-- Pre-built modules are available for the most common architectures/devices
+- Tested *working* on `x64` (Unix port) and `armv7emsp` (Cortex M4F/M7 / STM32).
+- **Not working** on `armv6m` (Cortex M0 / RP2040). [Issue](https://github.com/emlearn/emlearn-micropython/issues/14)
+- **Not working** on `xtensawin` (ESP32). [Issue](https://github.com/emlearn/emlearn-micropython/issues/12)
 
 ## Features
 
 - Classification with [RandomForest](https://en.wikipedia.org/wiki/Random_forest)/DecisionTree models
 - Classification and on-device learning with [K-Nearest Neighbors (KNN)](https://en.wikipedia.org/wiki/K-nearest_neighbors_algorithm)
+- Classification with Convolutional Neural Network (CNN), using the [TinyMaix](https://github.com/sipeed/TinyMaix/) library.
+- Fast Fourier Transform (FFT) for feature preprocessing, or general DSP
+- Infinite Impulse Response (IIR) filters for feature preprocessing, or general DSP
+- Clustering using K-means
 - Installable as a MicroPython native module. No rebuild/flashing needed
-- Models can be loaded at runtime from a .CSV file in disk/flash
+- Models can be loaded at runtime from a file in disk/flash
 - Highly efficient. Inference times down to 100 microseconds, RAM usage <2 kB, FLASH usage <2 kB
+- Pre-built binaries available for most architectures.
 
 
 ## Prerequisites
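One of the new Feature bullets adds IIR filters for feature preprocessing. As a plain-Python sketch of what such a filter computes (this is illustrative only, not the emlearn-micropython API; the function name and coefficient layout are assumptions):

```python
# Hypothetical sketch: one direct-form-II-transposed IIR biquad,
# the kind of filter the "Infinite Impulse Response (IIR)" bullet refers to.
# Not the emlearn-micropython API.

def iir_biquad(samples, b, a):
    """Filter samples with feedforward b = (b0, b1, b2) and feedback a = (1, a1, a2)."""
    b0, b1, b2 = b
    _, a1, a2 = a
    z1 = z2 = 0.0  # filter state (delay elements)
    out = []
    for x in samples:
        y = b0 * x + z1
        z1 = b1 * x - a1 * y + z2
        z2 = b2 * x - a2 * y
        out.append(y)
    return out

# Identity coefficients (b0=1, all others 0) pass the signal through unchanged.
print(iir_biquad([1.0, 2.0, 3.0], (1.0, 0.0, 0.0), (1.0, 0.0, 0.0)))  # → [1.0, 2.0, 3.0]
```

In a feature-preprocessing pipeline, a filter like this would typically run over raw sensor samples before they are handed to the classifier.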
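The "Clustering using K-means" bullet can likewise be illustrated with a minimal plain-Python sketch of one Lloyd-style iteration on 1-D data (illustrative only, not the emlearn-micropython API; names are assumptions):

```python
# Hypothetical sketch: a single K-means iteration on 1-D points.
# Not the emlearn-micropython API.

def kmeans_step(points, centroids):
    """Assign each point to its nearest centroid, then recompute centroid means."""
    clusters = [[] for _ in centroids]
    for p in points:
        nearest = min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
        clusters[nearest].append(p)
    # Empty clusters keep their previous centroid.
    return [sum(c) / len(c) if c else centroids[i]
            for i, c in enumerate(clusters)]

points = [0.0, 0.2, 0.4, 9.6, 9.8, 10.0]
print(kmeans_step(points, [0.0, 10.0]))  # ≈ [0.2, 9.8]
```

Repeating the step until the centroids stop moving gives the usual K-means loop.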