# NeuralNetworks

163 commits to master since this release
## 🛠️ Fixed

- a2707b8 Crucial `Softmax` issue when used with `ACTIVATION__PER_LAYER` and not `ALL_ACTIVATION_FUNCTIONS`
- 7b52a56 Potential issue with `CATEGORICAL_CROSS_ENTROPY` & `BINARY_CROSS_ENTROPY` when you `USE_64_BIT_DOUBLE` with `REDUCE_RAM_WEIGHTS_LVL2`
## ✨ Added

- 64dcb9d FRAM examples
- c572333 Example for SD migration to v3.0.0
- dab6f55 Support for NN execution (partially) via external FRAM
## ⚙️ Improved

- a03b160 Removed unnecessary `me->i_j++` logic
- 26d2f20 Logic related to `int` and `unsigned int`
- 2912a86 Removed unnecessary EEPROM logic affecting sketch size
- 6432955 Backpropagation algorithm, cutting flash memory usage by up to 200 bytes
- 9ac2b51 Prioritized "reduced logic" over performance in `FeedForward_Individual()`
## ⚠️ Changed

- d4ce5e0 Optimized SD `load()` & `save()`
> **Warning**
> The previous `load()` & `save()` implementations (although perfectly working) had significant design flaws; the 3.0.0 release brings much-improved versions of them. Note the breaking change! I've included a clear migration guide to help easily convert old NN-files to the new format via just a simple sketch. Alternatively, I'm providing limited backwards compatibility through `save_old()` and `load_old()`. However, please note that these legacy methods won't receive further updates or improvements over time.
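The conversion itself can be sketched roughly as "load with the legacy method, re-save with the new one". The snippet below is only an illustrative Arduino sketch, not the official migration example (see commit c572333 for that): the header name, constructor, file paths, and the exact `load_old()`/`save()` signatures are assumptions and may differ from the library's actual API.

```cpp
// Hypothetical migration sketch (assumed API and paths; the bundled
// SD-migration example from commit c572333 is the authoritative version).
#include <SD.h>
#include <NeuralNetwork.h>

void setup() {
  Serial.begin(9600);
  SD.begin();

  // Read the NN-file stored in the pre-3.0.0 format (hypothetical path).
  NeuralNetwork NN;
  NN.load_old("/old_model.bin");

  // Write it back out in the new v3.0.0 format.
  NN.save("/new_model.bin");

  Serial.println("Migration done");
}

void loop() {}
```

After running a one-off sketch like this, the converted file can be used with the new `load()` going forward, without relying on the legacy `load_old()`/`save_old()` methods.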