Last week, I was able to run
mlpack_knn on a Raspberry Pi. The binary
size was about 4.7 MB. That footprint is still considerable for embedded
systems, especially if you only have dozens of megabytes for the entire firmware.
In order to reduce the size of the binaries, I had to look inside these ELF files
with the Bloaty size profiler. The profiler (can be found
here) i) is
simple to use, and ii) provides the size footprint for each symbol used inside the binary.
Since I am linking statically, we can see how dependencies shape the size of the binary. For example, mlpack uses boost serialization to serialize models, which accounts for a considerable amount of the binary size.
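For reference, this kind of analysis can be sketched with Bloaty's standard data sources (the binary name comes from this post; whether `compileunits` works depends on the binary having debug info):

```shell
# Break the binary down by symbol to see which functions dominate the size.
bloaty -d symbols mlpack_knn

# Group by compile unit instead, to attribute size to source files
# and, with static linking, to the dependencies they came from.
bloaty -d compileunits mlpack_knn

# Diff mode: compare a new binary against an old one after a size optimization.
bloaty mlpack_knn_new -- mlpack_knn_old
```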
In addition, if you are able to cross compile and link statically, you can start from an empty example with an mlpack main inside. If you compile an empty mlpack main, the binary size is about 1.8 MB. Add two lines to import a kNN model using program options, and you will see the size go up to 4.3 MB. The cost of using boost serialization is 1.5 MB just to import the model.
As usual, to follow live updates on the project, you can see my pull request here.
Here are several points I achieved last week:
- Added all possible flags to reduce the binary size
- Used the Bloaty profiler to analyze the size footprint of symbols
- Added marginal support for the Epiphany, i686, MIPS64, Xtensa, and RISC-V architectures
- Opened an issue here to discuss the removal of boost
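As a sketch of the size-reduction flags mentioned above, a typical GCC setup looks like this (the source file name is hypothetical, and the exact flag set depends on the toolchain and on whether the code relies on exceptions and RTTI):

```shell
# Optimize for size, and place each function/data item in its own section
# so the linker can discard anything unreferenced.
CXXFLAGS="-Os -ffunction-sections -fdata-sections"
# Ask the linker to garbage-collect those unused sections.
LDFLAGS="-Wl,--gc-sections"

g++ $CXXFLAGS $LDFLAGS -static -o mlpack_knn knn_main.cpp

# Strip symbol tables and debug info from the final binary.
strip --strip-all mlpack_knn
```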
As noticed above, the cost of boost serialization is considerable, and this is only one of the issues it causes. In fact, for any embedded system project, you need to cross compile all dependencies to create a static binary. Therefore, dependencies increase the time needed to build the project for a specific architecture, and they add more code that downloads, cross-compiles, and links these dependencies.
For the next week, I will remove two boost dependencies (serialization and program options) and replace them with header-only libraries; this will remove the need to link against these dependencies.
For any suggestions, just leave a comment on my pull request or catch me on IRC.