
Stake in the ground

A project log for uNeural

Fixed point neural network library suitable for MCU use

Jeff Ciesielski 06/09/2016 at 11:53
Just recently, I pushed a changeset that fixed the last of the major outstanding issues preventing usefulness (a memory overrun causing training to fail on some systems), so as of now the library is more-or-less functional. The provided example (which can be built with `make example`) shows off the current level of functionality.

Granted, the example is a fairly cheesy 3-4-1 network, but in principle, more complex nets should function in a similar fashion.

I have two major revisions planned for the near future:


1. Re-ordering the internal data structure slightly to allow an end consumer of the trained network to consume it with zero a priori knowledge of the structure, aside from the number of inputs and outputs.

Currently, the user must always specify the structure and attach it to the net. In the future, I think that the data structure generator will be a separate component, and the resulting data will be attached to the net, containing all information about neuron type, interconnects, etc. This will allow for end applications that simply load the model and execute it, rather than being tied to specific layouts. (This will be especially important for non-feedforward networks that have complex/recurrent interconnectivity between neurons.)

2. Adding Nim Bindings

As much as I love C, it leaves a lot to be desired from an ease-of-development standpoint. As I've been working with Nim quite a lot, I've grown to really love its simple C FFI and exceptional metaprogramming capabilities and performance. As the complexity of networks increases, so does the complexity of the code to generate them. As a result, I think that providing high-level bindings (that are still 100% compatible with the C core) will make creation and training of more complicated networks much easier. As a first proof of concept, I plan to implement a network trained on the MNIST dataset to test the performance of the core and to help finalize the API.


I expect that the top-level API will be fairly fluid in the near future until the internal structure settles down, but I hope to reach some sort of stability milestone by Q4 '16.
