
uNeural

Fixed-point neural network library suitable for MCU use

With all the hype surrounding the internet of things and machine learning, I just figured, hey, why not both?

This library currently supports only fully connected feed-forward networks; however, I plan to rework the internal data structure to support recurrent and convolutional nets as well (or any arbitrary layout).

Link to Source: https://github.com/Jeff-Ciesielski/libuneural

  • Stake in the ground

    Jeff Ciesielski · 06/09/2016 at 11:53

    Just recently, I pushed a changeset that fixed the last of the major outstanding issues preventing usefulness (a memory overrun that caused training to fail on some systems), so as of now the library is more or less functional. The provided example (which can be built with `make example`) shows off the current level of functionality, namely:

    • Creating a new network
    • Initializing that network with random weights
    • Training the network against a known dataset

    Granted, the example is a fairly cheesy 3-4-1 network, but in principle, more complex nets should function in a similar fashion (a rough sketch of the same flow is shown below).
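    To make that flow concrete, here is a minimal, self-contained sketch of a fixed-point 3-4-1 forward pass in plain C. It deliberately does not use the libuneural API (all of the types and functions below are invented for illustration) and omits the training step; it only shows the general shape of building a small fully connected net in Q16.16 fixed point, giving it random weights, and evaluating it.

```c
/*
 * Illustrative sketch only: a 3-4-1 fully connected net evaluated in
 * Q16.16 fixed point. None of these names come from libuneural.
 */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

typedef int32_t q16_t;                     /* Q16.16 fixed-point value   */
#define Q16_ONE        (1 << 16)
#define Q16_FROM_F(f)  ((q16_t)((f) * Q16_ONE))
#define Q16_TO_F(q)    ((float)(q) / Q16_ONE)

/* Multiply two Q16.16 values, widening to 64 bits for the product. */
static q16_t q16_mul(q16_t a, q16_t b)
{
    return (q16_t)(((int64_t)a * b) >> 16);
}

/* Cheap hard-clamp activation: saturate the output to [0, 1]. */
static q16_t q16_act(q16_t x)
{
    if (x < 0)       return 0;
    if (x > Q16_ONE) return Q16_ONE;
    return x;
}

/* One fully connected layer: out[j] = act(sum_i w[j][i] * in[i] + b[j]).
 * Weights are stored row-major; sums stay small here, so a 32-bit
 * accumulator is sufficient for this toy example. */
static void layer_forward(const q16_t *w, const q16_t *b,
                          const q16_t *in, q16_t *out,
                          int n_in, int n_out)
{
    for (int j = 0; j < n_out; j++) {
        q16_t acc = b[j];
        for (int i = 0; i < n_in; i++)
            acc += q16_mul(w[j * n_in + i], in[i]);
        out[j] = q16_act(acc);
    }
}

int main(void)
{
    /* 3-4-1 network: weights initialized with small random values
     * in roughly [-0.5, 0.5), biases zeroed. */
    q16_t w1[4 * 3], b1[4], w2[1 * 4], b2[1];
    for (int i = 0; i < 4 * 3; i++) w1[i] = (rand() % Q16_ONE) - Q16_ONE / 2;
    for (int i = 0; i < 4; i++)     b1[i] = 0;
    for (int i = 0; i < 4; i++)     w2[i] = (rand() % Q16_ONE) - Q16_ONE / 2;
    b2[0] = 0;

    q16_t in[3]  = { Q16_FROM_F(0.25f), Q16_FROM_F(0.5f), Q16_FROM_F(0.75f) };
    q16_t hid[4], out[1];

    layer_forward(w1, b1, in,  hid, 3, 4);  /* input  -> hidden */
    layer_forward(w2, b2, hid, out, 4, 1);  /* hidden -> output */

    printf("output = %f\n", Q16_TO_F(out[0]));
    return 0;
}
```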

    I have two major revisions planned for the near future:


    1. Re-ordering the internal data structure slightly so that an end consumer of the trained network can use it with no a-priori knowledge of its structure, aside from the number of inputs and outputs.

    Currently, the user must always specify the structure and attach it to the net. In the future, I think the data structure generator will be a separate component, and the resulting data will be attached to the net, containing all information about neuron type, interconnects, etc. This will allow end applications to simply load the model and execute it, rather than being tied to specific layouts. (This will be especially important for non-feed-forward networks that have complex/recurrent interconnectivity between neurons.) A rough sketch of what such a self-describing descriptor could look like follows below.
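    Purely as an illustration (this is not libuneural's actual or planned format), a self-describing model blob could start with a small header carrying the input/output counts, followed by per-layer records that name the neuron type, layer size, and where that layer's weights live, so a consumer can walk the whole thing with no compile-time knowledge of the topology:

```c
/*
 * Hypothetical layout for a self-describing trained-network blob.
 * Invented for illustration; it does not reflect libuneural's format.
 */
#include <stdint.h>

enum un_neuron_type {            /* activation / neuron kind per layer   */
    UN_NEURON_SIGMOID = 0,
    UN_NEURON_RELU    = 1,
    /* ... */
};

struct un_layer_desc {
    uint16_t neuron_type;        /* enum un_neuron_type                  */
    uint16_t num_neurons;        /* neurons in this layer                */
    uint32_t weight_offset;      /* offset of this layer's weights into
                                    the shared fixed-point weight pool   */
};

struct un_model_header {
    uint32_t magic;              /* format identifier / version          */
    uint16_t num_inputs;         /* all a consumer needs to know up front */
    uint16_t num_outputs;
    uint16_t num_layers;         /* layer descriptors follow the header  */
    uint16_t reserved;
    /* struct un_layer_desc layers[num_layers], then the weight pool     */
};
```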

    2. Adding Nim Bindings

    As much as I love C, it leaves a lot to be desired from an ease-of-development standpoint. As I've been working with Nim quite a lot, I've grown to really love its simple C FFI, exceptional metaprogramming capabilities, and performance. As the complexity of networks increases, so does the complexity of the code needed to generate them, so I think that providing high-level bindings (that remain 100% compatible with the C core) will make creating and training more complicated networks much easier. As a first proof of concept, I plan to implement a network based on the MNIST dataset to test the performance of the core and to help finalize the API.


    I expect that the top-level API will be fairly fluid in the near future until the internal structure calms down, but I hope to reach some sort of stability milestone by Q4 '16.
