Cake? One layer or five?

A project log for Multi-Function Selective Firing Neurons

An actualization of a thought experiment on how a neural network can be better modeled using neurotransmitters.

Rollyn01 • 05/11/2015 at 05:18

Well, as mentioned before, this project is meant to combine different types of neural networks into a single framework. However, the overall structure will be based on the multilayer perceptron. This topology relies on layers of neurons, typically visualized as vertical columns, where each neuron in a column is connected to every neuron in the adjacent columns. The first perceptron had only two layers: one for input, one for output. More layers were added later so the network could handle larger, more complex problems (voice analysis and filtering, data mining, etc.), with the additional layers being called "hidden" layers.
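For concreteness, here is a minimal Python sketch (not the project's actual code) of that topology. The layer sizes and the random initial weights are illustrative assumptions; the point is just the full connectivity between adjacent layers and how "hidden" layers slot in between input and output:

```python
import random

def build_mlp(layer_sizes):
    """Return one weight matrix per pair of adjacent layers, connecting
    every neuron in a layer to every neuron in the next (full connectivity)."""
    return [[[random.uniform(-1.0, 1.0) for _ in range(n_out)]
             for _ in range(n_in)]
            for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]

two_layer = build_mlp([20, 5])              # the original perceptron: input -> output
deeper    = build_mlp([20, 10, 10, 10, 5])  # same idea with three "hidden" layers added
```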

I have settled on three hidden layers (with 10 neurons each), plus two layers that serve as input and output. My project will not be using backpropagation. Any difference in activation will be fed forward through the network from the receptors (the input layer) until it reaches the actuators (the output layer), creating a feedback loop. To keep things relatively simple, there will be twenty receptors and five actuators. The receptors will only have their activation status changed, with their weights readable by other neurons; intervals are unnecessary for them, since intervals are not what activates the receptors. The actuators, meanwhile, will only have intervals that can be triggered to determine whether the actuator activates. They will not possess any weights themselves since, like the receptors, they are end-points of the chain where only the activation status matters.
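To make that data flow concrete, below is a hedged Python sketch of the feedforward pass just described. The interval bounds, the binary step activation, and storing weights in per-connection matrices are all assumptions for illustration, not the project's final design; in the model above, weights live with the upstream neuron and are read by the neurons downstream, but flat matrices keep the sketch short:

```python
import random

SIZES = [20, 10, 10, 10, 5]   # receptors, three hidden layers, actuators

# Per-connection weights between adjacent layers; random values are placeholders.
weights = [[[random.uniform(-1.0, 1.0) for _ in range(n_in)]
            for _ in range(n_out)]
           for n_in, n_out in zip(SIZES, SIZES[1:])]

# Each actuator has a trigger interval; these bounds are invented placeholders.
actuator_intervals = [(-0.5, 0.5)] * SIZES[-1]

def step(x):
    return 1.0 if x > 0.0 else 0.0   # assumed binary activation status

def forward(receptor_states):
    """Feed receptor activations forward; no backpropagation is involved.

    Receptors carry only an on/off status; actuators carry no weights and
    fire when their summed input lands inside their trigger interval."""
    acts = receptor_states
    for layer in weights[:-1]:                              # hidden layers
        acts = [step(sum(a * w for a, w in zip(acts, incoming)))
                for incoming in layer]
    sums = [sum(a * w for a, w in zip(acts, incoming))
            for incoming in weights[-1]]                    # into the actuators
    return [lo <= s <= hi for (lo, hi), s in zip(actuator_intervals, sums)]

fired = forward([random.choice([0.0, 1.0]) for _ in range(SIZES[0])])
print(fired)   # which of the five actuators triggered on this pass
```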