Neural Network

Posted by Bharath Raj A

A neural network is a computational structure inspired by the study of
biological neural processing. There are many different types of neural
networks, from relatively simple to very complex, just as there are many
theories on how biological neural processing works. A layered feed-forward
neural network has layers, or subgroups of processing elements. A layer of
processing elements makes independent computations on data that it receives
and passes the results to another layer. The next layer may in turn make its
independent computations and pass on the results to yet another layer.
Finally, a subgroup of one or more processing elements determines the output.
Each processing element makes its computation based upon a weighted sum of its
inputs. The first layer is the input layer and the last is the output layer; the layers
placed between them are the hidden layers. The processing elements are viewed as units
similar to the neurons in a human brain, and hence they are referred to as cells,
neuromimes, or artificial neurons. A threshold function is sometimes used to qualify
the output of a neuron in the output layer. Even though our subject matter deals with
artificial neurons, we will simply refer to them as neurons. Synapses between neurons
are referred to as connections, which are represented by the edges of a directed graph
in which the nodes are the artificial neurons.
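To make the weighted-sum idea concrete, here is a minimal sketch of a forward pass through a small layered feed-forward network. The layer sizes, the random weights, the tanh activation in the hidden layer, and the step threshold on the output are illustrative assumptions for this sketch, not details taken from any particular implementation.

```python
import numpy as np

def step(x):
    # Threshold function: output 1 if the weighted sum is positive, else 0.
    return (x > 0).astype(float)

def forward(x, w_hidden, b_hidden, w_out, b_out):
    # Hidden layer: each unit computes a weighted sum of the inputs,
    # then applies an activation (tanh chosen here for illustration).
    hidden = np.tanh(x @ w_hidden + b_hidden)
    # Output layer: weighted sum of hidden activations, qualified by a threshold.
    return step(hidden @ w_out + b_out)

# Hypothetical example: 3 inputs -> 4 hidden units -> 1 output unit.
rng = np.random.default_rng(0)
w_hidden = rng.normal(size=(3, 4))   # input-to-hidden connection weights
b_hidden = np.zeros(4)
w_out    = rng.normal(size=(4, 1))   # hidden-to-output connection weights
b_out    = np.zeros(1)

x = np.array([0.5, -1.0, 2.0])       # one input pattern
print(forward(x, w_hidden, b_hidden, w_out, b_out))
```

Each connection in the directed graph described above corresponds to one entry in these weight matrices: a weight from input unit i to hidden unit j is simply `w_hidden[i, j]`.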

Fig 01: Neural network