An ANN is a network of many very simple processors ("units"), each possibly having a (small amount of) local memory. The units are connected by unidirectional communication channels ("connections"), which carry numeric (as opposed to symbolic) data. The units operate only on their local data and on the inputs they receive via the connections.
The design motivation is what distinguishes neural networks from other mathematical techniques: a neural network is a processing device, either an algorithm or actual hardware, whose design was motivated by the design and functioning of human brains and components thereof.
Most neural networks have some sort of "training" rule whereby the weights of connections are adjusted on the basis of presented patterns. In other words, neural networks "learn" from examples, just as children learn to recognize dogs from examples of dogs, and exhibit some structural capability for generalization. Neural networks normally have great potential for parallelism, since the computations of the components are independent of each other.
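To make the idea of a training rule concrete, here is a minimal, self-contained Python sketch of one such rule (the classical perceptron update, which is not necessarily the rule NNM uses): a single unit's weights are nudged after every presented pattern, in proportion to the error the unit made on it.

    def train_single_unit(samples, lr=0.1, epochs=100):
        # one unit with two weights and a bias, all starting at zero
        w = [0.0, 0.0]
        b = 0.0
        for _ in range(epochs):
            for (x1, x2), target in samples:
                out = 1.0 if w[0] * x1 + w[1] * x2 + b > 0 else 0.0
                err = target - out
                # adjust each weight in proportion to its input and the error
                w[0] += lr * err * x1
                w[1] += lr * err * x2
                b += lr * err
        return w, b

    # logical OR, "learned" from four examples rather than programmed explicitly
    samples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
    w, b = train_single_unit(samples)
    for (x1, x2), target in samples:
        out = 1.0 if w[0] * x1 + w[1] * x2 + b > 0 else 0.0
        print((x1, x2), out, target)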
For me the most relevant feature of Artificial Neural Networks is that they are modelled after biological systems, so their structure is similar to that of the human brain. I make pieces of art with neural networks. This gives me a new way of examining thoughts, especially very simple ones.
NNM 1.23 always works with a three-layer network in which each neuron is connected to every neuron in the next layer. The program uses the backpropagation algorithm. It is easy to get information about each neuron.
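The manual does not show NNM's source, but the scheme it describes (three fully connected layers trained by backpropagation down to a given maximal error) can be sketched roughly as follows in Python. The class name ThreeLayerNet, the sigmoid activation, the learning rate and the epoch limit are my own assumptions for illustration, not details of NNM itself.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class ThreeLayerNet:
        # fully connected input-hidden-output network, as NNM's layout is described
        def __init__(self, n_in, n_hidden, n_out, seed=0):
            rng = np.random.default_rng(seed)
            # small random initial weights, one bias per hidden/output neuron
            self.w1 = rng.uniform(-0.5, 0.5, (n_in, n_hidden))
            self.b1 = np.zeros(n_hidden)
            self.w2 = rng.uniform(-0.5, 0.5, (n_hidden, n_out))
            self.b2 = np.zeros(n_out)

        def run(self, x):
            # forward pass: returns the hidden and output activations
            x = np.asarray(x, dtype=float)
            hidden = sigmoid(x @ self.w1 + self.b1)
            output = sigmoid(hidden @ self.w2 + self.b2)
            return hidden, output

        def learn(self, samples, max_error=0.01, lr=0.5, max_epochs=100000):
            # present the samples repeatedly until the worst output error is small
            for _ in range(max_epochs):
                worst = 0.0
                for x, target in samples:
                    x = np.asarray(x, dtype=float)
                    target = np.asarray(target, dtype=float)
                    hidden, output = self.run(x)
                    # output-layer error term (sigmoid derivative is out * (1 - out))
                    delta_out = (target - output) * output * (1.0 - output)
                    # hidden-layer error term, propagated back through w2
                    delta_hid = (delta_out @ self.w2.T) * hidden * (1.0 - hidden)
                    # gradient-descent weight updates
                    self.w2 += lr * np.outer(hidden, delta_out)
                    self.b2 += lr * delta_out
                    self.w1 += lr * np.outer(x, delta_hid)
                    self.b1 += lr * delta_hid
                    worst = max(worst, float(np.max(np.abs(target - output))))
                if worst < max_error:
                    break
            return worst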
Use the help command to get an online description of the commands and their syntax. In the next section I will give some examples of how to use them.
Please send bug reports to janna@osiris.elte.hu
Let's create the network:
>net 4 2 4
Then we define the samples like this:
>sample 4
>>0001 0001
>>0010 0010
>>0100 0100
>>1000 1000
4 samples have been defined
Using the info command we can check if everything is OK:
>info net
Number of neurons: 10
Number of input neurons: 4
Number of hidden neurons: 2
Number of output neurons: 4
Number of samples: 4
It seems to be correct! Then let's start the learning process (the value is the expected maximal error of the network):
>learn 0.01
This is a very simple network, so on an i386 PC learning will take only a few seconds. After that we can try the result with the 'run' command:
>run 0001
Output: 0001
Error: 0.0098789
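For comparison, the same 4-2-4 session could be reproduced roughly with the ThreeLayerNet sketch above (assuming that class is in scope); the exact error values will differ from NNM's, and this tight 4-2-4 encoder may need a larger epoch limit or a different seed before it reaches an error of 0.01.

    # the four samples from the session above: each pattern maps to itself
    samples = [([0, 0, 0, 1], [0, 0, 0, 1]),
               ([0, 0, 1, 0], [0, 0, 1, 0]),
               ([0, 1, 0, 0], [0, 1, 0, 0]),
               ([1, 0, 0, 0], [1, 0, 0, 0])]
    net = ThreeLayerNet(4, 2, 4)               # net 4 2 4
    print(net.learn(samples, max_error=0.01))  # learn 0.01
    print(net.run([0, 0, 0, 1])[1].round(2))   # run 0001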
>net 2 2 1
>sample 4
>>00 0
>>01 1
>>10 1
>>11 0
>learn 0.01
Notice that we have to put 01 and 10 in different samples!
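The same sketch handles the XOR samples; XOR is the classic problem that a single layer of weights cannot solve, which is why the hidden layer in the 2-2-1 network matters here. Again, this only illustrates the idea and is not NNM's code.

    samples = [([0, 0], [0]),
               ([0, 1], [1]),
               ([1, 0], [1]),
               ([1, 1], [0])]
    net = ThreeLayerNet(2, 2, 1)          # net 2 2 1
    net.learn(samples, max_error=0.01)    # learn 0.01
    for x, _ in samples:
        print(x, net.run(x)[1].round(2))  # should be close to 0, 1, 1, 0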