Neural Network Manager - NNM 1.23

About Neural Networks.

First of all, I'd like to quote some paragraphs from the FAQ file of the AI newsgroup:

An ANN is a network of many very simple processors ("units"), each possibly having a (small amount of) local memory. The units are connected by unidirectional communication channels ("connections"), which carry numeric (as opposed to symbolic) data. The units operate only on their local data and on the inputs they receive via the connections.

The design motivation is what distinguishes neural networks from other mathematical techniques: A neural network is a processing device, either an algorithm, or actual hardware, whose design was motivated by the design and functioning of human brains and components thereof.

Most neural networks have some sort of "training" rule whereby the weights of connections are adjusted on the basis of presented patterns. In other words, neural networks "learn" from examples, just like children learn to recognize dogs from examples of dogs, and exhibit some structural capability for generalization. Neural networks normally have great potential for parallelism, since the computations of the components are independent of each other.
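
As a small illustration of both points (this is my own sketch in Python, not code from NNM or from the FAQ), a single unit computes a weighted sum of its numeric inputs, and a training rule such as the classical delta rule nudges the weights towards a presented target:

  import math

  def unit_output(inputs, weights, bias):
      # The unit only sees its own weights (local data) and the numeric
      # values arriving on its incoming connections.
      total = bias + sum(w * x for w, x in zip(weights, inputs))
      return 1.0 / (1.0 + math.exp(-total))      # sigmoid squashing

  def delta_rule_step(weights, inputs, target, output, rate=0.5):
      # Adjust each weight a little in the direction that reduces the
      # difference between the desired and the actual output.
      error = target - output
      return [w + rate * error * x for w, x in zip(weights, inputs)]

  w = [0.2, -0.4]
  out = unit_output([1.0, 0.0], w, bias=0.1)     # present a pattern
  w = delta_rule_step(w, [1.0, 0.0], target=1.0, output=out)
  print(out, w)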

For me the most relevant feature of Artificial Neural Networks is that they are modelled after biological systems, so their structure is similar to that of the human brain. I make pieces of art with Neural Networks. This gives me a new way of examining thoughts, especially very simple ones.

About NNM 1.23.

This program has been designed to let you play a little with Neural Networks. You can create simple nets, give them collections of samples and follow the way they learn. Networks can be saved to disk together with their samples and reloaded later.

NNM 1.23 always works with a three-layer network in which each neuron is connected to every neuron in the next layer. The program uses the backpropagation algorithm. It is easy to get information about each neuron.
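
To give an idea of what happens inside such a network, here is a compact sketch of a three-layer, fully connected net trained with backpropagation. It is written in Python with NumPy, leaves out bias terms, and the sigmoid activation and learning rate are my assumptions; it illustrates the algorithm and is not NNM's actual code.

  import numpy as np

  def sigmoid(x):
      return 1.0 / (1.0 + np.exp(-x))

  class ThreeLayerNet:
      def __init__(self, n_in, n_hidden, n_out, rate=0.5, seed=0):
          rng = np.random.default_rng(seed)
          # Every input neuron is connected to every hidden neuron,
          # and every hidden neuron to every output neuron.
          self.w1 = rng.uniform(-1.0, 1.0, (n_in, n_hidden))
          self.w2 = rng.uniform(-1.0, 1.0, (n_hidden, n_out))
          self.rate = rate

      def forward(self, x):
          self.x = np.asarray(x, dtype=float)
          self.h = sigmoid(self.x @ self.w1)        # hidden layer activations
          self.o = sigmoid(self.h @ self.w2)        # output layer activations
          return self.o

      def backward(self, target):
          t = np.asarray(target, dtype=float)
          # Error signal at the outputs (includes the sigmoid derivative).
          d_out = (t - self.o) * self.o * (1.0 - self.o)
          # Propagate the error signal back to the hidden layer.
          d_hid = (d_out @ self.w2.T) * self.h * (1.0 - self.h)
          # Adjust the weights along the error gradient.
          self.w2 += self.rate * np.outer(self.h, d_out)
          self.w1 += self.rate * np.outer(self.x, d_hid)
          return 0.5 * float(np.sum((t - self.o) ** 2))

Training simply repeats forward() and backward() over all samples until the per-pattern error stays below the requested limit.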

Use the help command to get an online description of the commands and their syntax. In the next section I give some examples of how to use them.

Please send bug reports to janna@osiris.elte.hu

IMPORTANT:
IT IS A FREE PROGRAM; YOU CAN COPY AND USE IT,
BUT ONLY AT YOUR OWN RISK.
THERE IS NO GUARANTEE OF ANY KIND FOR THIS PROGRAM
OR FOR ANY DATA PROCESSED BY THE PROGRAM.

Examples:

A simple test

This is a simple network to test our system. There are four neurons in the input and output layers and two neurons in the hidden layer. The aim of this network is simply to compress the four input values into the two hidden neurons and then recreate the original values on the output neurons.
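
Two hidden neurons are enough for this because four mutually exclusive patterns can be told apart by just two values (2 x 2 = 4 combinations). A hand-written encoder/decoder pair in Python (only an illustration of this counting argument, nothing to do with the weights NNM will actually learn) makes the idea explicit:

  codes = {"0001": (0, 0), "0010": (0, 1), "0100": (1, 0), "1000": (1, 1)}

  def encode(pattern):                   # 4 input values -> 2 "hidden" values
      return codes[pattern]

  def decode(hidden):                    # 2 "hidden" values -> 4 output values
      return {v: k for k, v in codes.items()}[hidden]

  assert decode(encode("0100")) == "0100"

A trained 4-2-4 network typically ends up with a similar binary-like code on its hidden neurons.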

Let's create the network:


>net 4 2 4
Then we should define the samples like this:

>sample 4
>>0001 0001
>>0010 0010
>>0100 0100
>>1000 1000

  4 samples have been defined
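
Each sample line is simply an input pattern followed by the desired output pattern, separated by a space. In Python terms (just to show the format; this is not NNM's actual parser):

  def parse_sample(line):
      # "0001 0001" -> input vector and desired output vector
      inp, target = line.split()
      return [int(c) for c in inp], [int(c) for c in target]

  print(parse_sample("0010 0010"))       # ([0, 0, 1, 0], [0, 0, 1, 0])
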
Using the info command we can check if everything is OK:

>info net

  Number of neurons: 10
  Number of input neurons: 4
  Number of hidden neurons: 2
  Number of output neurons: 4
  Number of samples: 4
It seems to be correct! Then let's start the learning process (the value is the expected maximal error of the network):

>learn 0.01
This is a very simple network, so on an i386 PC it will take only a few seconds. After that we can try the result with the 'run' command:

>run 0001

  Output: 0001
  Error: 0.0098789
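
The manual does not say exactly how this error value is computed. A common choice for such a per-pattern error is the summed squared difference between the actual and the desired outputs, which would look like this (an assumption on my part, not NNM's documented formula):

  def pattern_error(output, target):
      # Summed squared difference between actual and desired outputs.
      return sum((o - t) ** 2 for o, t in zip(output, target))

  print(pattern_error([0.02, 0.01, 0.03, 0.97], [0, 0, 0, 1]))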

XOR gate:

Teaching some neurons to behave like an XOR gate is also very easy:

>net 2 2 1
>sample 4
>>00 0
>>01 1
>>10 1
>>11 0
>learn 0.01
Notice that 01 and 10 have to be entered as two separate samples, even though both give the same output!
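
The reason the hidden layer matters here is that XOR is the classic example of a problem a single layer of weights cannot solve: no weighted sum of the two inputs can separate 01 and 10 from 00 and 11. A small brute-force check in Python (purely illustrative, independent of NNM) shows that no single threshold unit over a coarse weight grid reproduces the table above:

  def single_unit(x1, x2, w1, w2, b):
      return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

  XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
  grid = [i / 2.0 for i in range(-10, 11)]       # weights and bias from -5 to 5
  found = any(
      all(single_unit(x1, x2, w1, w2, b) == out
          for (x1, x2), out in XOR.items())
      for w1 in grid for w2 in grid for b in grid
  )
  print(found)    # prints False: one unit alone cannot compute XOR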