FAQ
Hi Jim,

Did you get my direct email on Friday?

There are a couple of problems here:

1) You have a network of 3 layers, with a single node in each layer. A
neural net works by changing the weights between arrays of nodes. Having a
single node in each layer means you have something like this:

o->o->o

Feeding activation into the left, it will flow through to the right, but
when you adjust the weights you only have 1 pathway to choose from, so
the network won't learn. I've posted a very simple intro to how neural
nets work at:

http://www.g0n.net/nnflex/SimpleNNIntro.pdf

There's also an animated gif at http://www.g0n.net/nnflex/x.gif that might
help to make the structure of the net clear.
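To give each layer several nodes, you add them with add_layer. Here's a sketch (the 4-node counts are illustrative, matching the encoding discussed in point 2 below; check the AI::NNFlex docs for the exact options your version supports):

```perl
use strict;
use AI::NNFlex::momentum;

# A 4-4-4 network: four nodes per layer gives backprop a full
# weight matrix between layers to adjust, not a single o->o->o path.
my $network = AI::NNFlex::momentum->new(learningrate => 0.1,
                                        bias         => 1,
                                        momentum     => 0.6);

$network->add_layer(nodes => 4, activationfunction => "tanh");   # input
$network->add_layer(nodes => 4, activationfunction => "tanh");   # hidden
$network->add_layer(nodes => 4, activationfunction => "linear"); # output

$network->init();
```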

2) The second problem is that you are feeding analogue numbers into the net:
[0],[0],
[1/3],[1/9],
[2/3],[4/9],
[3/3],[9/9]

You need to have a number of input units (see point 1), and translate your
data into a binary format that a neural net can understand, like:

[0,0,0,0],[0,0,0,0],
[0,0,0,1],[0,0,0,1],
[0,0,1,0],[0,1,0,0],
[0,0,1,1],[1,0,0,1]

(These are 0->0, 1->1, 2->4 & 3->9)

etc. This assumes you have 4 input & 4 output nodes (and you'll probably
need 4 hidden nodes as well), which means you can represent numbers up to
15.
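That translation is easy to generate in plain Perl. A sketch (to_bits is a hypothetical helper for illustration, not part of NNFlex):

```perl
use strict;
use warnings;

# Convert an integer to a list of bits, most significant first,
# padded to $width bits (4 bits covers 0..15).
sub to_bits {
    my ($n, $width) = @_;
    return map { ($n >> ($width - 1 - $_)) & 1 } 0 .. $width - 1;
}

# Build input/target pairs for x -> x*x, x = 0..3
for my $x (0 .. 3) {
    my @in  = to_bits($x,      4);
    my @out = to_bits($x * $x, 4);
    print "[", join(",", @in), "],[", join(",", @out), "]\n";
}
```

Running this prints the four rows shown above, ending with [0,0,1,1],[1,0,0,1] for 3->9.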

I'm not sure how well a simple backprop net will generalise x squared. It
may learn the examples you give it but be unable to square a number it
hasn't seen before, although it might be possible to encode the data in a
way that allows a certain amount of generalisation. I'll have to have
a think about that one and give it a try.

charles.

use strict;
use AI::NNFlex::momentum;
use AI::NNFlex::Dataset;

# Create the network

my $network = AI::NNFlex::momentum->new(learningrate=>.1,
                                        bias=>1,
                                        momentum=>0.6,
                                        round=>1);

# One node per layer -- the structure described in point 1
$network->add_layer(nodes=>1,
                    activationfunction=>"tanh");

$network->add_layer(nodes=>1,
                    activationfunction=>"tanh");

$network->add_layer(nodes=>1,
                    activationfunction=>"linear");

$network->init();

my $dataset = AI::NNFlex::Dataset->new([
    [0],   [0],
    [1/3], [1/9],
    [2/3], [4/9],
    [3/3], [9/9]]);

my $counter = 0;
my $err = 10;
while ($err > 1.001)
{
    $err = $dataset->learn($network);
    print "Epoch $counter: Error = $err\n";
    $counter++;
}

foreach (@{$dataset->run($network)})
{
    foreach (@$_) { print $_ }
    print "\n";
}


Discussion Overview: group ai, categories perl, posted Mar 20, '05 at 12:43p, posts 1, users 1, website perl.org
