Sunday, September 6, 2009

Activity 16. Neural Networks

In this activity, objects were classified using a neural network. Only two classes and two features were used. Two types of bottle caps (Coke and Sprite) served as the samples for testing how well a neural network performs as a classifier. The Coke bottle caps were tagged with the value 1 while the Sprite bottle caps were given the value 0. The training set used to train the neural network is given below. The first column is the area of the object while the second column is its color level.

x=[0.929186603, 0.58557;
0.653588517, 0.147957;
0.908133971, 0.615335;
0.579904306, 0.145402;
0.655502392, 0.164236;
0.956937799, 0.630004;
0.902392344, 0.638101;
0.679425837, 0.14559;
0.650717703, 0.15673;
0.677511962, 0.140274;
0.968421053, 0.618755;
0.914832536, 0.635996;
0.742583732, 0.147524;
0.96937799, 0.62855;
0.923444976, 0.642857;
0.717703349, 0.135958;
0.800956938, 0.157814;
1, 0.613532;
0.983732057, 0.668547;
0.7215311, 0.137754;];

The first column was normalized so that the largest calculated area is equal to 1 (a sketch of this step is given right below). Using the provided source code (from Cole Fabros' work), the objects were then classified with a neural network in Scilab; the full source code follows the sketch.
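The normalization simply divides each area by the largest area in the set. A minimal Scilab sketch, assuming the unnormalized areas are stored in a vector named raw_area (a hypothetical name, not part of the original code):

// raw_area: hypothetical vector holding one pixel-area measurement per bottle cap
norm_area = raw_area / max(raw_area);   // the largest area now maps to exactly 1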

// the ANN toolbox for Scilab is assumed to be already loaded
rand('seed',0);            // fix the random seed so the run is reproducible
N=[2,2,1];                 // architecture: 2 inputs, 2 hidden neurons, 1 output

// training set: column 1 = normalized area, column 2 = color level
x=[0.929186603, 0.58557;0.653588517,0.147957;0.908133971, 0.615335;0.579904306, 0.145402;0.655502392, 0.164236;0.956937799, 0.630004;0.902392344, 0.638101;0.679425837, 0.14559;0.650717703, 0.15673;0.677511962, 0.140274;0.968421053, 0.618755;0.914832536, 0.635996;0.742583732, 0.147524;0.96937799, 0.62855;0.923444976, 0.642857;0.717703349, 0.135958;0.800956938, 0.157814;1, 0.613532;0.983732057, 0.668547;0.7215311, 0.137754;];

x1=x';                     // the toolbox expects one training pattern per column
t=[0 1 0 1 1 0 0 1 1 1 0 0 1 0 0 1 1 0 0 1];   // target class of each pattern
l=[0.1,0];                 // learning parameters: learning rate of 0.1
W=ann_FF_init(N);          // random initialization of the weights

T=1000;                    // number of training cycles

W=ann_FF_Std_online(x1,t,N,W,l,T);   // train with standard online backpropagation

A=ann_FF_run(x1,N,W);                // run the trained network on the training set

The output values of the program are shown below.

A = [0.0698714 0.9559284 0.0604605 0.9617354 0.9494504 0.0532122 0.0541484
     0.9547831 0.9529260 0.9568003 0.0555345 0.0539992 0.9485505 0.0529052
     0.0518202 0.9552809 0.9369444 0.0551047 0.0441003 0.9543265]

The values in A were then rounded off. The resulting values are the same as the target values t given in the source code above, which means that the objects were successfully classified by the neural network. This result is for N=[2,2,1]; a short check of the match is sketched after the rounded output.

round(A)=[ 0. 1. 0. 1. 1. 0. 0. 1. 1. 1. 0. 0. 1. 0. 0. 1. 1. 0. 0. 1.]
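The rounding and the comparison against the targets can be done directly in Scilab; a minimal sketch, reusing the variables A and t from the script above:

disp(round(A));                      // rounded network outputs
mismatches = sum(round(A) ~= t);     // count of caps whose predicted class differs from t
disp(mismatches);                    // 0 means every cap was classified correctly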

For N=[2,5,1] (five neurons in the hidden layer),

A = [0.0550852 0.9665592 0.0434460 0.9728303 0.9595599 0.0347047 0.0358142
     0.9653303 0.9633175 0.9674942 0.0375346 0.0356348 0.9586883 0.0343439
     0.0330257 0.9658597 0.9464930 0.0370670 0.0238807 0.9648412]

round(A)=[ 0. 1. 0. 1. 1. 0. 0. 1. 1. 1. 0. 0. 1. 0. 0. 1. 1. 0. 0. 1.]
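Only the architecture vector changes for this second run; a sketch of the lines that would be modified in the script above, with everything else kept the same:

N=[2,5,1];                 // 2 inputs, 5 hidden neurons, 1 output
W=ann_FF_init(N);          // re-initialize the weights for the new architecture
W=ann_FF_Std_online(x1,t,N,W,l,T);
A=ann_FF_run(x1,N,W);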


If the learning rate is changed to a higher value, the output A also changes. The result is given below.

A = [0.0156586 0.9922242 0.0111422 0.9941406 0.9898376 0.0081610 0.0085014
     0.9918369 0.9911437 0.9925391 0.0091049 0.0084479 0.9895962 0.0080486
     0.0076092 0.9920324 0.9849933 0.0089606 0.0049144 0.9916999]

This result was calculated with a learning rate of 0.9. Comparing the values of A for learning rates of 0.1 and 0.9, the outputs are noticeably closer to the target values when the learning rate is 0.9. For this data set and the same number of training cycles, a higher learning rate therefore brings the network outputs closer to the targets. The change needed in the script is sketched below.
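Only the learning-parameter vector changes for this last run; a sketch, with the second entry left at 0 as in the original code (re-initializing the weights before retraining is an assumption here, since the post does not say whether training was restarted from scratch):

l=[0.9,0];                 // raise the learning rate from 0.1 to 0.9
W=ann_FF_init(N);          // assumed: start again from fresh random weights
W=ann_FF_Std_online(x1,t,N,W,l,T);
A=ann_FF_run(x1,N,W);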

I will give myself 10/10 for getting accurate results.
