
When I test the two simplest and most reasonable cases, one-to-one and many-to-one, the program runs for a very long time without terminating.

When I test with the example from the book Neural Network Design, it produces the correct output, but only for that book example.

restart:
with(ExcelTools):
with(ListTools):
with(plots):
with(LinearAlgebra):
# elementwise operations on two lists of equal length
zipplus := proc(mm, pp)
 return zip((x, y) -> x + y, mm, pp);
end proc:
zipminus := proc(mm, pp)
 return zip((x, y) -> x - y, mm, pp);
end proc:
zipstar := proc(mm, pp)
 return zip((x, y) -> x * y, mm, pp);
end proc:
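
These helpers just apply the operation entry by entry, for example:

zipminus([1, 0, 1], [0, 1, 1]);  # returns [1, -1, 0]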

metara := proc(pp, meaght, bb, mapp, deep)
 local pp2, mp, mpsam, mpb, pa, ii;
 if deep > 0 then
  pp2 := metara(pp, meaght, bb, mapp, deep - 1);
  mp := zip((x, y) -> x*y, pp2, meaght);      # elementwise meaght[j]*pp[j]
  mpsam := add(mp[m], m = 1 .. nops(pp2));    # one shared weighted sum w . p
  mpb := [seq(0, i = 1 .. nops(pp2))];
  for ii from 1 to nops(bb) do
   mpb[ii] := evalf(mpsam + bb[ii]);          # outputs share mpsam, differ only in bias
  od;
  pa := [seq(0, i = 1 .. nops(pp2))];
  for ii from 1 to nops(bb) do
   pa[ii] := evalf(mapp[deep](mpb[ii]));      # transfer function of this layer
  od;
  return pa;
 else
  return pp;
 end if;
end proc:
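
Note that for a single layer (deep = 1) metara builds every output entry from the same weighted sum, so the outputs can only differ through their biases:

# a[i] = mapp[1]( add(meaght[j]*pp[j], j = 1 .. nops(pp)) + bb[i] )
# e.g. metara([2, 2], [1, 1], [0, -1], [x -> `if`(x < 0, 0, 1)], 1)
# returns [1., 1.]: the shared sum is 4, and both 4 + 0 and 4 - 1 are >= 0.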

perceptronrule1 := proc(p, t1, meaght1, b1, checksum, mapp)
 local meaght3, meaght2, b3, b2, checksum2, e, diff1, diff2, ii;
 meaght3 := meaght1;
 b3 := b1;
 checksum2 := checksum;
 # keep sweeping the training set until one full pass makes no update
 while add(checksum2[jj], jj = 1 .. nops(p)) <> nops(p) do
  for ii from 1 to nops(p) do
   e := zipminus(t1[ii], metara(p[ii], meaght3, b3, mapp, 1));  # e = t - a
   meaght2 := meaght3 + zipstar(e, p[ii]);                      # candidate w + e*p
   b2 := b3 + e;                                                # candidate b + e
   diff1 := zipminus(meaght2, meaght3);
   diff2 := zipminus(b2, b3);
   # abs() so that a +1 and a -1 change cannot cancel to a false zero
   if add(abs(diff1[m]), m = 1 .. nops(diff1)) = 0
      and add(abs(diff2[m]), m = 1 .. nops(diff2)) = 0 then
    checksum2[ii] := 1;   # this pattern caused no update
   else
    checksum2[ii] := 0;
    b3 := b2;
    meaght3 := meaght2;
   end if;
  od;
 od;
 return [meaght3, b3, checksum2];
end proc:
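
What this implements is the perceptron learning rule from the book: for each training pair (p, t), compute the output a, the error e = t - a, and update

 w_new = w_old + e*p
 b_new = b_old + e

stopping once a whole pass leaves every pattern unchanged. Note that e*p is taken elementwise here against a single weight vector; the book's multi-output form is the outer product W_new = W_old + e p^T, with one weight row per output (see the note after the three failing tests below).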

# Example from the book (this one works)
ppp := [[2, 2], [1, -2], [-2, 2], [-1, 1]]:
check1 := [seq(0, ii = 1 .. nops(ppp))]:
ttt1 := [[0, 0], [1, 1], [0, 0], [1, 1]]:
mmmeaght1 := [seq(0, ii = 1 .. nops(ppp[1]))]:
bbb1 := [seq(0, ii = 1 .. nops(ppp[1]))]:
# hard-limit and log-sigmoid transfer functions; only the first is ever used,
# since metara is always called with deep = 1
emap := [x -> `if`(x < 0, 0, 1), x -> evalf(1/(1 + exp(-x)))]:
#trace(perceptronrule1);
perceptronrule1(ppp,ttt1,mmmeaght1,bbb1,check1,emap);

# my test: one-to-one mapping
# in my testing this loops for a very long time and does not stop
# (see the note after these three tests)
ppp := [[0, 0, 0, 1], [0, 0, 1, 0], [0, 1, 0, 0], [1, 0, 0, 0]]:
check1 := [seq(0, ii = 1 .. nops(ppp))]:
ttt1 := [[0, 0, 0, 1], [0, 0, 1, 0], [0, 1, 0, 0], [1, 0, 0, 0]]:
mmmeaght1 := [0, 0, 0, 0]:
bbb1 := [0, 0, 0, 0]:
emap := [x -> `if`(x < 0, 0, 1), x -> evalf(1/(1 + exp(-x)))]:
#trace(perceptronrule1);
perceptronrule1(ppp,ttt1,mmmeaght1,bbb1,check1,emap);

# my test: many-to-one mapping
# in my testing this also loops for a very long time and does not stop
ppp := [[1, 1, 0, 0], [0, 1, 1, 0], [0, 1, 0, 1]]:
check1 := [seq(0, ii = 1 .. nops(ppp))]:
ttt1 := [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1]]:
mmmeaght1 := [0, 0, 0, 0]:
bbb1 := [0, 0, 0, 0]:
emap := [x -> `if`(x < 0, 0, 1), x -> evalf(1/(1 + exp(-x)))]:
#trace(perceptronrule1);
perceptronrule1(ppp,ttt1,mmmeaght1,bbb1,check1,emap);

# my test: one-to-many mapping
# in my testing this also loops for a very long time and does not stop
ppp := [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1]]:
check1 := [seq(0, ii = 1 .. nops(ppp))]:
ttt1 := [[1, 1, 0, 0], [0, 1, 1, 0], [0, 1, 0, 1]]:
mmmeaght1 := [0, 0, 0, 0]:
bbb1 := [0, 0, 0, 0]:
emap := [x -> `if`(x < 0, 0, 1), x -> evalf(1/(1 + exp(-x)))]:
#trace(perceptronrule1);
perceptronrule1(ppp,ttt1,mmmeaght1,bbb1,check1,emap);
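
A likely reason the three tests above never terminate: metara shares one weight vector across every output, so for a given input all outputs get the same weighted sum and can only differ through their biases. The book example happens to fit that structure because its two target entries are always equal ([0,0] or [1,1]); in my tests the targets differ across outputs for the same input, so no choice of weights and biases can ever drive the error to zero, and the while loop has no exit. The perceptron rule is only guaranteed to stop when the mapping is representable and linearly separable. Here is a minimal sketch of the standard structure with one weight row per output; forward1 and perceptronrule2 are my own names, not from the book, and maxpass is a guard so a non-separable problem returns FAIL instead of looping forever:

# a[i] = hardlim( W[i] . p + b[i] ), one weight row W[i] per output
forward1 := proc(pp, W, bb)
 local ii;
 return [seq(`if`(add(W[ii][j]*pp[j], j = 1 .. nops(pp)) + bb[ii] < 0, 0, 1),
             ii = 1 .. nops(bb))];
end proc:

perceptronrule2 := proc(p, t, W0, b0, maxpass)
 local W, b, e, ii, jj, k, changed;
 W := W0; b := b0;
 for k from 1 to maxpass do
  changed := false;
  for ii from 1 to nops(p) do
   e := zipminus(t[ii], forward1(p[ii], W, b));          # e = t - a
   if add(abs(e[m]), m = 1 .. nops(e)) <> 0 then
    W := [seq(W[jj] + e[jj]*p[ii], jj = 1 .. nops(W))];  # W + e p^T, row by row
    b := b + e;                                          # b + e
    changed := true;
   end if;
  od;
  if not changed then return [W, b]; end if;  # a clean pass: converged
 od;
 return FAIL;
end proc:

# the one-to-one test is representable this way, so this should terminate:
perceptronrule2([[0,0,0,1],[0,0,1,0],[0,1,0,0],[1,0,0,0]],
                [[0,0,0,1],[0,0,1,0],[0,1,0,0],[1,0,0,0]],
                [seq([0,0,0,0], i = 1 .. 4)], [0,0,0,0], 100);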

http://www.maplesoft.com/applications/view.aspx?SID=4229

From the book example, it seems to be assumed that the size of one piece of input data (the list or matrix size) is the same as the size of the training data set, but handling that directly would mean hard-coding an unbounded number of different sizes.

What is the right way to program a neural network when the input size is smaller than, or changes and is not equal to, the size of the training data set?
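
As far as I can tell, the weights never depend on the number of training rows, only on the dimensions of a single input and a single target, so nothing size-specific needs hard-coding. For example, in the one-to-one test each input and each target has 4 entries, so the matrix form needs 4 x 4 = 16 weights and 4 biases whether the training set has 4 rows or 4000. A sketch of sizing them from the data, assuming ppp and ttt1 as above:

n_in := nops(ppp[1]):    # entries in one input row
n_out := nops(ttt1[1]):  # entries in one target row
# one weight row per output; independent of nops(ppp), the number of rows
W0 := [seq([seq(0, j = 1 .. n_in)], i = 1 .. n_out)]:
b0 := [seq(0, i = 1 .. n_out)]: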

Is there a neural network package for Maple, either built in or available on the internet?

I found the example link above by googling, but I still find it hard to apply to my own data.

Is there a simplest possible version for two columns of data?
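
For concreteness, the smallest two-column case I can think of is one input value x and one 0/1 label t per row, trained with the same rule as above on a single neuron; the data here is made up for illustration:

# made-up rows of [x, t]: x below 5 labelled 0, x above 5 labelled 1
data := [[1, 0], [2, 0], [3, 0], [7, 1], [8, 1], [9, 1]]:
w := 0: b := 0:
for pass from 1 to 100 do
 errs := 0:
 for row in data do
  a := `if`(w*row[1] + b < 0, 0, 1):  # hard-limit output
  e := row[2] - a:                    # e = t - a
  if e <> 0 then
   w := w + e*row[1];                 # w_new = w_old + e*x
   b := b + e;                        # b_new = b_old + e
   errs := errs + 1;
  end if;
 od:
 if errs = 0 then break; end if;
od:
w, b;  # classify a new x with `if`(w*x + b < 0, 0, 1)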
