PROC DMNEURL: Approximation to PROC NEURAL
Classification Table for CUTOFF = 0.5000 (Stage 3, continued)

                                                Predicted
Activation         SSE    Accuracy   Observed       1         0
TANH        709.889256                   0       142.0    4629.0
ARCTAN      710.036390   83.708054      1       361.0     828.0
                                        0       143.0    4628.0
SQUARE      710.075198   83.724832      1       355.0     834.0
                                        0       136.0    4635.0
EXP         710.212159   83.741611      1       356.0     833.0
                                        0       136.0    4635.0
LOGIST      710.822647   83.691275      1       357.0     832.0
                                        0       140.0    4631.0
COS         718.944913   83.355705      1       340.0     849.0
                                        0       143.0    4628.0
GAUSS       719.269965   83.288591      1       328.0     861.0
                                        0       135.0    4636.0
Goodness-of-Fit Criteria (Ordered by SSE, Stage 3)

Run   Activation         SSE       RMSE    Accuracy
  6   SIN          709.77863   0.345095   83.791946
  2   TANH         709.88926   0.345122   83.758389
  3   ARCTAN       710.03639   0.345157   83.708054
  1   SQUARE       710.07520   0.345167   83.724832
  8   EXP          710.21216   0.345200   83.741611
  4   LOGIST       710.82265   0.345348   83.691275
  7   COS          718.94491   0.347316   83.355705
  5   GAUSS        719.26997   0.347394   83.288591
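As a consistency check on this table, the RMSE column equals sqrt(SSE/n): the classification counts imply n = 361 + 828 + 143 + 4628 = 5960 observations (an inference from the output above, not a figure stated in the text). A short sketch:

```python
import math

# Total observations, inferred from the classification-table counts
# (ARCTAN rows: 361 + 828 + 143 + 4628 = 5960); not stated in the text.
N_OBS = 361 + 828 + 143 + 4628

def rmse_from_sse(sse, n=N_OBS):
    # RMSE as printed in the goodness-of-fit table: sqrt(SSE / n)
    return math.sqrt(sse / n)

# Reproduce the RMSE column from the SSE column for a few rows.
for sse, rmse_reported in [(709.77863, 0.345095),
                           (709.88926, 0.345122),
                           (719.26997, 0.347394)]:
    assert abs(rmse_from_sse(sse) - rmse_reported) < 5e-6
```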
Now the residuals are computed and components are selected for the last estimation
stage:
Component Selection: SS(y) and R2 (Stage=4)

Comp        Eigval   R-Square     F Value   p-Value
  28   1195.710958   0.003997   23.916548    <.0001
  27   3456.490592   0.001822   10.919693    0.0010
  25   3935.018952   0.001803   10.824185    0.0010
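This selection step regresses the current residual on each principal-component score and keeps the components with the largest R-square, testing each with an F statistic. The following is a minimal sketch of that idea on synthetic data; the simple-regression F statistic with (1, n-2) degrees of freedom is an assumption, since the exact test used by PROC DMNEURL is not spelled out here.

```python
import random

def r2_simple(x, y):
    # R-square of regressing y on the single predictor x
    # (equals the squared sample correlation)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

def select_components(scores, residual, n_keep=3):
    # Rank component score vectors by R-square against the residual;
    # the F statistic assumes a simple-regression test with (1, n-2) df.
    n = len(residual)
    ranked = []
    for comp, x in scores.items():
        r2 = r2_simple(x, residual)
        ranked.append((r2, r2 / (1.0 - r2) * (n - 2), comp))
    ranked.sort(reverse=True)
    return ranked[:n_keep]

# Synthetic demo: component 28 carries part of the residual signal.
random.seed(1)
resid = [random.gauss(0, 1) for _ in range(500)]
scores = {c: [random.gauss(0, 1) for _ in range(500)] for c in (25, 26, 27)}
scores[28] = [r * 0.3 + random.gauss(0, 1) for r in resid]
best_r2, best_f, best_comp = select_components(scores, resid, n_keep=1)[0]
```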
There are no problems with the optimization processes:
------------ Optimization Cycle (Stage=4) --------------
------------ Activation= SQUARE (Stage=4) --------------
NOTE: ABSGCONV convergence criterion satisfied.
SQUARE: Iter=1 Crit=0.05983921: SSE=703.669268 Acc= 83.6913
------------ Activation= TANH (Stage=4) ----------------
NOTE: ABSGCONV convergence criterion satisfied.
TANH: Iter=5 Crit=0.06015823: SSE=706.476969 Acc= 83.6074
------------ Activation= ARCTAN (Stage=4) --------------
NOTE: ABSGCONV convergence criterion satisfied.
ARCTAN: Iter=3 Crit=0.06013359: SSE=706.212332 Acc= 83.7081
------------ Activation= LOGIST (Stage=4) --------------
NOTE: ABSGCONV convergence criterion satisfied.
LOGIST: Iter=3 Crit=0.06017552: SSE=706.851414 Acc= 83.7919
------------ Activation= GAUSS (Stage=4) ---------------
NOTE: ABSGCONV convergence criterion satisfied.
GAUSS: Iter=4 Crit=0.06032127: SSE=708.571854 Acc= 83.8255
------------ Activation= SIN (Stage=4) -----------------
NOTE: ABSGCONV convergence criterion satisfied.
SIN: Iter=3 Crit=0.06014411: SSE=706.402904 Acc= 83.6745
------------ Activation= COS (Stage=4) -----------------
NOTE: ABSGCONV convergence criterion satisfied.
COS: Iter=4 Crit=0.06007575: SSE=707.016805 Acc= 83.8087
------------ Activation= EXP (Stage=4) -----------------
NOTE: ABSGCONV convergence criterion satisfied.
EXP: Iter=3 Crit=0.05983526: SSE=703.933766 Acc= 83.6074
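Each cycle iterates until the largest absolute gradient element of the SSE falls below a tolerance, which is what an ABSGCONV-style criterion checks. Below is a minimal sketch of such a loop for the SQUARE activation, using plain gradient descent on a one-input model; PROC DMNEURL's actual optimizer and model are richer than this.

```python
def fit_square(xs, ys, absgconv=1e-6, lr=0.01, max_iter=10000):
    # Fit y ~ (w0 + w1*x)**2 (SQUARE activation on a linear score) by
    # gradient descent on the SSE, stopping when the largest absolute
    # gradient element drops below the tolerance (ABSGCONV-style rule).
    w0, w1 = 0.1, 0.1
    for it in range(1, max_iter + 1):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            s = w0 + w1 * x
            r = s * s - y            # residual under the SQUARE activation
            g0 += 4.0 * r * s        # d(SSE)/d(w0)
            g1 += 4.0 * r * s * x    # d(SSE)/d(w1)
        if max(abs(g0), abs(g1)) < absgconv:
            return w0, w1, it, True  # convergence criterion satisfied
        w0 -= lr * g0
        w1 -= lr * g1
    return w0, w1, max_iter, False
```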
The accuracy of the result no longer improves: it drops from 83.79 to 83.72, and the (1,1) entry of the classification table decreases from 365 to 363. This can happen only when the discretization error becomes too large relative to the goodness of fit of the nonlinear model. Specifying larger values for MAXCOMP= and NPOINT= could perhaps improve the solution. In most applications, however, we would read this behavior as a sign that no further improvement of the model fit is possible.
Classification Table for CUTOFF = 0.5000

                                                Predicted
Activation         SSE    Accuracy   Observed       1         0
SQUARE      702.899794   83.724832      1       363.0     826.0
                                        0       144.0    4627.0
EXP         703.295564   83.691275      1       361.0     828.0
                                        0       144.0    4627.0
ARCTAN      705.243085   83.775168      1       364.0     825.0
                                        0       142.0    4629.0
SIN         705.508160   83.691275      1       363.0     826.0
                                        0       146.0    4625.0
TANH        705.634506   83.708054      1       362.0     827.0
                                        0       144.0    4627.0
LOGIST      705.732595   83.708054      1       360.0     829.0
                                        0       142.0    4629.0
COS         707.292433   83.842282      1       364.0     825.0
                                        0       138.0    4633.0
GAUSS       708.659944   83.791946      1       362.0     827.0
                                        0       139.0    4632.0
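The accuracy figures follow directly from the table counts: accuracy = 100 * (n11 + n00) / n, e.g. for SQUARE 100 * (363 + 4627) / 5960 = 83.724832, with n = 5960 inferred from the row totals. A small sketch that builds such a table from predicted probabilities at a given cutoff:

```python
def classification_table(probs, observed, cutoff=0.5):
    # 2x2 classification table at a probability cutoff, plus the
    # percent-correct accuracy printed alongside it.
    table = {(1, 1): 0, (1, 0): 0, (0, 1): 0, (0, 0): 0}
    for p, y in zip(probs, observed):
        table[(y, 1 if p >= cutoff else 0)] += 1
    n = len(observed)
    accuracy = 100.0 * (table[(1, 1)] + table[(0, 0)]) / n
    return table, accuracy

# The reported accuracy is determined by the counts, e.g. SQUARE above:
assert round(100.0 * (363 + 4627) / (363 + 826 + 144 + 4627), 6) == 83.724832
```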