Network class hidden/undocumented properties

3 views (last 30 days)
Roberto on 28 Apr 2016
Commented: Walter Roberson on 29 Apr 2016
I have 2 network objects. net2 is created by calling distdelaynet, net1 by calling network; net1 is then set up to be identical to net2, and it actually is.
If I call isequal on each of their 37 visible properties individually, every comparison returns true, but isequal(net1,net2) returns false.
How is that possible? Are there any hidden/undocumented properties in the network class? If so, how can I retrieve them?
Here's some code.
Define the networks:
net2 = distdelaynet({[1,2,3],[0,1],[0,1]},[10,5]);
net1 = network(1,3,[true;true;true],[true;false;false],[false,false,false;true,false,false;false,true,false],[false,false,true]);
net1.inputs{1,1}.processFcns = {'removeconstantrows','mapminmax'};
net1.outputs{1,3}.processFcns = {'removeconstantrows','mapminmax'};
net1.layers{1,1}.name = 'Hidden 1';
net1.layers{1,1}.size = 10;
net1.layers{1,1}.transferFcn = 'tansig';
net1.layers{1,1}.initFcn = 'initnw';
net1.layers{2,1}.name = 'Hidden 2';
net1.layers{2,1}.size = 5;
net1.layers{2,1}.transferFcn = 'tansig';
net1.layers{2,1}.initFcn = 'initnw';
net1.layers{3,1}.name = 'Output';
net1.layers{3,1}.initFcn = 'initnw';
net1.inputWeights{1,1}.delays = [1,2,3];
net1.inputWeights{1,1}.learnFcn = 'learngdm';
net1.inputWeights{1,1}.learnParam = nnetParam('learngdm');
net1.inputWeights{1,1}.initFcn = '';
net1.layerWeights{1,1}.learnFcn = 'learngdm';
net1.layerWeights{1,1}.initFcn = '';
net1.layerWeights{2,1}.delays = [0,1];
net1.layerWeights{2,1}.learnFcn = 'learngdm';
net1.layerWeights{2,1}.initFcn = '';
net1.layerWeights{2,2}.learnFcn = 'learngdm';
net1.layerWeights{2,2}.initFcn = '';
net1.layerWeights{3,2}.delays = [0,1];
net1.layerWeights{3,2}.learnFcn = 'learngdm';
net1.layerWeights{3,2}.initFcn = '';
net1.layerWeights{3,3}.learnFcn = 'learngdm';
net1.layerWeights{3,3}.initFcn = '';
net1.biases{1,1}.learnFcn = 'learngdm';
net1.biases{2,1}.learnFcn = 'learngdm';
net1.biases{3,1}.learnFcn = 'learngdm';
net2.name = net1.name;
net1.adaptFcn = 'adaptwb';
net1.divideFcn = net2.divideFcn;
net1.divideParam = net2.divideParam;
net1.divideMode = net2.divideMode;
net1.plotFcns = net2.plotFcns;
net1.plotParams = net2.plotParams;
net1.trainFcn = net2.trainFcn;
net1.trainParam = net2.trainParam;
net1.LW = net2.LW;
net1.b = net2.b;
Then check every property:
isequal(net1.name,net2.name)
isequal(net1.userdata,net2.userdata)
isequal(net1.numInputs,net2.numInputs)
isequal(net1.numLayers,net2.numLayers)
isequal(net1.numOutputs,net2.numOutputs)
isequal(net1.numInputDelays,net2.numInputDelays)
isequal(net1.numLayerDelays,net2.numLayerDelays)
isequal(net1.numFeedbackDelays,net2.numFeedbackDelays)
isequal(net1.numWeightElements,net2.numWeightElements)
isequal(net1.sampleTime,net2.sampleTime)
isequal(net1.biasConnect,net2.biasConnect)
isequal(net1.inputConnect,net2.inputConnect)
isequal(net1.layerConnect,net2.layerConnect)
isequal(net1.outputConnect,net2.outputConnect)
isequal(net1.output,net2.output)
isequal(net1.inputs,net2.inputs)
isequal(net1.layers,net2.layers)
isequal(net1.outputs,net2.outputs)
isequal(net1.biases,net2.biases)
isequal(net1.inputWeights,net2.inputWeights)
isequal(net1.layerWeights,net2.layerWeights)
isequal(net1.adaptFcn,net2.adaptFcn)
isequal(net1.adaptParam,net2.adaptParam)
isequal(net1.derivFcn,net2.derivFcn)
isequal(net1.divideFcn,net2.divideFcn)
isequal(net1.divideParam,net2.divideParam)
isequal(net1.divideMode,net2.divideMode)
isequal(net1.initFcn,net2.initFcn)
isequal(net1.performFcn,net2.performFcn)
isequal(net1.performParam,net2.performParam)
isequal(net1.plotFcns,net2.plotFcns)
isequal(net1.plotParams,net2.plotParams)
isequal(net1.trainFcn,net2.trainFcn)
isequal(net1.trainParam,net2.trainParam)
isequal(net1.IW,net2.IW)
isequal(net1.LW,net2.LW)
isequal(net1.b,net2.b)
ans =
     1
(the same output, ans = 1, is returned for all 37 comparisons)
They look identical, but...
isequal(net1,net2)
ans =
0
Thank you
  6 Comments
Greg Heath on 29 Apr 2016
It looks like the problem is version-dependent.
Using 2014a:
>> net1 = fitnet; net2 = fitnet; isequal( net1, net2 )
ans = 1
>> net2 = net1; isequal( net1, net2 )
ans = 1
Hope this helps.
Greg
Roberto on 29 Apr 2016
Thank you Greg. I'm using 2014a and I get exactly your result, but the problem is "reproducing" the output of fitnet() by calling network() and then setting the net up properly. In that case I end up with 2 network objects that are not equal even though every one of their properties is identical (in the sense of isequal()).


Answers (1)

Walter Roberson on 28 Apr 2016
The following properties differ:
'inputWeights' 'divideParam' 'plotParams' 'trainParam' 'revert'
The revert property is outright different. The other four are more difficult to explain. If you use
s1 = struct(net1);
s2 = struct(net2);
then even though net1.divideParam and net2.divideParam appear identical, s1.divideParam will be of class nnetParam while s2.divideParam will be a struct with the same essential content. Likewise for plotParams and trainParam. For inputWeights, the difference is in inputWeights{1}.learnParam. I have no explanation for this difference; I poked into the code a bit, but nothing was obvious.
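A minimal sketch of how such a field-by-field comparison can be automated, reusing s1 and s2 from above (this assumes both structs expose the same field names, which they do for two network objects):
f = fieldnames(s1);
for k = 1:numel(f)
    if ~isequal(s1.(f{k}), s2.(f{k}))
        disp(f{k})   % should list the differing fields named above
    end
end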
The revert property is actually a method: see http://www.mathworks.com/help/nnet/ref/revert.html . Nonetheless, there has to be something there that causes this to be different. Ah, as well as being a method, revert is a hidden property.
The initial divideParam, plotParams, and trainParam for net1 are represented as a struct with no fields, and if you call struct(net1) after the first assignment to them, you will get a struct for those fields, just as you do afterwards with net2. This suggests there is some built-in behavior by which struct() should convert those parameters to structures, but that the behavior is being overridden by the assignment of the net2 parameters to net1.
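A short, hedged check of the class-level difference described above, reusing s1 and s2 (the comments reflect the observation above rather than guaranteed output on every release):
class(s1.divideParam)     % nnetParam, per the observation above
class(s2.divideParam)     % struct, with the same essential content
isequal(net1.divideParam, net2.divideParam)   % true, as in the question's output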
I think I am about out of time to follow this any further.
  6 Comments
Roberto on 29 Apr 2016
I'm out of options
Walter Roberson on 29 Apr 2016
Sorry, the details of class construction are something I have never investigated before.
It appears that it is built on old-style classes.
