Neural Networks warning?

3 views (last 30 days)
Mohamed Abdelsamie on 9 March 2019
Commented: Walter Roberson on 5 December 2020
Hi,
When I train any neural network I get the warning below. It still trains usable networks, but I'd like to know what the warning means.
% Warning: 'trainRation' is not a legal parameter.
% > In nntest.param>do_test (line 63)
% In nntest.param (line 6)
% In network/subsasgn>setDivideParam (line 1838)
% In network/subsasgn>network_subsasgn (line 460)
% In network/subsasgn (line 14)
% In NN_Training (line 78)
I'm using the code below to train the networks, but I don't know why trainRation is causing the warning.
net = fitnet(current_neuron_count, TRAIN_FCN);
net.divideParam.trainRation = 70/100;
net.divideParam.valRation = 15/100;
net.divideParam.testRation = 15/100;
Thanks

Accepted Answer

Walter Roberson on 9 March 2019
The correct property names are trainRatio, valRatio, and testRatio. There is no final 'n': trainRatio, not trainRation.
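For reference, a corrected sketch of the assignments from the question, reusing current_neuron_count and TRAIN_FCN from the original code:

net = fitnet(current_neuron_count, TRAIN_FCN);
net.divideParam.trainRatio = 70/100;  % trainRatio, not trainRation
net.divideParam.valRatio   = 15/100;
net.divideParam.testRatio  = 15/100;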
3 comments
Walter Roberson on 9 March 2019
The versions with 'Ration' would have had those commands ignored, leaving you with the default ratios.
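One way to confirm which ratios are actually in effect is to display the division parameters after making the assignments, for example (the layer size here is only illustrative):

net = fitnet(10);      % illustrative layer size
disp(net.divideParam)  % lists the trainRatio, valRatio, and testRatio the network will use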
Mohamed Abdelsamie on 9 March 2019
Thanks a lot Walter!


More Answers (1)

alsharif taha on 5 December 2020
When I train this network I get errors.
Please help me.
clc
clear
close all
p=[1:10 10:10:100];
t= (p.^2);
net=newff(p,t,[3], {'logsig' 'purelin'});
net.divideParam.trainRatio=1;
net.divideParam.testRatio=0;
net.divideParam.valRatio=0;
net.divideParam.lr=0.01;
net.divideParam.min_grad=1e-20;
net.divideParam.goal=1e-30;
net.divideParam.epochs=300;
net=train(net,p,t);
plot([1:100] .^2,'x')
hold on
plot(round(net(1:100)),'o')
plot(p,t, '*g')
legend('real target', 'output of net', 'training samples', 'location', 'north west')
The error messages are:
Warning: 'min_grad' is not a legal parameter.
Warning: 'min_grad' is not a legal parameter.
Warning: 'min_grad' is not a legal parameter.
Although I set epochs to 300, training continues until it reaches 1000 epochs.
I do not know why. Please help me.
1 comment
Walter Roberson on 5 December 2020
min_grad is for https://www.mathworks.com/help/deeplearning/ref/traingdx.html, not for divideParam.
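In other words, lr, min_grad, goal, and epochs belong to net.trainParam, not net.divideParam. A sketch of the corrected assignments (lr only applies to training functions that use a learning rate, such as traingdx):

net.trainParam.epochs   = 300;    % stop after 300 epochs instead of the default 1000
net.trainParam.goal     = 1e-30;  % performance goal
net.trainParam.min_grad = 1e-20;  % minimum gradient
net.trainParam.lr       = 0.01;   % learning rate (used by traingd/traingdx-type functions)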


Release: R2018b
