Error using +. Matrix dimensions must agree

Hi, I'm trying to create a neural network to train on my data, but I get an error. I thought it was right, but I don't know what to fix. Here is my code:
%clc;
close all;
clearvars;
% Multilayer Perceptron (MLP) neural network.
N = 220; % number of samples
D = 25;  % input dimension
K = 11;  % number of target classes
h = 500; % number of hidden neurons
load datatrain1.mat
Train1 = cell2mat(Train);
load classtrain1.mat
CTrain1 = cell2mat(TTrain);
b1 = randn(1,h); % bias from input to hidden
W1 = randn(D,h); % weights from input to hidden
b2 = randn(1,K); % bias from hidden to output
W2 = randn(h,K); % weights from hidden to output
% feedforward
for epoch = 1 : 1000
    H = Train1*W1 + b1;
    Y = H*W2 + b2;
end
I get the error on the line that computes H: "Error using +. Matrix dimensions must agree."
Any help fixing my code would be much appreciated. Thanks.

2 Comments

madhan ravi on 11 Nov 2018
upload .mat files
Oman Wisni on 11 Nov 2018
Yes sir, I already attached my .mat files. But I'm not sure they can be opened in MATLAB, because when I uploaded and downloaded them before, the downloaded file had a different format. And when I uploaded, I got no reply.

Sign in to comment.

Accepted Answer

Guillaume on 11 Nov 2018
Edited: Guillaume on 11 Nov 2018

1 vote

The error message is very clear: you're trying to add two matrices of different sizes, which is never possible. Since b1 is a 1xh vector, Train1*W1 must also be a 1xh vector, and since W1 is a Dxh matrix, Train1 must be a 1xD vector for the product to be 1xh.
Therefore, your Train1 is not 1xD. There is nothing we can do about that: either use a Train1 matrix that is 1xD, or change your D to reflect the actual size of Train1.
Note that it is never a good idea to hardcode the sizes of matrices. It is always safer to ask MATLAB for the actual size:
D = size(Train1, 2);
would ensure it always matches Train1. Of course, Train1 must then only have one row.
Edit: I got confused between D and h, but see my comment below.
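A minimal sketch of that advice, using randn stand-ins for the real data loaded from datatrain1.mat (the sizes here are assumptions taken from the question):

```matlab
% Hypothetical sketch: derive layer sizes from the data instead of hardcoding them.
Train1 = randn(220, 25);   % stand-in for the real matrix from datatrain1.mat
[N, D] = size(Train1);     % N samples, D features -- always matches the data
h = 500;                   % number of hidden neurons (a design choice)
W1 = randn(D, h);          % weights sized from the measured D
b1 = randn(1, h);
% The product is guaranteed to be N-by-h, so it lines up with b1's h columns:
assert(isequal(size(Train1*W1), [N, h]))
```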

10 Comments

Oman Wisni on 11 Nov 2018
Edited: Oman Wisni on 11 Nov 2018
Yes sir, I can't change h or Train1: those values come from the feature extraction, and h is the number of neurons (500) in the single hidden layer. b is the bias. That is the formula I got from the theory.
I'm confused too. Did I declare the biases and weights wrongly? But I got the example from this forum.
Stephen23 on 11 Nov 2018
"But the example I get from this forum"
Please give a link to the thread where you got this from.
Guillaume on 11 Nov 2018
Edited: madhan ravi on 11 Nov 2018
Actually, my reasoning above was a bit wrong, at least for versions >= R2016b. Since implicit expansion was introduced in R2016b, the sum will work as long as the product Train1*W1 has h columns; the number of rows doesn't matter. Since the product is guaranteed to either have h columns or fail, the only reason the sum would fail for you is that you're using a version earlier than R2016b.
In versions prior to R2016b you had to use bsxfun for explicit expansion, so:
H = bsxfun(@sum, Train1*W1, b1);
Most likely, you'll have to do the same for the next line.
Oman Wisni on 11 Nov 2018
Yes, I'm using R2015a.
What does bsxfun mean? And if I run it, will the result be the same?
Guillaume on 11 Nov 2018
See the documentation of bsxfun. In this particular case, it means that the input b1 will be replicated to match the height of Train1*W1. Yes, the result will be the same as what you would have got in later versions with implicit expansion.
Oman Wisni on 11 Nov 2018
OK sir, I will try it. Thank you.
Sir, which code is right?
H = bsxfun(@sum, Train1*W1, b1);
or
H = bsxfun(@plus, Train1*W1, b1);
Using @sum I get an error, but when I use @plus I get a result. My question is: if I use @plus, is the result still correct? Thanks.
Guillaume on 12 Nov 2018
Sorry, it should have been @plus indeed.
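Putting the correction together, a minimal sketch of the feedforward pass for pre-R2016b MATLAB might look like this (random stand-ins are used for the real Train1 data; the sizes are assumptions from the question):

```matlab
% Hypothetical sketch with random stand-in data; replace Train1 with the real matrix.
N = 220; D = 25; K = 11; h = 500;
Train1 = randn(N, D);
W1 = randn(D, h); b1 = randn(1, h);
W2 = randn(h, K); b2 = randn(1, K);
H = bsxfun(@plus, Train1*W1, b1);  % @plus adds b1 to every row of the N-by-h product
Y = bsxfun(@plus, H*W2, b2);       % same explicit expansion for the output layer
% In R2016b and later this is simply: H = Train1*W1 + b1; Y = H*W2 + b2;
```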
Oman Wisni on 12 Nov 2018
Yes sir. Thanks.

Sign in to comment.

More Answers (0)

