Neural Network training - number of observations in X and Y disagree

Views: 2 (last 30 days)
Tajwar Choudhury on 28 Mar 2020
Commented: Tajwar Choudhury on 31 Mar 2020
I have created a database of combinations of sine waves with random periods and amplitudes, and added random noise to each one, giving a set of "clean" data and "noisy" data. I hope to train a simple feedforward net to denoise the noisy signals. My training data (the noisy signals) is a 301x10000 double array, where each column corresponds to a single wave: 301 is the length in time and 10000 is the number of random signals. The clean signals are in exactly the same format, a 301x10000 double array, and the two arrays are aligned column by column, i.e. the first noisy signal in the training dataset is a noisy version of the first clean signal.
My network structure is simple: an image input layer, 3 fully connected layers with tanh activations, and a regression output. I know I need to reshape the data using the reshape function, but I'm unsure how. What I don't get is why it says the number of observations in the training data and target data disagree when both arrays have the same dimensions.
Essentially:
trainingData is a 301x10000 double array of noisy sine waves
trainingTargets is a 301x10000 double array of the same sine waves but without the noise
The image input layer has dimensions of [1 301]
When I feed these into the net using the trainNetwork function, I get "number of observations in X and Y disagree".
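A minimal sketch of a data setup like the one described might look like this; the number of sine components, the amplitude and period ranges, and the noise level are assumptions, not the original generator:
% Sketch of the dataset described above (parameter ranges and noise level are assumed).
nSamples = 301;                          % time steps per wave
nSignals = 10000;                        % number of random waves
t = linspace(0, 10, nSamples)';          % assumed time axis

trainingTargets = zeros(nSamples, nSignals);   % clean waves, one per column
trainingData    = zeros(nSamples, nSignals);   % noisy waves, one per column

for k = 1:nSignals
    clean = zeros(nSamples, 1);
    for c = 1:3                                  % assumed number of sine components
        amp    = 0.5 + rand;                     % assumed amplitude range
        period = 1 + 4*rand;                     % assumed period range
        clean  = clean + amp*sin(2*pi*t/period);
    end
    trainingTargets(:,k) = clean;
    trainingData(:,k)    = clean + 0.1*randn(nSamples, 1);   % assumed noise level
end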
1 Comment
Adam Danz on 28 Mar 2020
Could you put together a minimal working example that reproduces the problem?


Answers (1)

Srivardhan Gadila on 31 Mar 2020
Refer to Train Convolutional Neural Network for Regression and check the sizes of XTrain & YTrain to reshape your data accordingly.
The following code might help you:
% Example network: each 301-sample signal is treated as a 301x1x1 "image"
layers = [imageInputLayer([301 1 1])
    fullyConnectedLayer(500)
    fullyConnectedLayer(301)
    regressionLayer];
trainData = randn([301 1 1 1000]);    % 1000 observations, each 301x1x1
trainLabels = randn([1000 301]);      % 1000 observations, 301 responses each
options = trainingOptions('adam', ...
    'InitialLearnRate',0.005, ...
    'LearnRateSchedule','piecewise', ...
    'MaxEpochs',300, ...
    'MiniBatchSize',1024, ...
    'Verbose',1, ...
    'Plots','training-progress');
net = trainNetwork(trainData,trainLabels,layers,options);
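Applied to the arrays from the question, the mapping onto this format would look roughly like the following sketch (assuming trainingData and trainingTargets are the 301x10000 arrays described above):
% Sketch: map the question's 301x10000 arrays onto the formats used above.
XTrain = reshape(trainingData, [301 1 1 size(trainingData,2)]);   % 301x1x1x10000 image-style input
YTrain = trainingTargets';                                        % 10000x301 matrix of responses
net = trainNetwork(XTrain, YTrain, layers, options);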
1 Comment
Tajwar Choudhury on 31 Mar 2020
Thank you for the response. What I ended up doing, which worked, is the following:
trainingTargets2 = reshape(trainingTargets, [1 1 size(trainingTargets,1) size(trainingTargets,2)]);
trainingData2 = reshape(trainingData, [size(trainingData,1) 1 1 size(trainingData,2)]);
validationTargets2 = reshape(validationTargets, [1 1 size(validationTargets,1) size(validationTargets,2)]);
validationData2 = reshape(validationData, [size(validationData,1) 1 1 size(validationData,2)]);
I then used those as X, Y and the X, Y validation data, where trainingTargets are the generated clean waves, trainingData are the same clean waves with added noise, and the validation sets follow the same convention.
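For completeness, a sketch of how these reshaped arrays might be passed to training; the option values below are placeholders (and layers is assumed to be a layer array like the one in the answer), not the poster's exact settings:
% Sketch: train with the reshaped arrays, passing the validation pair
% through trainingOptions (placeholder option values).
options = trainingOptions('adam', ...
    'MaxEpochs',100, ...
    'ValidationData',{validationData2, validationTargets2}, ...
    'Plots','training-progress');
net = trainNetwork(trainingData2, trainingTargets2, layers, options);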
