How to test data in the Classification Learner app after training?

Views: 25 (last 30 days)
Nilima Gautam on 16 Jan 2020
Commented: NN on 7 Dec 2020
Hello experts,
I am working with a GSR sensor. I trained the data using the Classification Learner app and a neural network, but I am unable to test my data. Please tell me the process for testing a dataset. For classification I use 4-5 classes with KNN and SVM; I used labeled data for training and unlabeled data for testing. Please help me.
2 Comments
Mohammad Sami on 23 Jan 2020
You can use cross validation if your dataset is small or holdout validation if your dataset is large. Once you are happy, you can export the final model using the export functionality.
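For reference, a minimal sketch (not part of the original comment) of what these two validation schemes look like programmatically with cvpartition, assuming a table named dataset whose class labels are in a column named label; the size threshold is only illustrative:
% Choose a validation scheme based on dataset size (threshold is arbitrary)
if height(dataset) < 500                          % small dataset: k-fold cross-validation
    cvp = cvpartition(dataset.label, 'KFold', 5);
else                                              % large dataset: holdout validation
    cvp = cvpartition(dataset.label, 'HoldOut', 0.2);
end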


Accepted Answer

Kevin Chng on 23 Jan 2020
Hi Nilima Gautam,
In the Classification Learner app, the app splits your data by holdout or k-fold validation, depending on your selection, into a training set and a validation set. When you train a model in the app, it fits the model on the training set and then tests it on the validation set, which is the accuracy it reports to you. In other words, the accuracy you see in the app is the accuracy of your model on the validation set. When you open the confusion chart in the app, you will notice that the number of observations is smaller than in the original dataset, because it is only the validation portion split out from your data.
However, after you export your model to the workspace as trainedModel (a variable), you can use it to predict on any sample:
signalTemp2 = trainedModel.predictFcn(outSample);
Alternatively, before you load your dataset into the Classification Learner app, you can split part of it off first; some people call this split the testing dataset. The relevant function is cvpartition. The workflow would be:
% dataset is a table whose class labels are in a column named "label"
c = cvpartition(dataset.label,'HoldOut',0.1); % hold out 10% of the data for testing
triidx = training(c);                         % logical index of the training/validation rows
testingdata = dataset(~triidx,:);
training_validation_data = dataset(triidx,:);
% Use the Classification Learner app:
%   select training_validation_data to train and validate, then export the model
% Classify the testing dataset using the exported trainedModel
signalTemp2 = trainedModel.predictFcn(testingdata);
% Perform the evaluation yourself, for example: loss, accuracy, confusion matrix...
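As a concrete sketch of that last evaluation step (assuming the labels in testingdata are numeric or categorical and stored in a column named label, matching the cvpartition call above):
trueLabels = testingdata.label;                 % ground-truth labels of the held-out rows
testAccuracy = mean(signalTemp2 == trueLabels)  % fraction of correct predictions on the test set
confusionchart(trueLabels, signalTemp2)         % confusion matrix on the test set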
4 Comments
Warid Islam on 20 Jun 2020
Hi @Kevin,
I am having a similar problem. Please find my code below:
function [trainedClassifier, validationAccuracy] = trainClassifier(trainingData)
% [trainedClassifier, validationAccuracy] = trainClassifier(trainingData)
% returns a trained classifier and its accuracy. This code recreates the
% classification model trained in Classification Learner app. Use the
% generated code to automate training the same model with new data, or to
% learn how to programmatically train models.
%
% Input:
% trainingData: a table containing the same predictor and response
% columns as imported into the app.
%
% Output:
% trainedClassifier: a struct containing the trained classifier. The
% struct contains various fields with information about the trained
% classifier.
%
% trainedClassifier.predictFcn: a function to make predictions on new
% data.
%
% validationAccuracy: a double containing the accuracy in percent. In
% the app, the History list displays this overall accuracy score for
% each model.
%
% Use the code to train the model with new data. To retrain your
% classifier, call the function from the command line with your original
% data or new data as the input argument trainingData.
%
% For example, to retrain a classifier trained with the original data set
% T, enter:
% [trainedClassifier, validationAccuracy] = trainClassifier(T)
%
% To make predictions with the returned 'trainedClassifier' on new data T2,
% use
% yfit = trainedClassifier.predictFcn(T2)
%
% T2 must be a table containing at least the same predictor columns as used
% during training. For details, enter:
% trainedClassifier.HowToPredict
% Auto-generated by MATLAB on 17-Jun-2020 20:44:23
% Extract predictors and response
% This code processes the data into the right shape for training the
% model.
inputTable = trainingData;
predictorNames = {'autoc', 'contr', 'corrm', 'corrp', 'cprom', 'cshad', 'dissi', 'energ', 'entro', 'homom', 'homop', 'maxpr', 'sosvh', 'savgh', 'svarh', 'senth', 'dvarh', 'denth', 'inf1h', 'inf2h', 'indnc', 'idmnc'};
predictors = inputTable(:, predictorNames);
response = inputTable.ClassLabel;
isCategoricalPredictor = [false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false];
% Apply a PCA to the predictor matrix.
% Run PCA on numeric predictors only. Categorical predictors are passed through PCA untouched.
isCategoricalPredictorBeforePCA = isCategoricalPredictor;
numericPredictors = predictors(:, ~isCategoricalPredictor);
numericPredictors = table2array(varfun(@double, numericPredictors));
% 'inf' values have to be treated as missing data for PCA.
numericPredictors(isinf(numericPredictors)) = NaN;
numComponentsToKeep = min(size(numericPredictors,2), 2);
[pcaCoefficients, pcaScores, ~, ~, explained, pcaCenters] = pca(...
    numericPredictors, ...
    'NumComponents', numComponentsToKeep);
predictors = [array2table(pcaScores(:,:)), predictors(:, isCategoricalPredictor)];
isCategoricalPredictor = [false(1,numComponentsToKeep), true(1,sum(isCategoricalPredictor))];
% Train a classifier
% This code specifies all the classifier options and trains the classifier.
classificationSVM = fitcsvm(...
    predictors, ...
    response, ...
    'KernelFunction', 'linear', ...
    'PolynomialOrder', [], ...
    'KernelScale', 'auto', ...
    'BoxConstraint', 1, ...
    'Standardize', true, ...
    'ClassNames', [1; 2]);
% Create the result struct with predict function
predictorExtractionFcn = @(t) t(:, predictorNames);
pcaTransformationFcn = @(x) [ array2table((table2array(varfun(@double, x(:, ~isCategoricalPredictorBeforePCA))) - pcaCenters) * pcaCoefficients), x(:,isCategoricalPredictorBeforePCA) ];
svmPredictFcn = @(x) predict(classificationSVM, x);
trainedClassifier.predictFcn = @(x) svmPredictFcn(pcaTransformationFcn(predictorExtractionFcn(x)));
% Add additional fields to the result struct
trainedClassifier.RequiredVariables = {'autoc', 'contr', 'corrm', 'corrp', 'cprom', 'cshad', 'denth', 'dissi', 'dvarh', 'energ', 'entro', 'homom', 'homop', 'idmnc', 'indnc', 'inf1h', 'inf2h', 'maxpr', 'savgh', 'senth', 'sosvh', 'svarh'};
trainedClassifier.PCACenters = pcaCenters;
trainedClassifier.PCACoefficients = pcaCoefficients;
trainedClassifier.ClassificationSVM = classificationSVM;
trainedClassifier.About = 'This struct is a trained model exported from Classification Learner R2019a.';
trainedClassifier.HowToPredict = sprintf('To make predictions on a new table, T, use: \n yfit = c.predictFcn(T) \nreplacing ''c'' with the name of the variable that is this struct, e.g. ''trainedModel''. \n \nThe table, T, must contain the variables returned by: \n c.RequiredVariables \nVariable formats (e.g. matrix/vector, datatype) must match the original training data. \nAdditional variables are ignored. \n \nFor more information, see <a href="matlab:helpview(fullfile(docroot, ''stats'', ''stats.map''), ''appclassification_exportmodeltoworkspace'')">How to predict using an exported model</a>.');
% Extract predictors and response
% This code processes the data into the right shape for training the
% model.
inputTable = trainingData;
predictorNames = {'autoc', 'contr', 'corrm', 'corrp', 'cprom', 'cshad', 'dissi', 'energ', 'entro', 'homom', 'homop', 'maxpr', 'sosvh', 'savgh', 'svarh', 'senth', 'dvarh', 'denth', 'inf1h', 'inf2h', 'indnc', 'idmnc'};
predictors = inputTable(:, predictorNames);
response = inputTable.ClassLabel;
isCategoricalPredictor = [false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false, false];
% Perform cross-validation
KFolds = 5;
cvp = cvpartition(response, 'KFold', KFolds);
% Initialize the predictions to the proper sizes
validationPredictions = response;
numObservations = size(predictors, 1);
numClasses = 2;
validationScores = NaN(numObservations, numClasses);
for fold = 1:KFolds
    trainingPredictors = predictors(cvp.training(fold), :);
    trainingResponse = response(cvp.training(fold), :);
    foldIsCategoricalPredictor = isCategoricalPredictor;
    % Apply a PCA to the predictor matrix.
    % Run PCA on numeric predictors only. Categorical predictors are passed through PCA untouched.
    isCategoricalPredictorBeforePCA = foldIsCategoricalPredictor;
    numericPredictors = trainingPredictors(:, ~foldIsCategoricalPredictor);
    numericPredictors = table2array(varfun(@double, numericPredictors));
    % 'inf' values have to be treated as missing data for PCA.
    numericPredictors(isinf(numericPredictors)) = NaN;
    numComponentsToKeep = min(size(numericPredictors,2), 2);
    [pcaCoefficients, pcaScores, ~, ~, explained, pcaCenters] = pca(...
        numericPredictors, ...
        'NumComponents', numComponentsToKeep);
    trainingPredictors = [array2table(pcaScores(:,:)), trainingPredictors(:, foldIsCategoricalPredictor)];
    foldIsCategoricalPredictor = [false(1,numComponentsToKeep), true(1,sum(foldIsCategoricalPredictor))];
    % Train a classifier
    % This code specifies all the classifier options and trains the classifier.
    classificationSVM = fitcsvm(...
        trainingPredictors, ...
        trainingResponse, ...
        'KernelFunction', 'linear', ...
        'PolynomialOrder', [], ...
        'KernelScale', 'auto', ...
        'BoxConstraint', 1, ...
        'Standardize', true, ...
        'ClassNames', [1; 2]);
    % Create the result struct with predict function
    pcaTransformationFcn = @(x) [ array2table((table2array(varfun(@double, x(:, ~isCategoricalPredictorBeforePCA))) - pcaCenters) * pcaCoefficients), x(:,isCategoricalPredictorBeforePCA) ];
    svmPredictFcn = @(x) predict(classificationSVM, x);
    validationPredictFcn = @(x) svmPredictFcn(pcaTransformationFcn(x));
    % Add additional fields to the result struct
    % Compute validation predictions
    validationPredictors = predictors(cvp.test(fold), :);
    [foldPredictions, foldScores] = validationPredictFcn(validationPredictors);
    % Store predictions in the original order
    validationPredictions(cvp.test(fold), :) = foldPredictions;
    validationScores(cvp.test(fold), :) = foldScores;
end
% Compute validation accuracy
correctPredictions = (validationPredictions == response);
isMissing = isnan(response);
correctPredictions = correctPredictions(~isMissing);
validationAccuracy = sum(correctPredictions)/length(correctPredictions);
However, I get the following error message:
Unable to use a value of type 'cell' as an index.
Error in mlearnapp.internal.model.DatasetSpecification>@(t)t(:,predictorNames) (line 156)
extractPredictorsFromTableFcn = @(t) t(:,predictorNames);
Error in mlearnapp.internal.model.DatasetSpecification>@(x)extractPredictorsFromTableFcn(x) (line 161)
predictorExtractionFcn = @(x) extractPredictorsFromTableFcn(x);
Error in mlearnapp.internal.model.DatasetSpecification>@(x)exportableModel.predictFcn(predictorExtractionFcn(x)) (line 165)
newExportableModel.predictFcn = @(x) exportableModel.predictFcn(predictorExtractionFcn(x));
Any help would be appreciated. Thank you.
NN on 7 Dec 2020
Following this query. I got the same error too.


More Answers (0)
