not enough input arguments
I am trying to run the function, but it gives me a "not enough input arguments" error at the line d{i} = pdist2(F{i}.', testSample, 'mahalanobis');
function [Ps, winnerClass] = classifyKNN_D_Multi(F, testSample, k, NORMALIZE, useL1distance )
% function [Ps, winnerClass] = classifyKNN_D_Multi(F, testSample, k, NORMALIZE, useL1distance);
%
% This function is used for classifying an unknown sample using the kNN
% algorithm, in its multi-class form.
%
% ARGUMENTS:
% - F: a CELL array that contains the feature values for each class. I.e.,
% F{1} is a matrix of size numOfDimensions x numofSamples FOR THE FIRST
% CLASS, etc.
%
% - testSample: the input sample to be classified
% - k: the kNN parameter
% - NORMALIZE: use class priors to weight results
% - useL1distance: use L1 instead of L2 distance
%
% RETURNS:
% - Ps: an array that contains the classification probabilities for each class
% - winnerClass: the label of the winner class
%%error(nargchk(4,5,nargin))
switch nargin
case 4
useL1distance = '1'; % euclidean distance if 4 variables included
case 5
useL1distance = '0'; % mahalanobis distance if 5 variables included
otherwise
disp('error')
end
numOfClasses = length(F);
if (size(testSample, 2)==1)
testSample = testSample';
end
% initialization of distance vectors:
numOfDims = zeros( 1, numOfClasses );
numOfTrainSamples = zeros( 1, numOfClasses );
d = cell(numOfClasses,1);
% d{i} is a vector, whose elements represent the distance of the testing
% sample from all the samples of i-th class
testSample(isnan(testSample)) = 0.0;
for i=1:numOfClasses
[ numOfDims(i), numOfTrainSamples(i) ] = size( F{i} );
d{i} = inf*ones(max(numOfTrainSamples), 1); % we fill it with inf values
F{i}(isnan(F{i})) = 0.0;
end
if (length(testSample)>1)
for i=1:numOfClasses % for each class:
if (numOfTrainSamples(i)>0)
if ( useL1distance == 1 )
% d{i} = sum( abs(repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}'),2); % L1
d{i} = pdist2(F{i}.', testSample, 'euclidean');
else
%[size(repmat(testSample, [numOfTrainSamples(i) 1])) size(F{i}')]
%sum(sum(isnan(F{i}')))
% d{i} = sum( ((repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}').^2 ),2); % L2
d{i} = pdist2(F{i}.', testSample, 'mahalanobis'); %56
end
d{i} = sort(d{i});
d{i}(end+1:max(numOfTrainSamples)) = inf;
else
d{i} = inf;
end
end
else % single dimension (NO SUM required!!!)
for i=1:numOfClasses
if (numOfTrainSamples(i)>0)
d{i} = (abs(repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}')');
d{i} = sort(d{i});
d{i}(end+1:max(numOfTrainSamples)) = inf;
else
d{i} = inf;
end
end
end
kAll = zeros(numOfClasses, 1);
for j=1:k
curArray = zeros(numOfClasses, 1);
for i=1:numOfClasses
curArray(i) = d{i}(kAll(i)+1);
end
[MIN, IMIN] = min(curArray);
kAll(IMIN) = kAll(IMIN) + 1;
end
if ( NORMALIZE == 0 )
Ps = (kAll ./ k);
else
Ps = kAll ./ numOfTrainSamples';
Ps = Ps / sum(Ps);
end
[MAX, IMAX] = max(Ps);
winnerClass = IMAX;
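As a side note, the voting loop near the end of the function can be illustrated in isolation. The following standalone sketch uses made-up distances for two classes: it walks the sorted per-class distance lists and counts how many of the k overall-nearest training samples belong to each class.

```matlab
% Standalone illustration (made-up distances): walk the sorted per-class
% distance lists and count how many of the k overall-nearest training
% samples fall in each class.
d = {[0.2; 0.9; 1.5], [0.4; 0.5; 2.0]};   % sorted distances, 2 classes
k = 3;
kAll = zeros(2, 1);                        % votes per class
for j = 1:k
    curArray = [d{1}(kAll(1)+1); d{2}(kAll(2)+1)];
    [~, IMIN] = min(curArray);             % class owning the next-nearest sample
    kAll(IMIN) = kAll(IMIN) + 1;
end
disp(kAll.')                               % 1 2: 0.2 from class 1; 0.4 and 0.5 from class 2
```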
>> classifyKNN_D_Multi
Not enough input arguments.

Error in classifyKNN_D_Multi (line 30)
numOfClasses = length(F);
This is the function I am classifying with. The variable Features holds 2 classes with the same number of columns. Can anyone tell me where the problem is?
function [label, P, classNames] = ...
fileClassification(wavFileName, kNN, modelFileName)
%
% function [label, P, classNames] = ...
% fileClassification(wavFileName, kNN, modelFileName)
%
% This function demonstrates the classification of an audio segment,
% stored in a wav file.
%
% ARGUMENTS:
% - wavFileName: the path of the wav file to be classified
% - kNN: the k parameter of the kNN algorithm
% - modelFileName: the path of the kNN classification model
%
% RETURNS:
% - label: the label of the winner class
% - P: a vector that contains all estimated probabilities
% for each audio class contained in the model
% - classNames: a cell array that contains the names of the
% audio classes of the classification model
%
% NOTE: This function classifies the WHOLE audio file, i.e., we
% assume that the file contains a homogeneous audio segment.
% For mid-term classification, please use mtFileClassification().
%
% load classification model:
[Features, classNames, MEAN, STD, Statistics, ...
stWin, stStep, mtWin, mtStep] = kNN_model_load(modelFileName);
[x, fs] = audioread(wavFileName); % read wav file
% short-term feature extraction:
stF = stFeatureExtraction(x, fs, stWin, stStep);
mtWinRatio = mtWin / stWin; mtStepRatio = mtStep / stStep;
% mid-term feature statistic calculation:
[mtFeatures] = mtFeatureExtraction(...
stF, mtWinRatio, mtStepRatio, Statistics);
% long term averaging of the mid-term statistics:
mtFeatures = mean(mtFeatures,2);
% kNN classification
[P, label] = classifyKNN_D_Multi(Features, ...
(mtFeatures - MEAN') ./ STD', kNN,1);
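The last call feeds z-scored features into the classifier: each feature is shifted by the model's stored mean and divided by its standard deviation. In isolation, with made-up numbers:

```matlab
% Made-up numbers: z-score normalisation as applied before classification,
% (x - mean) ./ std per feature, using statistics stored with the model.
MEAN = [1 2 3];              % per-feature means (row, as loaded from the model)
STD  = [2 2 2];              % per-feature standard deviations
mtFeatures = [3; 4; 5];      % averaged mid-term feature column
z = (mtFeatures - MEAN') ./ STD'   % [1; 1; 1]
```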
Answers (2)
Walter Roberson
19 November 2020
>> classifyKNN_D_Multi
You are trying to run a function that requires either 4 or 5 inputs, with no inputs at all.
case 4
useL1distance = '1'; % euclidean distance if 4 variables included
case 5
useL1distance = '0'; % mahalanobis distance if 5 variables included
The code sets useL1distance to a character regardless of what was passed in as the fifth parameter. The mere fact that you passed five parameters is enough to assign the character '0' to useL1distance.
if ( useL1distance == 1 )
It is never going to equal the number 1, because it was assigned the character '0' or the character '1'.
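A quick standalone demonstration of the point (not from the thread): comparing a char to a number compares the character's code point, not the digit it prints as.

```matlab
% Comparing a char to a number compares the character's code point (49 for
% '1'), not its digit value, so '1' == 1 is false.
c = '1';
disp(c == 1)         % 0 (false): the code of '1' is 49, not 1
disp(double(c))      % 49
disp(c == '1')       % 1 (true): char-to-char comparison
```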
Comments: 4
Kamil Kacer
19 November 2020
Edited: Kamil Kacer, 19 November 2020
Walter Roberson
20 November 2020
function [Ps, winnerClass] = classifyKNN_D_Multi(F, testSample, k, NORMALIZE, distancetype )
% function [Ps, winnerClass] = classifyKNN_D_Multi(F, testSample, k, NORMALIZE, distancetype);
%
% This function is used for classifying an unknown sample using the kNN
% algorithm, in its multi-class form.
%
% ARGUMENTS:
% - F: a CELL array that contains the feature values for each class. I.e.,
% F{1} is a matrix of size numOfDimensions x numofSamples FOR THE FIRST
% CLASS, etc.
%
% - testSample: the input sample to be classified
% - k: the kNN parameter
% - NORMALIZE: use class priors to weight results
% - distancetype: 1 = Euclidean, 2 = cityblock (L1), 3 = Mahalanobis
%
% RETURNS:
% - Ps: an array that contains the classification probabilities for each class
% - winnerClass: the label of the winner class
%%error(nargchk(4,5,nargin))
switch nargin
case 4
distancetype = 1; % euclidean distance if 4 variables included
case 5
if ~isscalar(distancetype) || ~isnumeric(distancetype) || ~ismember(distancetype, 1:3);
error('invalid distance type');
end
otherwise
disp('error')
end
numOfClasses = length(F);
if (size(testSample, 2)==1)
testSample = testSample';
end
% initialization of distance vectors:
numOfDims = zeros( 1, numOfClasses );
numOfTrainSamples = zeros( 1, numOfClasses );
d = cell(numOfClasses,1);
% d{i} is a vector, whose elements represent the distance of the testing
% sample from all the samples of i-th class
testSample(isnan(testSample)) = 0.0;
for i=1:numOfClasses
[ numOfDims(i), numOfTrainSamples(i) ] = size( F{i} );
d{i} = inf*ones(max(numOfTrainSamples), 1); % we fill it with inf values
F{i}(isnan(F{i})) = 0.0;
end
if (length(testSample)>1)
for i=1:numOfClasses % for each class:
if (numOfTrainSamples(i)>0)
if ( distancetype == 1 )
% d{i} = sum( abs(repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}'),2); % L1
d{i} = pdist2(F{i}.', testSample, 'euclidean');
elseif ( distancetype == 2)
d{i} = pdist2(F{i}.', testSample, 'cityblock'); % L1
else
d{i} = pdist2(F{i}.', testSample, 'mahalanobis'); %56
end
d{i} = sort(d{i});
d{i}(end+1:max(numOfTrainSamples)) = inf;
else
d{i} = inf;
end
end
else % single dimension (NO SUM required!!!)
for i=1:numOfClasses
if (numOfTrainSamples(i)>0)
d{i} = (abs(repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}')');
d{i} = sort(d{i});
d{i}(end+1:max(numOfTrainSamples)) = inf;
else
d{i} = inf;
end
end
end
kAll = zeros(numOfClasses, 1);
for j=1:k
curArray = zeros(numOfClasses, 1);
for i=1:numOfClasses
curArray(i) = d{i}(kAll(i)+1);
end
[MIN, IMIN] = min(curArray);
kAll(IMIN) = kAll(IMIN) + 1;
end
if ( NORMALIZE == 0 )
Ps = (kAll ./ k);
else
Ps = kAll ./ numOfTrainSamples';
Ps = Ps / sum(Ps);
end
[MAX, IMAX] = max(Ps);
winnerClass = IMAX;
Now, to use Euclidean distance use
[P, label] = classifyKNN_D_Multi(Features, ...
(mtFeatures - MEAN') ./ STD', kNN,1);
or
[P, label] = classifyKNN_D_Multi(Features, ...
(mtFeatures - MEAN') ./ STD', kNN,1,1);
This is a change from before. Now the final 1 is for distance type #1, Euclidean.
To use L1 distance use distance type #2, L1
[P, label] = classifyKNN_D_Multi(Features, ...
(mtFeatures - MEAN') ./ STD', kNN, 1, 2);
To use Mahalanobis, use distance type #3
[P, label] = classifyKNN_D_Multi(Features, ...
(mtFeatures - MEAN') ./ STD', kNN, 1, 3);
Kamil Kacer
20 November 2020
Walter Roberson
20 November 2020
'cityblock' is another name for L1 norm.
Can I make it work like this?
Why? If you are going to use a numeric parameter, then make it mean something.
When you go to write a GUI to allow the user to choose which distance type they want, are you going to want to code
if handles.distance_type_button.Value == 0
[P, label] = classifyKNN_D_Multi(Features, ...
(mtFeatures - MEAN') ./ STD', kNN, 1);
else
[P, label] = classifyKNN_D_Multi(Features, ...
(mtFeatures - MEAN') ./ STD', kNN, 1, 1);
end
or are you going to want to have a single call to the function with a parameter that indicates which distance type the user wanted?
Kamil Kacer
20 November 2020
Comments: 10
Walter Roberson
20 November 2020
"cityblock distance" is another name for L1 norm, which is the distance measure your code used to use, the inspiration for the variable name "useL1distance" . It is also called "taxicab distance" https://en.wikipedia.org/wiki/Taxicab_geometry
There is no point in deliberately removing it -- you might want it later for further tests.
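As a quick check with synthetic numbers: cityblock (L1 / taxicab) distance is just the sum of absolute coordinate differences, which matches what pdist2 computes (pdist2 requires the Statistics and Machine Learning Toolbox, so the call is guarded here).

```matlab
% Synthetic check: cityblock (L1 / taxicab) distance is the sum of
% absolute coordinate differences.
X = [1 2; 4 6];                                    % two samples as rows
y = [0 0];                                         % reference point
dManual = sum(abs(X - repmat(y, size(X,1), 1)), 2) % [3; 10]
if exist('pdist2', 'file')                         % Statistics Toolbox
    dL1 = pdist2(X, y, 'cityblock');               % matches dManual
end
```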
Why is this OR-ing everything?
This is a common code pattern when you are trying to ensure that a variable has some specific form. Another equivalent way of writing the same code would be
if ~(isscalar(distancetype) && isnumeric(distancetype) && ismember(distancetype, 1:3))
This is mathematically equivalent: ((~A) or (~B)) is the same as ~(A and B) . Which form you write becomes a matter of style. The ~A || ~B form is easier to add more cases to.
The tests we are doing here are:
- distancetype must be a scalar
- distancetype must be numeric
- distancetype must be one of 1, 2, or 3.
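The equivalence of the two forms can be checked directly. This standalone sketch runs both versions of the test over a few representative inputs (a valid value, a char, a vector, and an out-of-range number) and confirms they never disagree:

```matlab
% Sketch: the two forms of the validity test agree on every input
% (De Morgan's law: (~A || ~B) is the same as ~(A && B)).
candidates = {2, '2', [1 2], 5};        % valid, char, vector, out of range
agree = true(1, numel(candidates));
for c = 1:numel(candidates)
    v = candidates{c};
    bad1 = ~isscalar(v) || ~isnumeric(v) || ~ismember(v, 1:3);
    bad2 = ~(isscalar(v) && isnumeric(v) && ismember(v, 1:3));
    agree(c) = (bad1 == bad2);
end
disp(all(agree))                        % 1: the forms never disagree
```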
Kamil Kacer
20 November 2020
Walter Roberson
20 November 2020
function [Ps, winnerClass] = classifyKNN_D_Multi(F, testSample, k, NORMALIZE, distancetype )
% function [Ps, winnerClass] = classifyKNN_D_Multi(F, testSample, k, NORMALIZE, distancetype);
%
% This function is used for classifying an unknown sample using the kNN
% algorithm, in its multi-class form.
%
% ARGUMENTS:
% - F: a CELL array that contains the feature values for each class. I.e.,
% F{1} is a matrix of size numOfDimensions x numofSamples FOR THE FIRST
% CLASS, etc.
%
% - testSample: the input sample to be classified
% - k: the kNN parameter
% - NORMALIZE: use class priors to weight results
% - distancetype: 1 = Euclidean, 2 = cityblock (L1), 3 = Mahalanobis
%
% RETURNS:
% - Ps: an array that contains the classification probabilities for each class
% - winnerClass: the label of the winner class
%%error(nargchk(4,5,nargin))
switch nargin
case 4
distancetype = 1; % euclidean distance if 4 variables included
case 5
distancetype = 3;
otherwise
disp('error')
end
numOfClasses = length(F);
if (size(testSample, 2)==1)
testSample = testSample';
end
% initialization of distance vectors:
numOfDims = zeros( 1, numOfClasses );
numOfTrainSamples = zeros( 1, numOfClasses );
d = cell(numOfClasses,1);
% d{i} is a vector, whose elements represent the distance of the testing
% sample from all the samples of i-th class
testSample(isnan(testSample)) = 0.0;
for i=1:numOfClasses
[ numOfDims(i), numOfTrainSamples(i) ] = size( F{i} );
d{i} = inf*ones(max(numOfTrainSamples), 1); % we fill it with inf values
F{i}(isnan(F{i})) = 0.0;
end
if (length(testSample)>1)
for i=1:numOfClasses % for each class:
if (numOfTrainSamples(i)>0)
if ( distancetype == 1 )
% d{i} = sum( abs(repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}'),2); % L1
d{i} = pdist2(F{i}.', testSample, 'euclidean');
elseif ( distancetype == 2)
d{i} = pdist2(F{i}.', testSample, 'cityblock'); % L1
else
d{i} = pdist2(F{i}.', testSample, 'mahalanobis'); %56
end
d{i} = sort(d{i});
d{i}(end+1:max(numOfTrainSamples)) = inf;
else
d{i} = inf;
end
end
else % single dimension (NO SUM required!!!)
for i=1:numOfClasses
if (numOfTrainSamples(i)>0)
d{i} = (abs(repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}')');
d{i} = sort(d{i});
d{i}(end+1:max(numOfTrainSamples)) = inf;
else
d{i} = inf;
end
end
end
kAll = zeros(numOfClasses, 1);
for j=1:k
curArray = zeros(numOfClasses, 1);
for i=1:numOfClasses
curArray(i) = d{i}(kAll(i)+1);
end
[MIN, IMIN] = min(curArray);
kAll(IMIN) = kAll(IMIN) + 1;
end
if ( NORMALIZE == 0 )
Ps = (kAll ./ k);
else
Ps = kAll ./ numOfTrainSamples';
Ps = Ps / sum(Ps);
end
[MAX, IMAX] = max(Ps);
winnerClass = IMAX;
Kamil Kacer
20 November 2020
Walter Roberson
21 November 2020
function [Ps, winnerClass] = classifyKNN_D_Multi(F, testSample, k, NORMALIZE, distancetype )
% function [Ps, winnerClass] = classifyKNN_D_Multi(F, testSample, k, NORMALIZE, distancetype);
%
% This function is used for classifying an unknown sample using the kNN
% algorithm, in its multi-class form.
%
% ARGUMENTS:
% - F: a CELL array that contains the feature values for each class. I.e.,
% F{1} is a matrix of size numOfDimensions x numofSamples FOR THE FIRST
% CLASS, etc.
%
% - testSample: the input sample to be classified
% - k: the kNN parameter
% - NORMALIZE: use class priors to weight results
% - distancetype: 1 = Euclidean, 2 = cityblock (L1), 3 = Mahalanobis
%
% RETURNS:
% - Ps: an array that contains the classification probabilities for each class
% - winnerClass: the label of the winner class
%%error(nargchk(4,5,nargin))
switch nargin
case 4
distancetype = 1; % euclidean distance if 4 variables included
case 5
distancetype = 3;
otherwise
disp('error')
end
numOfClasses = length(F);
if (size(testSample, 2)==1)
testSample = testSample';
end
% initialization of distance vectors:
numOfDims = zeros( 1, numOfClasses );
numOfTrainSamples = zeros( 1, numOfClasses );
d = cell(numOfClasses,1);
% d{i} is a vector, whose elements represent the distance of the testing
% sample from all the samples of i-th class
testSample(isnan(testSample)) = 0.0;
for i=1:numOfClasses
[ numOfDims(i), numOfTrainSamples(i) ] = size( F{i} );
d{i} = inf*ones(max(numOfTrainSamples), 1); % we fill it with inf values
F{i}(isnan(F{i})) = 0.0;
end
if (length(testSample)>1)
for i=1:numOfClasses % for each class:
if (numOfTrainSamples(i)>0)
if ( distancetype == 1 )
% d{i} = sum( abs(repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}'),2); % L1
d{i} = pdist2(F{i}.', testSample, 'euclidean');
elseif ( distancetype == 2)
d{i} = pdist2(F{i}.', testSample, 'cityblock'); %L1
else
d{i} = pdist2(F{i}.', testSample, 'mahalanobis'); %56
end
d{i} = sort(d{i});
d{i}(end+1:max(numOfTrainSamples)) = inf;
else
d{i} = inf;
end
end
else % single dimension (NO SUM required!!!)
for i=1:numOfClasses
if (numOfTrainSamples(i)>0)
d{i} = (abs(repmat(testSample, [numOfTrainSamples(i) 1]) - F{i}')');
d{i} = sort(d{i});
d{i}(end+1:max(numOfTrainSamples)) = inf;
else
d{i} = inf;
end
end
end
kAll = zeros(numOfClasses, 1);
for j=1:k
curArray = zeros(numOfClasses, 1);
for i=1:numOfClasses
curArray(i) = d{i}(kAll(i)+1);
end
[MIN, IMIN] = min(curArray);
kAll(IMIN) = kAll(IMIN) + 1;
end
if ( NORMALIZE == 0 )
Ps = (kAll ./ k);
else
Ps = kAll ./ numOfTrainSamples';
Ps = Ps / sum(Ps);
end
[MAX, IMAX] = max(Ps);
winnerClass = IMAX;
Kamil Kacer
21 November 2020
Walter Roberson
21 November 2020
Are you still using
[P, label] = classifyKNN_D_Multi(Features, ...
(mtFeatures - MEAN') ./ STD', kNN,1);
Kamil Kacer
21 November 2020
Walter Roberson
21 November 2020
You told me before that you were passing in only 4 arguments but were still getting Mahalanobis invoked, so that is the problem I am trying to debug. This code shows you passing in 5 arguments. You specifically asked that passing in 5 arguments should always mean Mahalanobis. I argued against that, but you insisted, so I changed it back for you.
Kamil Kacer
21 November 2020