I am trying to do an ELM analysis in MATLAB, but my code is not running properly. I want to use an extreme learning machine (ELM) and see the regression values for both the training and test data. Please help me.
function [TrainingTime, TestingTime, TrainingAccuracy, TestingAccuracy] = elm(Train, Test, NumberofHiddenNeurons, ActivationFunction)
ActivationFunction='sig';
NumberofHiddenNeurons=1000;
% Usage: elm(TrainingData_File, TestingData_File, Elm_Type, NumberofHiddenNeurons, ActivationFunction)
% OR:    [TrainingTime, TestingTime, TrainingAccuracy, TestingAccuracy] = elm(TrainingData_File, TestingData_File, Elm_Type, NumberofHiddenNeurons, ActivationFunction)
%
% Input:
% TrainingData_File     - Filename of training data set
% TestingData_File      - Filename of testing data set
% Elm_Type              - 0 for regression; 1 for (both binary and multi-class) classification
% NumberofHiddenNeurons - Number of hidden neurons assigned to the ELM
% ActivationFunction    - Type of activation function:
%                         'sig'     for Sigmoidal function
%                         'sin'     for Sine function
%                         'hardlim' for Hardlim function
%                         'tribas'  for Triangular basis function
%                         'radbas'  for Radial basis function (for additive type of SLFNs instead of RBF type of SLFNs)
%
% Output:
% TrainingTime          - Time (seconds) spent on training the ELM
% TestingTime           - Time (seconds) spent on predicting ALL testing data
% TrainingAccuracy      - Training accuracy: RMSE for regression or correct classification rate for classification
% TestingAccuracy       - Testing accuracy: RMSE for regression or correct classification rate for classification
%
% MULTI-CLASS CLASSIFICATION: the number of output neurons is automatically set equal to the number of classes.
% For example, if there are 7 classes in all, there will be 7 output neurons; if neuron 5 has the highest output, the input belongs to the 5th class.
%
% Sample1 regression:     [TrainingTime, TestingTime, TrainingAccuracy, TestingAccuracy] = elm('sinc_train', 'sinc_test', 0, 20, 'sig')
% Sample2 classification: elm('diabetes_train', 'diabetes_test', 1, 20, 'sig')
%
%%%% Authors: MR QIN-YU ZHU AND DR GUANG-BIN HUANG
%%%% NANYANG TECHNOLOGICAL UNIVERSITY, SINGAPORE
%%%% EMAIL: EGBHUANG@NTU.EDU.SG; GBHUANG@IEEE.ORG
%%%% WEBSITE: http://www.ntu.edu.sg/eee/icis/cv/egbhuang.htm
%%%% DATE: APRIL 2004
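% Note: the two assignments above override the NumberofHiddenNeurons and
% ActivationFunction arguments, and the Train / Test arguments are never used
% because the data are read from 'Train.txt' and 'test.txt' below. A call might
% therefore look like, for example:
%   [TrainingTime, TestingTime, TrainingAccuracy, TestingAccuracy] = elm([], [], 1000, 'sig');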
%%%%%%%%%%% Macro definition
REGRESSION=0;
CLASSIFIER=1;
Elm_Type=REGRESSION;                               % regression is the goal here; use CLASSIFIER for classification
%%%%%%%%%%% Load training dataset
train_data=load('Train.txt');
T=train_data(:,1)';
P=train_data(:,2:size(train_data,2))';
clear train_data;                                  % Release raw training data array
%%%%%%%%%%% Load testing dataset
test_data=load('test.txt');
TV.T=test_data(:,1)';
TV.P=test_data(:,2:size(test_data,2))';
clear test_data;                                   % Release raw testing data array
NumberofTrainingData=size(P,2);
NumberofTestingData=size(TV.P,2);
NumberofInputNeurons=size(P,1);
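% Note on the expected data layout (as implied by the load() calls above):
% 'Train.txt' and 'test.txt' must be plain numeric text files with one sample
% per row, the target value in column 1 and the input features in the remaining
% columns. For example, such files could be written from a feature matrix
% Xtrain (N-by-d) and target vector ytrain (N-by-1) with something like:
%   writematrix([ytrain Xtrain], 'Train.txt', 'Delimiter', ' ');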
if Elm_Type~=REGRESSION
    %%%%%%%%%%%% Preprocessing the data of classification
    sorted_target=sort(cat(2,T,TV.T),2);
    label=zeros(1,1);                              % Find and save in 'label' the class labels from the training and testing data sets
    label(1,1)=sorted_target(1,1);
    j=1;
    for i = 2:(NumberofTrainingData+NumberofTestingData)
        if sorted_target(1,i) ~= label(1,j)
            j=j+1;
            label(1,j) = sorted_target(1,i);
        end
    end
    number_class=j;
    NumberofOutputNeurons=number_class;
    %%%%%%%%%% Processing the targets of training
    temp_T=zeros(NumberofOutputNeurons, NumberofTrainingData);
    for i = 1:NumberofTrainingData
        for j = 1:number_class
            if label(1,j) == T(1,i)
                break;
            end
        end
        temp_T(j,i)=1;
    end
    T=temp_T*2-1;
    %%%%%%%%%% Processing the targets of testing
    temp_TV_T=zeros(NumberofOutputNeurons, NumberofTestingData);
    for i = 1:NumberofTestingData
        for j = 1:number_class
            if label(1,j) == TV.T(1,i)
                break;
            end
        end
        temp_TV_T(j,i)=1;
    end
    TV.T=temp_TV_T*2-1;
end % end if of Elm_Type
%%%%%%%%%%% Calculate weights & biases
start_time_train=cputime;
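% Optional: ELM results vary from run to run because InputWeight and
% BiasofHiddenNeurons below are drawn at random; to make the RMSE values
% reproducible, the random seed can be fixed first, e.g. by uncommenting:
% rng(1);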
%%%%%%%%%%% Randomly generate input weights InputWeight (w_i) and biases BiasofHiddenNeurons (b_i) of hidden neurons
InputWeight=rand(NumberofHiddenNeurons,NumberofInputNeurons)*2-1;
BiasofHiddenNeurons=rand(NumberofHiddenNeurons,1);
tempH=InputWeight*P;
clear P;                                           % Release input of training data
ind=ones(1,NumberofTrainingData);
BiasMatrix=BiasofHiddenNeurons(:,ind);             % Extend the bias vector BiasofHiddenNeurons to match the dimension of H
tempH=tempH+BiasMatrix;
%%%%%%%%%%% Calculate hidden neuron output matrix H
switch lower(ActivationFunction)
    case {'sig','sigmoid'}
        %%%%%%%% Sigmoid
        H = 1 ./ (1 + exp(-tempH));
    case {'sin','sine'}
        %%%%%%%% Sine
        H = sin(tempH);
    case {'hardlim'}
        %%%%%%%% Hard Limit
        H = double(hardlim(tempH));
    case {'tribas'}
        %%%%%%%% Triangular basis function
        H = tribas(tempH);
    case {'radbas'}
        %%%%%%%% Radial basis function
        H = radbas(tempH);
        %%%%%%%% More activation functions can be added here
end
clear tempH;                                       % Release the temporary array used for calculation of hidden neuron output matrix H
%%%%%%%%%%% Calculate output weights OutputWeight (beta_i)
OutputWeight=pinv(H') * T';                        % implementation without regularization factor //refer to 2006 Neurocomputing paper
%OutputWeight=inv(eye(size(H,1))/C+H * H') * H * T';   % faster method 1 //refer to 2012 IEEE TSMC-B paper; one can set the regularization factor C properly in classification applications
%OutputWeight=(eye(size(H,1))/C+H * H') \ H * T';      % faster method 2 //refer to 2012 IEEE TSMC-B paper; one can set the regularization factor C properly in classification applications
%If you use faster methods or kernel method, PLEASE CITE in your paper properly:
%Guang-Bin Huang, Hongming Zhou, Xiaojian Ding, and Rui Zhang, "Extreme Learning Machine for Regression and Multi-Class Classification," submitted to IEEE Transactions on Pattern Analysis and Machine Intelligence, October 2010.
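% For example, to try the regularized "faster method 2" from the comments above,
% a regularization factor C has to be defined first; the value below is only an
% illustrative choice, not a tuned setting:
%   C = 2^5;
%   OutputWeight = (eye(size(H,1))/C + H*H') \ (H*T');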
end_time_train=cputime;
TrainingTime=end_time_train-start_time_train       % Calculate CPU time (seconds) spent on training the ELM
%%%%%%%%%%% Calculate the training accuracy
Y=(H' * OutputWeight)';                            % Y: the actual output of the training data
TrainingAccuracy=sqrt(mse(T - Y)) % Calculate training accuracy (RMSE) for regression case
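% Since the question asks for regression values on the training data, the
% correlation between targets and outputs can also be reported using base
% MATLAB (the variable name R_train is just an example):
%   R_train = corrcoef(T, Y);  R_train = R_train(1,2)
% If mse() (Deep Learning Toolbox) is unavailable, the RMSE above can
% equivalently be computed as sqrt(mean((T - Y).^2)).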
clear H;
%%%%%%%%%%% Calculate the output of the testing input
start_time_test=cputime;
tempH_test=InputWeight*TV.P;
clear TV.P;                                        % Release input of testing data
ind=ones(1,NumberofTestingData);
BiasMatrix=BiasofHiddenNeurons(:,ind);             % Extend the bias vector BiasofHiddenNeurons to match the dimension of H
tempH_test=tempH_test + BiasMatrix;
switch lower(ActivationFunction)
    case {'sig','sigmoid'}
        %%%%%%%% Sigmoid
        H_test = 1 ./ (1 + exp(-tempH_test));
    case {'sin','sine'}
        %%%%%%%% Sine
        H_test = sin(tempH_test);
    case {'hardlim'}
        %%%%%%%% Hard Limit
        H_test = hardlim(tempH_test);
    case {'tribas'}
        %%%%%%%% Triangular basis function
        H_test = tribas(tempH_test);
    case {'radbas'}
        %%%%%%%% Radial basis function
        H_test = radbas(tempH_test);
        %%%%%%%% More activation functions can be added here
end
TY=(H_test' * OutputWeight)';                      % TY: the actual output of the testing data
end_time_test=cputime;
TestingTime=end_time_test-start_time_test          % Calculate CPU time (seconds) spent by the ELM predicting the whole testing data
TestingAccuracy=sqrt(mse(TV.T - TY))               % Calculate testing accuracy (RMSE) for regression case
m1=TV.T                                            % display the testing targets
s2=TY                                              % display the testing predictions
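% Likewise, an example of the regression (correlation) value for the testing data:
%   R_test = corrcoef(TV.T, TY);  R_test = R_test(1,2)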
end
% if Elm_Type == CLASSIFIER
%     %%%%%%%%%% Calculate training & testing classification accuracy
%     MissClassificationRate_Training=0;
%     MissClassificationRate_Testing=0;
%
%     for i = 1 : size(T, 2)
%         [x, label_index_expected]=max(T(:,i));
%         [x, label_index_actual]=max(Y(:,i));
%         if label_index_actual~=label_index_expected
%             MissClassificationRate_Training=MissClassificationRate_Training+1;
%         end
%     end
%     TrainingAccuracy=1-MissClassificationRate_Training/size(T,2)
%
%     for i = 1 : size(TV.T, 2)
%         [x, label_index_expected]=max(TV.T(:,i));
%         [x, label_index_actual]=max(TY(:,i));
%         if label_index_actual~=label_index_expected
%             MissClassificationRate_Testing=MissClassificationRate_Testing+1;
%         end
%     end
%     TestingAccuracy=1-MissClassificationRate_Testing/size(TV.T,2)
% end
Answers (1)
BERGHOUT Tarek
3 February 2019
You can use this function; it works faster:
https://www.mathworks.com/matlabcentral/fileexchange/66013-very-very-simple-extreme-learning-machine-algorithm-in-5-lines?s_tid=prof_contriblnk
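For reference, a very compact ELM regression of the kind that submission describes can be sketched roughly as follows; this is only an illustrative outline assuming an N-by-d input matrix X and an N-by-1 target vector y, not the exact code from the File Exchange link:

L = 100;                          % number of hidden neurons (example value)
W = rand(L, size(X,2))*2 - 1;     % random input weights in [-1, 1]
b = rand(L, 1);                   % random hidden biases
H = 1 ./ (1 + exp(-(W*X' + b)));  % sigmoid hidden-layer output, L-by-N
beta = pinv(H') * y;              % output weights via the Moore-Penrose pseudoinverse
yhat = H' * beta;                 % predicted targets for the inputs in X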