Custom kernel problem with fitrgp

35 views (last 30 days)
Miguel Angel Hombrados Herrera on 7 Nov 2015
Edited: Gautam Pendse on 18 Nov 2015
I tried to use fitrgp with a custom kernel (locally periodic, 3 parameters), but I get the following error: "The Cholesky factor needed for making predictions cannot be computed. When calling fitrgp, try changing the initial values of 'KernelParameters' and 'Sigma'. Also consider setting 'Standardize' to true and increasing the value of 'SigmaLowerBound'." Otherwise it takes ages to compute. I defined my kernel (a product of 2 kernels) as a single kernel function (kfcn) and defined the 3-element parameter array theta. I should add that I don't have the same problem when I run fitrgp with a custom squared exponential or periodic kernel.
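For reference, a locally periodic kernel of the kind described above (the product of a periodic kernel and a squared exponential kernel, with 3 unconstrained parameters) could be written as a single kernel function along the following lines. This is only a minimal sketch: the function name lpkernel and the parameterization (shared length scale sigmaL, period p, signal standard deviation sigmaF) are assumptions for illustration, not the original kfcn.

function KMN = lpkernel(XM,XN,theta)
%lpkernel - Locally periodic kernel: periodic times squared exponential.
% theta = [log(sigmaL);log(p);log(sigmaF)], a 3-by-1 unconstrained vector.
% (Sketch only; the parameterization here is an assumption, not the poster's code.)
sigmaL = exp(theta(1));   % shared length scale
p      = exp(theta(2));   % period of the periodic factor
sigmaF = exp(theta(3));   % signal standard deviation
r   = pdist2(XM,XN);                        % Euclidean distances between rows
per = exp(-2*(sin(pi*r/p).^2)/sigmaL^2);    % periodic factor
se  = exp(-0.5*(r.^2)/sigmaL^2);            % squared exponential factor
KMN = (sigmaF^2)*(per.*se);
end

With this parameterization, something like theta0 = log([0.1; 1; 1]) would be one possible initial value to pass as 'KernelParameters'.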

Answer (1)

Gautam Pendse on 9 Nov 2015
Edited: Gautam Pendse on 18 Nov 2015
Hi Miguel,
The error you saw is reported when the Gram matrix is close to singular and the noise variance converges to 0. Can you post your code for the kernel function evaluation?
Increasing 'SigmaLowerBound' in the call to fitrgp should also help. Here is a custom kernel example which uses a sum of a squared exponential ARD kernel and a squared exponential kernel. Copy/paste the code into an M-file and then publish it.
function testcustomkernel()
%% Example data
rng(0,'twister');
N = 100;
X = linspace(0,1,N)';
X = [X,X.^2];
y = 1 + X*[1;2] + sin(20*X*[1;-2]) + 0.2*randn(N,1);
D = size(X,2);
%% Initial values of the _unconstrained_ kernel parameters
% Note how "easy to understand" parameters like sigmaL and sigmaF
% are converted into unconstrained parameters in theta. See
% mykernel below for a description of these parameters and the
% order in which they are concatenated.
sigmaL10 = 0.1*ones(D,1);
sigmaL20 = 0.1;
sigmaF10 = 1;
sigmaF20 = 1;
theta0 = [log(sigmaL10);log(sigmaL20);log(sigmaF10);log(sigmaF20)];
%% Fit the model using the custom kernel function
% mykernel defined later in this file is a sum of two kernel
% functions - a squared exponential and a squared exponential ARD.
% Initial value theta0 must be supplied when using a custom kernel
% function.
gpr = fitrgp(X,y,'KernelFunction',@mykernel,'KernelParameters',theta0,'Verbose',1)
%% Plot predictions
plot(y,'r');
hold on;
plot(predict(gpr,X),'b')
%% Display kernel parameters
gpr.KernelInformation
gpr.KernelInformation.KernelParameterNames
% These are the estimated parameters.
thetaHat = gpr.KernelInformation.KernelParameters
%% Convert kernel parameters into an easy to understand form
% Convert kernel parameters into length scales and signal
% standard deviations.
params = exp(thetaHat);
sigmaL1 = params(1:D,1);
sigmaL2 = params(D+1,1);
sigmaF1 = params(D+2,1);
sigmaF2 = params(D+3,1);
end
function KMN = mykernel(XM,XN,theta)
%mykernel - Compute sum of squared exponential and squared exponential ARD.
% KMN = mykernel(XM,XN,theta) takes an M-by-D matrix XM and an N-by-D matrix
% XN and computes an M-by-N matrix KMN of kernel products such that
% KMN(i,j) is the kernel product between XM(i,:) and XN(j,:). theta is
% the R-by-1 unconstrained parameter vector for the kernel.
%
% Let theta = [log(sigmaL1);log(sigmaL2);log(sigmaF1);log(sigmaF2)]
%
% where
%
% sigmaL1 = D-by-1 vector of length scales for squared exponential ARD.
% sigmaL2 = scalar length scale for squared exponential.
% sigmaF1 = scalar signal standard deviation for squared exponential ARD.
% sigmaF2 = scalar signal standard deviation for squared exponential.
% 1. Get D. Assume XN, XM have the same number of columns.
D = size(XM,2);
% 2. Convert theta into sigmaL1, sigmaL2, sigmaF1 and sigmaF2.
params = exp(theta);
sigmaL1 = params(1:D,1);
sigmaL2 = params(D+1,1);
sigmaF1 = params(D+2,1);
sigmaF2 = params(D+3,1);
% 3. Create the contribution due to squared exponential ARD.
KMN = pdist2(XM(:,1)/sigmaL1(1), XN(:,1)/sigmaL1(1)).^2;
for r = 2:D
    KMN = KMN + pdist2(XM(:,r)/sigmaL1(r), XN(:,r)/sigmaL1(r)).^2;
end
KMN = (sigmaF1^2)*exp(-0.5*KMN);
% 4. Add the contribution due to squared exponential.
KMN = KMN + (sigmaF2^2)*exp(-0.5*(pdist2(XM/sigmaL2, XN/sigmaL2).^2));
end
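As a follow-up to the 'SigmaLowerBound' suggestion above, a call along these lines shows how 'Standardize' and 'SigmaLowerBound' can be passed to fitrgp. This is a minimal sketch reusing X, y, theta0 and mykernel from the example; the bound 0.1 is purely illustrative, not a recommended value.

% Sketch only: standardize the predictors and raise the lower bound on the
% noise standard deviation (0.1 is an illustrative value, not a recommendation).
gpr2 = fitrgp(X,y,'KernelFunction',@mykernel,'KernelParameters',theta0, ...
    'Standardize',true,'SigmaLowerBound',0.1);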
Gautam
