Objective function in the Bayesian optimization algorithm used by fitrsvm and fitrgp
Views: 6 (last 30 days)
Hello,
What is the mathematical objective function in the Bayesian optimization algorithm? The documentation says that functions like fitrsvm try to minimize log(1 + cross-validation loss), but what is the actual mathematical formula?
Is it possible to change the objective function to just the MSE?
Thank you!
Dimitri
0 Comments
Accepted Answer
Don Mathis
13 May 2019
This page says that the loss defaults to MSE, so that is the loss used in the log(1 + cvloss) formula. The cross-validated loss is the loss computed over all the held-out validation sets. The default when using optimization is 5-fold cross-validation.
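In other words, the quantity being minimized is log(1 + cvloss), where cvloss is the 5-fold cross-validated MSE at the candidate hyperparameters. A minimal sketch of how to see this in practice (the synthetic data and the specific option values here are illustrative assumptions, not from the original question):

rng(1)                                 % for reproducibility
X = rand(200,2);
Y = sin(4*X(:,1)) + 0.1*randn(200,1);  % toy regression data
Mdl = fitrsvm(X, Y, 'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', ...
    struct('KFold', 5, 'ShowPlots', false, 'Verbose', 0));
% MinObjective is log(1 + cross-validated MSE) at the best point found
Mdl.HyperparameterOptimizationResults.MinObjective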
There's not an option to change the hyperparameter optimization objective function from log(1+cvloss). You would need to edit the source code to do that. The source file is matlab\toolbox\stats\classreg\+classreg\+learning\+paramoptim\createObjFcn.m. Look for the call to the log1p function.
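Note that because log(1 + x) is strictly increasing, the hyperparameters that minimize log(1 + cvloss) are the same ones that minimize the cross-validated MSE itself (although the transformation can still influence how the Gaussian process surrogate models the objective during the search). As a sketch, assuming the Mdl from the example above, you can recover the raw cross-validated MSE from the reported objective without editing any source:

minObj = Mdl.HyperparameterOptimizationResults.MinObjective;
cvMSE  = expm1(minObj);   % inverse of log1p: exp(minObj) - 1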
3 Comments
Don Mathis
14 May 2019
Edited: Don Mathis, 14 May 2019
Because loss(Mdl,X,Y) is the loss of the final model on the full dataset, while MinObjective is the log of 1 plus the out-of-sample cross-validated loss. See the kfoldLoss method for documentation of that. If you used 5-fold cross-validation, kfoldLoss aggregates the losses of 5 different models, each evaluated on the 1/5 of the data that was held out from its training. It is not the loss of the final model on the full dataset.
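A short sketch of the distinction (assuming the Mdl, X, and Y from the earlier example; the exact numbers will differ because the cross-validation partition is random):

finalLoss = loss(Mdl, X, Y);            % resubstitution MSE of the final model on all data
CVMdl     = crossval(Mdl, 'KFold', 5);  % retrain the same configuration on 5 folds
cvLoss    = kfoldLoss(CVMdl);           % out-of-sample cross-validated MSE
log1p(cvLoss)                           % comparable to MinObjective; finalLoss is usually smaller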
antlhem
29 May 2021
Hi, could you take a look at my question? https://uk.mathworks.com/matlabcentral/answers/842800-why-matlab-svr-is-not-working-for-exponential-data-and-works-well-with-data-that-fluctuates?s_tid=prof_contriblnk
More Answers (0)