Machine Learning Gradient boosting parameter
6 views (last 30 days)
Hello,
I want to create a gradient boosting machine learning model in MATLAB with predefined parameters derived from a trained Python algorithm.
Unfortunately I couldn't come up with a way to map the following Python parameters onto the MATLAB functions fitrensemble / templateTree:
{'colsample_bytree': 0.8, 'feature_fraction': 0.7, 'learning_rate': 0.01, 'max_depth': 8, 'metric': 'rmse', 'min_child_samples': 40, 'num_leaves': 125, 'reg_alpha': 0.3, 'reg_lambda': 0.5, 'subsample': 0.9}
Any help with these parameters is much appreciated!
Best, Simon
0 comments
Answers (1)
TED MOSBY
28 Feb 2024
Hi Simon,
As I understand your question, you want to build a gradient boosting model in MATLAB that is equivalent to the model you trained in Python (the parameters look like LightGBM settings). MATLAB's 'fitrensemble' and 'templateTree' do not have direct counterparts for all of those parameters: 'num_leaves' and 'max_depth' can only be approximated through 'MaxNumSplits' (a tree with at most N splits has at most N + 1 leaves), and 'reg_alpha'/'reg_lambda' have no equivalent because fitrensemble does not apply L1/L2 regularization. You can still build a similar model with the 'LSBoost' method of 'fitrensemble' by placing the tree-level options (maximum splits, minimum leaf size, surrogate splits for missing data) in a 'templateTree' weak learner and passing it through the 'Learners' argument, as shown below:
% Parameters carried over from the Python (LightGBM-style) model
learnRate         = 0.01;   % learning_rate
numLearningCycles = 100;    % number of boosting rounds (not in your Python list; adjust as needed)
maxNumSplits      = 124;    % num_leaves = 125 caps each tree at 124 splits (leaves = splits + 1),
                            % which is tighter than the up-to-255 splits allowed by max_depth = 8
minLeafSize       = 40;     % min_child_samples
subsampleRate     = 0.9;    % subsample (row sampling per boosting iteration)
% reg_alpha and reg_lambda (L1/L2 regularization) have no equivalent in
% fitrensemble; regularization comes from LearnRate and the tree-size limits.
% There is also no exact counterpart to colsample_bytree / feature_fraction:
% templateTree's 'NumVariablesToSample' samples predictors per split rather
% than per tree, and is the closest setting to experiment with.

% Tree-level settings belong in a weak-learner template, not in fitrensemble
treeTemplate = templateTree( ...
    'MaxNumSplits', maxNumSplits, ...
    'MinLeafSize', minLeafSize, ...
    'Surrogate', 'on');                  % surrogate splits handle missing data

% Create the gradient-boosted ensemble
ensembleModel = fitrensemble(X, Y, ...
    'Method', 'LSBoost', ...             % least-squares gradient boosting
    'Learners', treeTemplate, ...
    'NumLearningCycles', numLearningCycles, ...
    'LearnRate', learnRate, ...
    'Resample', 'on', ...                % subsample observations at each boosting iteration
    'FResample', subsampleRate, ...
    'Replace', 'off');                   % sample without replacement, like LightGBM's subsample

% LSBoost minimizes mean squared error, so it optimizes the same criterion as
% 'metric': 'rmse'; take the square root of the loss to report RMSE
trainRmse = sqrt(resubLoss(ensembleModel));

% Display the trained model
disp(ensembleModel);
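If you also have a held-out test set, you can compare the MATLAB model against your Python model on the same data. In this small sketch, Xtest and Ytest are placeholder names for your own test predictors and response:
% Xtest and Ytest are placeholders for your own held-out test data
Ypred    = predict(ensembleModel, Xtest);     % predictions from the boosted ensemble
testRmse = sqrt(mean((Ytest - Ypred).^2));    % RMSE, matching the Python 'metric': 'rmse'
fprintf('Test RMSE: %.4f\n', testRmse);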
Feel free to add or remove parameters to match your model requirements. For more information, have a look at the MATLAB documentation for the 'fitrensemble' and 'templateTree' functions.
0 comments