
Model Building and Assessment

Feature selection, hyperparameter optimization, cross-validation, residual diagnostics, plots

When you develop a high-quality regression model, it is important to select suitable features (or predictors), tune hyperparameters (model parameters that are not fit to the data), and assess model assumptions through residual diagnostics.

You can tune hyperparameters by iterating between choosing values for them and cross-validating a model using those values. This process yields multiple models, and the best model among them may be the one that minimizes the estimated generalization error. For example, to tune an SVM model, choose a set of box constraints and kernel scales, cross-validate a model for each pair of values, and then compare their 10-fold cross-validated mean squared error estimates.
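
As a concrete illustration, here is a minimal MATLAB sketch of such a manual grid search. It assumes a numeric predictor matrix X and response vector y are already in the workspace; the candidate values below are placeholders, not recommendations.

boxConstraints = [0.1 1 10];              % candidate box constraints (placeholder values)
kernelScales   = [0.5 1 2];               % candidate kernel scales (placeholder values)
bestLoss = Inf;
for c = boxConstraints
    for s = kernelScales
        % Cross-validated SVM regression model for this pair of hyperparameter values
        cvMdl = fitrsvm(X, y, 'BoxConstraint', c, 'KernelScale', s, 'KFold', 10);
        mse = kfoldLoss(cvMdl);           % 10-fold cross-validated mean squared error
        if mse < bestLoss
            bestLoss = mse;               % keep the best pair found so far
            bestC = c;
            bestS = s;
        end
    end
end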

Certain nonparametric regression functions in Statistics and Machine Learning Toolbox™ offer additional automatic hyperparameter tuning through Bayesian optimization, grid search, or random search. However, bayesopt, the main function for implementing Bayesian optimization, is flexible enough to be used in many other applications. For details, see Bayesian Optimization Workflow.
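
As a rough sketch of both approaches (again assuming X and y exist, with placeholder search ranges), the same SVM tuning problem can be handed to a fit function's built-in optimizer or posed directly to bayesopt:

% (1) Built-in tuning through a fit function
mdl = fitrsvm(X, y, 'OptimizeHyperparameters', 'auto');

% (2) Calling bayesopt directly with optimizableVariable descriptions
box   = optimizableVariable('box',   [1e-3 1e3], 'Transform', 'log');
scale = optimizableVariable('scale', [1e-3 1e3], 'Transform', 'log');
objFun = @(p) kfoldLoss(fitrsvm(X, y, 'BoxConstraint', p.box, ...
    'KernelScale', p.scale, 'KFold', 5));   % cross-validated MSE to minimize
results = bayesopt(objFun, [box scale]);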

Apps

Regression Learner - Train regression models to predict data using supervised machine learning

Functions


sequentialfs - Sequential feature selection
relieff - Rank importance of predictors using ReliefF or RReliefF algorithm
plotPartialDependence - Create partial dependence plot (PDP) and individual conditional expectation (ICE) plots
stepwiselm - Fit linear regression model using stepwise regression
stepwiseglm - Create generalized linear regression model by stepwise regression
bayesopt - Select optimal machine learning hyperparameters using Bayesian optimization
hyperparameters - Variable descriptions for optimizing a fit function
optimizableVariable - Variable description for bayesopt or other optimizers
crossval - Loss estimate using cross-validation
cvpartition - Create cross-validation partition for data
repartition - Repartition data for cross-validation
test - Test indices for cross-validation
training - Training indices for cross-validation
coefCI - Confidence intervals of coefficient estimates of linear regression model
coefTest - Linear hypothesis test on linear regression model coefficients
dwtest - Durbin-Watson test with linear regression model object
plot - Scatter plot or added variable plot of linear regression model
plotAdded - Added variable plot of linear regression model
plotAdjustedResponse - Adjusted response plot of linear regression model
plotDiagnostics - Plot observation diagnostics of linear regression model
plotEffects - Plot main effects of predictors in linear regression model
plotInteraction - Plot interaction effects of two predictors in linear regression model
plotResiduals - Plot residuals of linear regression model
plotSlice - Plot of slices through fitted linear regression surface
coefCI - Confidence intervals of coefficient estimates of generalized linear model
coefTest - Linear hypothesis test on generalized linear regression model coefficients
devianceTest - Analysis of deviance
plotDiagnostics - Plot diagnostics of generalized linear regression model
plotResiduals - Plot residuals of generalized linear regression model
plotSlice - Plot of slices through fitted generalized linear regression surface
coefCI - Confidence intervals of coefficient estimates of nonlinear regression model
coefTest - Linear hypothesis test on nonlinear regression model coefficients
plotDiagnostics - Plot diagnostics of nonlinear regression model
plotResiduals - Plot residuals of nonlinear regression model
plotSlice - Plot of slices through fitted nonlinear regression surface
linhyptest - Linear hypothesis test

Objects


BayesianOptimization - Bayesian optimization results
cvpartition - Data partitions for cross-validation

Topics

Regression Learner App Workflow

Train Regression Models in Regression Learner App

Workflow for training, comparing, and improving regression models, including automated, manual, and parallel training.

Choose Regression Model Options

In Regression Learner, automatically train a selection of models, or compare and tune options of linear regression models, regression trees, support vector machines, Gaussian process regression models, and ensembles of regression trees.

Feature Selection and Feature Transformation Using Regression Learner App

Identify useful predictors using plots, manually select features to include, and transform features using PCA in Regression Learner.

Assess Model Performance in Regression Learner

Compare model statistics and visualize results.

Feature Selection

Feature Selection

Learn about feature selection algorithms, such as sequential feature selection.

Hyperparameter Optimization

Bayesian Optimization Workflow

Perform Bayesian optimization using a fit function or by calling bayesopt directly.

Variables for a Bayesian Optimization

Create variables for Bayesian optimization.

Bayesian Optimization Objective Functions

Create the objective function for Bayesian optimization.

Constraints in Bayesian Optimization

Set different types of constraints for Bayesian optimization.

Optimize a Boosted Regression Ensemble

Minimize cross-validation loss of a regression ensemble.

Bayesian Optimization Plot Functions

Visually monitor a Bayesian optimization.

Bayesian Optimization Output Functions

Monitor a Bayesian optimization.

Bayesian Optimization Algorithm

Understand the underlying algorithms for Bayesian optimization.

Parallel Bayesian Optimization

How Bayesian optimization works in parallel.

Cross-Validation

Implement Cross-Validation Using Parallel Computing

Speed up cross-validation using parallel computing.
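
A minimal sketch of the idea, assuming X and y exist, a parallel pool is available (requires Parallel Computing Toolbox), and the anonymous prediction function shown here is only an example:

opts = statset('UseParallel', true);                       % run cross-validation folds in parallel
cvMSE = crossval('mse', X, y, ...
    'Predfun', @(Xtr, ytr, Xte) predict(fitrsvm(Xtr, ytr), Xte), ...
    'Options', opts);                                      % 10-fold mean squared error by default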

Linear Model Diagnostics

Interpret Linear Regression Results

Display and interpret linear regression output statistics.

Linear Regression

Fit a linear regression model and examine the results.
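
For example (a sketch with hypothetical table and variable names tbl, MPG, Weight, and Horsepower):

mdl = fitlm(tbl, 'MPG ~ Weight + Horsepower');  % fit a linear regression model from a table
disp(mdl)                                       % coefficient table, R-squared, F-statistic
plotResiduals(mdl)                              % histogram of residuals for a quick diagnostic check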

Linear Regression with Interaction Effects

Construct and analyze a linear regression model with interaction effects and interpret the results.

Summary of Output and Diagnostic Statistics

Evaluate a fitted model by using model properties and object functions.

F-statistic and t-statistic

In linear regression, the F-statistic is the test statistic for the analysis of variance (ANOVA) approach to test the significance of the model or the components in the model. The t-statistic is useful for making inferences about the regression coefficients.
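
For a model mdl returned by fitlm, these statistics can be inspected as in the following sketch:

anovaTbl = anova(mdl, 'summary');   % ANOVA table including the model F-statistic and its p-value
mdl.Coefficients                    % Estimate, SE, tStat, and pValue for each coefficient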

Coefficient of Determination (R-Squared)

The coefficient of determination (R-squared) indicates the proportionate amount of variation in the response variable y explained by the independent variables X in the linear regression model.
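
For a fitted LinearModel mdl (as above), a quick way to read these values:

r2    = mdl.Rsquared.Ordinary;   % coefficient of determination
r2adj = mdl.Rsquared.Adjusted;   % adjusted for the number of predictors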

Coefficient Standard Errors and Confidence Intervals

Estimated coefficient variances and covariances capture the precision of regression coefficient estimates.

Residuals

Residuals are useful for detecting outlying y values and checking the linear regression assumptions with respect to the error term in the regression model.

Durbin-Watson Test

The Durbin-Watson test assesses whether the residuals are autocorrelated.
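
A one-line sketch for a fitted LinearModel mdl:

[p, dw] = dwtest(mdl);   % p-value and Durbin-Watson statistic for the residuals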

Cook’s Distance

Cook's distance is useful for identifying outliers in the X values (observations for predictor variables).
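
For a fitted LinearModel mdl, a sketch of how to inspect Cook's distance:

cooksD = mdl.Diagnostics.CooksDistance;   % one value per observation
plotDiagnostics(mdl, 'cookd')             % plot Cook's distance with a reference line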

Hat Matrix and Leverage

The hat matrix provides a measure of leverage.

Delete-1 Statistics

Delete-1 change in covariance (covratio) identifies the observations that are influential in the regression fit.

Generalized Linear Model Diagnostics

Generalized Linear Models

Generalized linear models use linear methods to describe a potentially nonlinear relationship between predictor terms and a response variable.

Nonlinear Model Diagnostics

Nonlinear Regression

Parametric nonlinear models represent the relationship between a continuous response variable and one or more continuous predictor variables.