On-line regression

On-line learning algorithms are not restricted to classification problems. The update rule of the kernel adatron algorithm also suggests a general methodology for creating on-line versions of such optimisation problems: the first update of the kernel adatron is equivalent to

αᵢ ← αᵢ + ∂W(α)/∂αᵢ,

making it a simple gradient ascent algorithm augmented with corrections that ensure the additional constraints remain satisfied. The same approach can be applied, for example, to the linear ε-insensitive loss version of the support vector regression algorithm to obtain an on-line regression procedure.
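The on-line update described above can be sketched as follows. This is a minimal illustration, not the book's exact algorithm: the learning rate `eta`, the box constraint `C`, the Gaussian kernel width, and the clipping scheme are all illustrative assumptions. Each dual coefficient is nudged by the gradient of the objective whenever the training point falls outside the ε-tube.

```python
import numpy as np

def gaussian_kernel(X, x, sigma=1.0):
    # RBF kernel between each row of X and a single point x.
    d = np.sum((X - x) ** 2, axis=1)
    return np.exp(-d / (2 * sigma ** 2))

def online_svr(X, y, epsilon=0.1, C=10.0, eta=0.1, epochs=50):
    # beta_i plays the role of alpha_i - alpha_hat_i in the SVR dual.
    n = X.shape[0]
    beta = np.zeros(n)
    for _ in range(epochs):
        for i in range(n):
            # Current prediction at x_i from all dual coefficients.
            f_i = beta @ gaussian_kernel(X, X[i])
            e = y[i] - f_i
            # Update only when the error leaves the epsilon-tube;
            # coefficients of in-tube points stay zero, which is
            # what produces a sparse set of support vectors.
            if abs(e) > epsilon:
                beta[i] += eta * (e - epsilon * np.sign(e))
                beta[i] = np.clip(beta[i], -C, C)  # box constraint
    return beta

# Usage: fit a noisy sine curve on-line.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
beta = online_svr(X, y)
pred = np.array([beta @ gaussian_kernel(X, x) for x in X])
```

Because each step touches a single training point, the same loop can be run as data arrives, which is the sense in which the procedure is on-line.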
One of the advantages of Support Vector Machines, and of Support Vector Regression as a special case, is that they avoid the difficulties of working with linear functions directly in the high-dimensional feature space: the optimisation problem is transformed into a dual convex quadratic programme. In the regression case the loss function penalises only errors greater than the threshold ε. Such loss functions usually lead to a sparse representation of the decision rule, giving significant algorithmic and representational advantages.
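The ε-insensitive loss mentioned above can be written down in a few lines. This is a sketch for illustration (the function name is an assumption): errors inside the tube of width ε cost nothing, which is why training points whose residuals stay inside the tube contribute nothing to the decision rule.

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, epsilon=0.1):
    # Linear epsilon-insensitive loss: max(0, |error| - epsilon).
    return np.maximum(0.0, np.abs(y_true - y_pred) - epsilon)

# Residuals of five hypothetical points against a zero prediction.
errors = np.array([-0.3, -0.05, 0.0, 0.08, 0.25])
loss = eps_insensitive_loss(errors, np.zeros_like(errors))
# Points with |error| <= 0.1 incur zero loss and would not
# become support vectors.
```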
References:
- John Shawe-Taylor & Nello Cristianini, Kernel Methods for Pattern Analysis.
- http://kernelsvm.tripod.com/
Citation format:
Bhartendu (2024). Support Vector Regression (https://www.mathworks.com/matlabcentral/fileexchange/63060-support-vector-regression), MATLAB Central File Exchange. Retrieved: .
Version: 1.0.0.0