When comparing the network output with the desired output, if there is an error, the weight vector w(k) associated with the ith processing unit at time instant k is corrected (adjusted) as
w(k+1) = w(k) + D[w(k)]
where D[w(k)] is the change in the weight vector; it is given explicitly by each learning rule.
The perceptron learning rule is given by:
w(k+1) = w(k) + eta*[ y(k) - sgn(w'(k)*x(k)) ]*x(k)
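The update rule above can be sketched as follows. The submission itself is MATLAB; this is a minimal illustrative Python translation, and the function names (`perceptron_train`, `sgn`) and the AND-gate training data are assumptions introduced here, not part of the original code.

```python
import numpy as np

def sgn(v):
    # Sign function mapping to {-1, +1}; sgn(0) is taken as +1 here.
    return 1.0 if v >= 0 else -1.0

def perceptron_train(X, y, eta=0.1, epochs=100):
    """Apply the rule w(k+1) = w(k) + eta*(y(k) - sgn(w'(k)*x(k)))*x(k).

    X: (n_samples, n_features) array; a bias input x0 = 1 is assumed to
       be included as the first column by the caller.
    y: desired outputs in {-1, +1}.
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        errors = 0
        for x_k, y_k in zip(X, y):
            e = y_k - sgn(w @ x_k)        # error term y(k) - sgn(w'(k)*x(k))
            if e != 0:
                w = w + eta * e * x_k     # weight correction D[w(k)]
                errors += 1
        if errors == 0:                   # all samples classified correctly
            break
    return w

# Example: learn the linearly separable AND function (labels in {-1, +1}),
# with the bias handled via a constant first input x0 = 1.
X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1], dtype=float)
w = perceptron_train(X, y)
preds = [sgn(w @ x) for x in X]           # matches y after convergence
```

Because the error term is zero whenever the sample is already classified correctly, weights change only on mistakes; for linearly separable data (like AND) the loop terminates in finitely many epochs by the perceptron convergence theorem.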
Cite As
Bhartendu (2024). Perceptron Learning (https://www.mathworks.com/matlabcentral/fileexchange/63046-perceptron-learning), MATLAB Central File Exchange. Retrieved .
MATLAB Release Compatibility
Created with: R2016a
Compatible with any release
Platform Compatibility
Windows, macOS, Linux
Version | Published | Release Notes
---|---|---
1.0.0.0 | |