The Mountain Gazelle Optimizer (MGO) is a meta-heuristic algorithm inspired by the social structure of wild mountain gazelles, but it suffers from slow convergence. A modified MGO (mMGO) therefore applies the Joint Opposite Selection (JOS) operator, which combines Selective Leading Opposition (SLO) and Dynamic Opposite Learning (DO), to improve MGO. This study evaluates mMGO, paired with the k-Nearest Neighbor (kNN) classifier, for predicting brain stroke on data sets taken from Kaggle. Performance was also assessed on the challenging CEC 2020 benchmark test functions, where the statistical results demonstrated the superiority of mMGO over seven well-known optimization algorithms. Furthermore, the experimental results of mMGO-kNN for classifying the brain stroke data sets showed that it outperformed its competitors on all data sets, with an overall accuracy of 95.5%, a sensitivity of 99.34%, a specificity of 98.99%, and a precision of 99.21%.
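The Dynamic Opposite (DO) half of the JOS operator can be illustrated with a minimal sketch. This is not the submission's MATLAB code; it assumes the commonly published DOL update x_do = x + w·r1·(r2·(lb + ub − x) − x) with random r1, r2 ∈ U(0, 1), and the weight `w` and the bound-clipping step are assumptions for illustration only.

```python
import numpy as np

def dynamic_opposite(population, lower, upper, w=3.0, rng=None):
    """Sketch of a Dynamic Opposite Learning (DOL) step.

    For each candidate x, form the classic opposite point lb + ub - x,
    then jump a random, weighted fraction toward/past it:
        x_do = x + w * r1 * (r2 * (lb + ub - x) - x)
    with r1, r2 ~ U(0, 1) drawn per element, finally clipped to bounds.
    """
    rng = np.random.default_rng() if rng is None else rng
    opposite = lower + upper - population        # classic opposite point
    r1 = rng.random(population.shape)
    r2 = rng.random(population.shape)
    candidates = population + w * r1 * (r2 * opposite - population)
    return np.clip(candidates, lower, upper)     # keep within search bounds

# Example: apply one DOL step to a small 2-D population in [0, 1]^2
pop = np.array([[0.2, 0.8],
                [0.5, 0.1]])
new_pop = dynamic_opposite(pop, 0.0, 1.0, rng=np.random.default_rng(0))
```

In the full algorithm, such opposite candidates would be kept only if their fitness (here, kNN classification accuracy on the stroke data) beats the originals; that greedy selection step is omitted above.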
Cite As
Prof. Dr. Essam H Houssein (2026). mMGO for Brain Stroke Classification (https://kr.mathworks.com/matlabcentral/fileexchange/157441-mmgo-for-brain-stroke-classification), MATLAB Central File Exchange. Retrieved .
| Version | Published | Release Notes | Action |
|---|---|---|---|
| 1.0.0 | | | |
