Optimal (linear) combination of several binary probabilistic classifiers
Hello everyone,
Based on an EEG training set, I have trained several binary probabilistic classifiers (logistic regression) that each try to predict, on an independent EEG test set, whether a person is currently thinking about a movement (class 1) or not (class 2). All classifiers work above chance level, but they are all far from perfect. I am now wondering whether it is somehow possible to optimally combine my classifiers' outputs, so that I obtain a single probability value per trial that is more reliable than any individual classifier's output. My intuition is that the solution has something to do with "ensemble methods" (https://en.wikipedia.org/wiki/Ensemble_learning); however, I am a bit overwhelmed by all the available methods and algorithms, and unfortunately my knowledge of machine learning is quite limited.
Could anyone perhaps advise me which method would be most suitable for my purpose (an optimal linear combination of several binary probabilistic classifiers) and how I could easily implement it? I am wondering whether a simple linear regression model on the classifier outputs would already do the job (though that would probably not be the best solution, right?).
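For reference, the linear combination described here is essentially "stacking" (stacked generalization): treat each base classifier's predicted probability as a feature and fit a simple logistic-regression meta-model on held-out labels. Below is a minimal sketch using scikit-learn with *simulated* probabilities standing in for the real classifier outputs (all data and numbers are illustrative, not from the question); in MATLAB the same meta-model could be fit with `fitglm` or `mnrfit`. In practice the meta-model must be trained on probabilities from trials the base classifiers did not see (e.g. via cross-validation), or the combination will be overly optimistic.

```python
# Stacking sketch: combine several binary probabilistic classifiers into
# one probability per trial with a logistic-regression meta-model.
# The base-classifier probabilities P are simulated here for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials, n_classifiers = 200, 3

# True labels: 1 = movement imagined, 0 = rest.
y = rng.integers(0, 2, size=n_trials)

# Simulated base-classifier outputs: each column is one classifier's
# probability of class 1, noisy but above chance.
signal = 0.25 * (2 * y[:, None] - 1)                      # +/- 0.25 toward truth
noise = 0.3 * rng.standard_normal((n_trials, n_classifiers))
P = np.clip(0.5 + signal + noise, 0.01, 0.99)

# Meta-model: logistic regression on the stacked probabilities learns the
# "optimal" linear weights and yields one combined probability per trial.
meta = LogisticRegression().fit(P, y)
p_combined = meta.predict_proba(P)[:, 1]

# Compare accuracy of the best single classifier vs. the combination.
acc_base = max(((P[:, j] > 0.5) == y).mean() for j in range(n_classifiers))
acc_combined = ((p_combined > 0.5) == y).mean()
print(acc_base, acc_combined)
```

The learned weights (`meta.coef_`) show how much each classifier contributes; a classifier that adds no independent information gets a weight near zero, which is why stacking usually beats simple averaging when the base classifiers differ in quality.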
I would be very thankful for any help. Cheers