How does multiclass logistic regression's optimization work?
Hi all,
I am working on a 3-class classification problem (call the classes 1, 2, and 3) with two different datasets (call them A and B). I would like to use a multiclass logistic regression algorithm for this purpose. When I applied fitcecoc with a templateLinear logistic learner to Dataset A, I got results I would consider normal. However, when I applied the same code to Dataset B, every observation was classified as Class 1.
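For context, here is my rough mental model of what fitcecoc is doing under the hood (please correct me if this is wrong). As I understand it, it trains one binary logistic learner per column of the ECOC coding matrix (one-vs-one by default) and combines their scores at prediction time:
% Rough sketch of my understanding of the fitcecoc decomposition
% (variable names are the same as in my code below)
t = templateLinear('Learner', 'logistic');
Mdl = fitcecoc(train_data, categorical(train_label), 'Learners', t);
disp(Mdl.CodingMatrix)     % rows = classes, columns = binary subproblems
disp(Mdl.BinaryLearners)   % one linear logistic model per column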
I was thinking that the optimization might be the issue, so I tried a few different initial conditions. I have p predictor variables that I am training on. The default initial coefficients ('Beta') for templateLinear are zeros(p,1). When I set the initial coefficients to a ones vector scaled by a large positive constant, nothing changed. However, when I scaled the ones vector by a sufficiently negative constant (less than -0.1), every classification switched from Class 1 to Class 3.
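Since fitcecoc doesn't seem to return the optimizer's FitInfo for each binary learner, I also tried refitting one binary subproblem directly with fitclinear so I could look at the convergence diagnostics. This is only a rough sketch: the one-vs-rest split here is my own choice and is not the same split fitcecoc uses with its default one-vs-one coding.
% Fit one binary subproblem directly to inspect convergence diagnostics.
% The class-1-vs-rest split below is my own simplification.
y_bin = train_label == 1;                    % class 1 vs. the rest
beta0 = -50*ones(size(train_data, 2), 1);    % same style of initial coefficients
[mdl_bin, fit_info] = fitclinear(train_data, y_bin, ...
    'Learner', 'logistic', 'Beta', beta0);
disp(fit_info)   % objective value, termination status, etc.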
What confuses me most is that this same code was working for a similarly structured dataset. Is there something wrong with my approach? Or am I dealing with some sort of convergence issue?
The relevant pieces of code are given below.
% Initialize all linear coefficients to a single constant (p = 8 predictors)
initial_condition = -50;
model_template = templateLinear('Learner', 'logistic', 'Beta', initial_condition*ones(8,1));
% Train a multiclass ECOC model from binary logistic learners
GLM = fitcecoc(train_data, categorical(train_label), 'Learners', model_template);
yfit = predict(GLM, test_data);
% Confusion matrix with a fixed class order
cm = confusionmat(categorical(test_label), yfit, 'Order', categorical([1 2 3]));
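For completeness, one variation I am also planning to try, in case the problem is feature scaling rather than the initial conditions (the regularized linear solver seems sensitive to predictor scale, and I have not ruled out that Dataset B is scaled very differently). Names follow my code above:
% Standardize predictors using training-set statistics, then refit
[train_std, mu, sigma] = zscore(train_data);
test_std = (test_data - mu) ./ sigma;   % apply the same scaling to the test set
GLM_std = fitcecoc(train_std, categorical(train_label), 'Learners', model_template);
yfit_std = predict(GLM_std, test_std);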