Am I computing cross entropy incorrectly?
I am working on a neural network and would like to use cross entropy as my error function. I noticed from a previous question that MATLAB added this functionality starting with R2013b. I decided to test the crossentropy function by running the simple example provided in the documentation. The code is reprinted below for convenience:
[x,t] = iris_dataset;
net = patternnet(10);
net = train(net,x,t);
y = net(x);
perf = crossentropy(net,t,y)
When I run this code, I get perf = 0.0367. To verify this result, I ran the code:
ce = -mean(sum(t.*log(y)+(1-t).*log(1-y)))
which resulted in ce = 0.1100. Why are perf and ce unequal? Do I have an error in my calculation?
Accepted Answer
Greg Heath
21 Aug 2014
You are using the cross-entropy (Xent) form for outputs and targets that do not have to sum to 1. The corresponding output transfer function is logsig.
For targets that are constrained to sum to 1, use softmax and only the first term of the sum.
For extensive discussions, search in comp.ai.neural-nets using
greg cross entropy
Hope this helps.
Thank you for formally accepting my answer.
Greg
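To make this distinction concrete, here is a minimal sketch assuming t and y from the iris example in the question (3-by-150 one-hot targets and network outputs). The exact value returned by crossentropy also depends on its normalization and regularization options, so treat this as an illustration rather than a definition of the function:
% Two-term ("logsig") form used in the question's manual calculation:
% sums over the 3 classes, then averages over the 150 samples only.
ceLogsig = -mean(sum(t.*log(y) + (1-t).*log(1-y)));
% One-term ("softmax") form for targets constrained to sum to 1:
% only -t.*log(y), averaged over all numel(t) = 3*150 elements, which
% appears to be what crossentropy(net,t,y) reports for this example.
ceSoftmax = -sum(sum(t.*log(y)))/numel(t);
% For a confident network the (1-t).*log(1-y) terms are near zero, so the
% two values differ mainly by the factor-of-3 normalization, consistent
% with the numbers in the question (0.1100 is roughly 3 * 0.0367).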
2 Comments
Matthew Eicholtz
21 Aug 2014
Edited: Matthew Eicholtz on 21 Aug 2014
Greg Heath
21 Aug 2014
You are welcome for the reply. It did answer your question.
The next time you check, make sure that you initialize the RNG before you train so that you can duplicate your calculation.
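For example, a minimal sketch of that (the seed value 0 is arbitrary):
rng(0);  % fix the random number generator so initialization and data division repeat
[x,t] = iris_dataset;
net = patternnet(10);
net = train(net,x,t);
y = net(x);
perf = crossentropy(net,t,y)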
Additional Answers (3)

Or Shamir
23 Sep 2017
ce = -t .* log(y);
perf = sum(ce(:))/numel(ce);
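A quick check of this against the built-in function (a sketch reusing net, t, and y from the question; the variable names are only for illustration, and with default settings the two values should agree):
ce = -t .* log(y);                   % element-wise cross entropy for one-hot targets
perfManual  = sum(ce(:))/numel(ce)   % mean over all elements
perfBuiltin = crossentropy(net,t,y)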
1 Comment
Greg Heath
26 Sep 2017
Isn't that the same as
perf = mean(ce(:)); % ?
Tian Li
13 Oct 2017
ce = -t .* log(y); perf = sum(ce(:))/numel(ce);
This is the right answer for a multi-class classification error problem.
1 Comment
Greg Heath
15 Oct 2017
Why do you think that is different from the last 2 answers???