
How to decide inputs and targets for a neural network in a signature verification system?

Hi, I am doing a project on offline signature verification using a neural network. I have prepared a database of 100 signatures (5 genuine and 5 forged signatures from each of 10 persons) and extracted 15 global features from each signature. I have normalized each feature to the range [0, 1]. But I don't know how to train the neural network so that it can distinguish genuine from forged signatures.

Accepted Answer

Greg Heath on 8 August 2012
Edited: Greg Heath on 23 January 2013
For an I-H-O = 15-H-20 classifier with N = 100
[ I N ] = size(input)
[ O N ] = size(target)
Columns of target are columns of eye(20) with the "1" indicating the correct class.
Use patternnet, tansig, softmax, and trainscg.
See the examples and demos in the documentation.
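For example, here is a minimal sketch of that setup. The variable name classIdx and the 20-class labeling (10 persons x {genuine, forged}) are my assumptions for illustration, not part of the original post:

% Assumed: input is the 15x100 feature matrix (features x samples) and
% classIdx is a 1x100 vector with values 1..20 (10 persons x {genuine, forged}).
[I, N] = size(input);                      % I = 15, N = 100
O = 20;                                    % number of classes
target = eye(O);
target = target(:, classIdx);              % O x N; each column is a column of eye(20)

H = 9;                                     % hidden-layer size (to be tuned, see the loop below)
net = patternnet(H, 'trainscg');           % tansig hidden layer, trainscg training
net.layers{end}.transferFcn = 'softmax';   % softmax output, as recommended above
[net, tr] = train(net, input, target);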
If you accept the default data division ratio of trn/val/tst = 0.7/0.15/0.15, the O = 20 outputs will yield Ntrneq = 0.7*N*O = 1400 training equations to estimate Nw = (I+1)*H+(H+1)*O = 20+(15+20+1)*H = 20+36*H unknown weights.
If you do not use validation stopping or regularization (msereg), a reasonable rule of thumb is Ntrneq >> Nw, or
H << (Ntrneq-O)/(I+O+1) = (1400-20)/36 = ~38
Using a factor of at least 2 yields H <~ 19. I would form a double loop over candidate values of H (e.g., 1:2:19 in the outer loop) with an inner loop of Ntrials ~ 10 random weight initializations for each candidate value of H.
Then I would choose the net with the lowest validation error rate.
Finally I would predict the error rate on unseen data using the test error rate.
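A hedged sketch of that search, under the same assumptions as the snippet above (the loop bounds and variable names are illustrative, not from the post):

% Assumes input (15x100) and target (20x100) as constructed earlier.
Hcand   = 1:2:19;          % candidate hidden-layer sizes
Ntrials = 10;              % random weight initializations per candidate
bestValErr = Inf;

for H = Hcand
    for trial = 1:Ntrials
        net = patternnet(H, 'trainscg');
        net.divideParam.trainRatio = 0.70;   % default trn/val/tst = 0.7/0.15/0.15
        net.divideParam.valRatio   = 0.15;
        net.divideParam.testRatio  = 0.15;
        [net, tr] = train(net, input, target);

        out    = net(input);
        % classification error rate on the validation split
        valErr = mean(vec2ind(out(:, tr.valInd)) ~= vec2ind(target(:, tr.valInd)));
        if valErr < bestValErr
            bestValErr = valErr;
            bestNet    = net;
            bestTr     = tr;
        end
    end
end

% estimate the error rate on unseen data from the test split of the chosen net
out     = bestNet(input);
testErr = mean(vec2ind(out(:, bestTr.testInd)) ~= vec2ind(target(:, bestTr.testInd)));
fprintf('Best validation error %.3f, test error %.3f\n', bestValErr, testErr);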
I have posted many examples in both NEWSGROUP and ANSWERS. Searching on
Greg Heath Ntrials Nw
should be sufficient.
Hope this helps.
Thank you for formally accepting my answer.
Greg
  1 Comment
sushant nepal on 9 February 2018
Edited: sushant nepal on 9 February 2018
Why is the output O equal to 20? And why is N 100? There are only 50 genuine signatures. Do the input and target datasets both include the forged signatures as well? I could use some help here. Thanks.


More Answers (1)

Luis Flores on 22 January 2013
Hi:
Here is a link to a post that describes an offline system like yours that uses a multilayer perceptron (MLP) for classification.
You can read that document to see how the topology was chosen and how the training was done.
Regarding the topology, you can follow the approach of using one output node per class, which means you would need two nodes in your output layer. The network should learn to fire the first one for authentic signatures and the second one for forgeries.
Regarding the hidden layer, there are no fixed rules. There are many suggestions on how to size it, but it is really empirical work, so try several options and keep the one that gives you the best results. In my case, working with 12 features, 4 nodes did a very good job.
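For example, a sketch only; the feature matrix input and the 0/1 label vector isGenuine are placeholders I made up for illustration:

% Two output nodes: row 1 fires for genuine signatures, row 2 for forgeries.
% Assumed: input is features x samples, isGenuine is a 1 x N vector of 0/1 labels.
target = [isGenuine; 1 - isGenuine];   % 2 x N target matrix
net = patternnet(4);                   % 4 hidden nodes as a starting point; tune empirically
net = train(net, input, target);
scores = net(input);                   % column k: [score(genuine); score(forged)] for sample k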
Hope this helps...
