How to display weight distribution in hidden layers of neural network?
![](https://www.mathworks.com/matlabcentral/answers/uploaded_files/167219/image.jpeg)
I have 8 inputs in the input layer. Now I want to display the weight distribution of these 8 inputs in the hidden layer to observe the importance of the features. To make it clearer, an example is shown in this figure: https://pasteboard.co/GKCpA6Q.png. I used MATLAB's `plotwb` function, but it did not display the weights of every input.
What I actually want to look at are the weights connecting the inputs to the first hidden layer. The larger the weight, the more important the input.
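If you only need the raw input-to-hidden weights rather than the `plotwb` graphic, they are stored on the network object itself. A minimal sketch, assuming a network `net` trained with `feedforwardnet`/`fitnet` on 8 inputs (the variable names are illustrative, and the magnitudes are only comparable if the inputs are on similar scales, e.g. via the default `mapminmax` preprocessing):

```matlab
% Input-to-hidden weight matrix: hiddenSize-by-8 for 8 inputs
W = net.IW{1,1};

% Crude per-input magnitude score: sum of absolute weights
% leaving each input (assumes comparably scaled inputs)
imp = sum(abs(W), 1);

bar(imp)
xlabel('Input index')
ylabel('Sum of |weights| into hidden layer')
```

Note that, as the answer below points out, raw weight magnitudes are an unreliable importance measure when the inputs are correlated.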
Answers (1)
Greg Heath
17 September 2017
That will not work: weight magnitudes alone do not account for the correlations between inputs.
The best way to rank correlated inputs is:
1. Use NO HIDDEN LAYERS!
2. Run 10 or more trials for each case (different random initial weights), using
a. a single input
b. all inputs except the one in a.
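The steps above might be sketched as follows. This is a rough illustration, assuming an 8-by-N input matrix `x` and 1-by-N target vector `t`, and assuming `fitnet([])` builds a network with no hidden layers; the variable names are hypothetical:

```matlab
nTrials = 10;
nInputs = size(x, 1);
mseSingle = zeros(nInputs, 1);

for i = 1:nInputs
    e = zeros(nTrials, 1);
    for k = 1:nTrials
        net = fitnet([]);                 % 1. NO hidden layers (linear model)
        net = train(net, x(i,:), t);      % 2a. train on single input i
        e(k) = mse(net, t, net(x(i,:)));  % error for this trial
    end
    mseSingle(i) = mean(e);               % average over random initializations
end

% Lower error from a single input => that input is more important.
[~, ranking] = sort(mseSingle);

% For case 2b, train instead on x(setdiff(1:nInputs, i), :); a larger
% error increase when input i is removed => input i is more important.
```

Averaging over the 10+ trials reduces the dependence on the random initial weights, which is the point of step 2.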
Hope this helps.
Thank you for formally accepting my answer
Greg