Hi,
I've read that it is good practice to normalize data before training a neural network.
There are different ways of normalizing data.
Does the data have to be normalized between 0 and 1, or can it be done with the standardize function, which won't necessarily give you numbers between 0 and 1 and could give you negative numbers?
Many thanks

Accepted Answer

Chandra Kurniawan on 10 Jan 2012

1 vote

Hi,
I've heard that artificial neural network training data must be normalized before the training process.
I have code that can normalize your data into any specific range you want.
Say you want to normalize p into the range [0.1, 0.9]:
p = [4 4 3 3 4;
     2 1 2 1 1;
     2 2 2 4 2];   % your data
a = min(p(:));     % global minimum of p
b = max(p(:));     % global maximum of p
ra = 0.9;          % top of the target range
rb = 0.1;          % bottom of the target range
pa = ((ra-rb)*(p-a)/(b-a)) + rb;
Then pa is your normalized data: the minimum of p maps to rb = 0.1 and the maximum maps to ra = 0.9.
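If you want to check this arithmetic outside MATLAB, here is a minimal NumPy sketch of the same mapping (the names p, a, b, ra, rb, pa mirror the MATLAB snippet above; NumPy is just my choice for illustration):

```python
import numpy as np

# Same sample data as the MATLAB snippet above
p = np.array([[4, 4, 3, 3, 4],
              [2, 1, 2, 1, 1],
              [2, 2, 2, 4, 2]], dtype=float)

a, b = p.min(), p.max()   # global min and max of the matrix
ra, rb = 0.9, 0.1         # target range [rb, ra] = [0.1, 0.9]

pa = (ra - rb) * (p - a) / (b - a) + rb

# The smallest entry of p maps exactly to 0.1, the largest to 0.9
print(pa.min(), pa.max())
```

Any entry equal to the global minimum lands on rb, and any entry equal to the global maximum lands on ra; everything else is interpolated linearly in between.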

4 Comments

Greg Heath on 11 Jan 2012
Demos in the comp.ai.neural-nets FAQ indicate that better precision is obtained when the input data is relatively balanced about 0 AND TANSIG (instead of LOGSIG) activation functions are used in the hidden layers.
Hope this helps.
Greg
Kaushal Raval on 4 Jul 2015
I want to get my output back in its original form. How can I find it? Please help me.
If you use the standard programs, e.g., FITNET, PATTERNNET, TIMEDELAYNET, NARNET, and NARXNET, all of the normalization and de-normalization is done automatically (==> DON'T WORRY ABOUT IT).
All you have to do is run the example programs in, e.g.,
help fitnet
doc fitnet
If you need additional sample data
help nndatasets
doc nndatasets
For more detailed examples search in the NEWSGROUP and ANSWERS. For example, these search terms give the following NEWSGROUP hit counts:
Search term             2014-15   all-time
tutorial                     58       2575
tutorial neural              16        127
tutorial neural greg         15         58
Hope this helps.
Greg
murat tuna on 22 Mar 2019
Sorry, where is NEWSGROUP?

Sign in to comment.

More Answers (4)

Greg Heath on 11 Jan 2012

3 votes

The best combination to use for an MLP (e.g., NEWFF) with one or more hidden layers is
1. TANSIG hidden layer activation functions
2. EITHER standardization (zero-mean/unit-variance: doc MAPSTD)
OR [-1, 1] normalization ([min, max] => [-1, 1]: doc MAPMINMAX)
Convincing demonstrations are available in the comp.ai.neural-nets FAQ.
For classification among c classes, using columns of the c-dimensional identity matrix eye(c) as targets guarantees that the outputs can be interpreted as valid approximations to input-conditional posterior probabilities. For that reason, the commonly used normalization to [0.1, 0.9] is not recommended.
WARNING: NEWFF automatically uses the MINMAX normalization as a default. Standardization must be explicitly specified.
Hope this helps.
Greg
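To illustrate the eye(c) target scheme described above, here is a hedged NumPy sketch; the 0-based labels and array shapes are my own assumptions for illustration, mirroring MATLAB's convention of one target column per sample:

```python
import numpy as np

c = 3                             # number of classes
labels = np.array([0, 2, 1, 0])   # one class label per sample (0-based)

# Columns of the c-by-c identity matrix as targets,
# i.e. target column k is eye(c)(:, labels(k)) in MATLAB terms
targets = np.eye(c)[:, labels]    # shape (c, n_samples)

# Each target column sums to 1, so network outputs trained against
# these targets can be read as estimates of posterior probabilities.
print(targets)
```

With [0.1, 0.9]-scaled targets the columns would sum to more than 1, which is why that convention is discouraged here.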

4 Comments

John on 11 Jan 2012
Hi Greg,
Thank you for your detailed response.
I'm new to MATLAB, so to be honest I don't really understand some of it. However, I've googled the terms, and I think you're advising me as if I were building my own network?
I intend to use the nprtool GUI to classify data, which is a two-layer feed-forward network. This means I will have to standardize the inputs before I import them into the tool.
What method would you recommend that I use to standardize them? The standardize function, or normalization between 0 and 1?
Many thanks
John
owr on 11 Jan 2012
I don't have access to the Neural Network Toolbox anymore, but if I recall correctly you should be able to generate code from the nprtool GUI (last tab, maybe?). You can use this code to do your work without the GUI, customize it as need be, and also learn from it to gain a deeper understanding.
What I think Greg is referring to above is the fact that the function "newff" (a quick function to initialize a network) uses the built-in normalization (see the toolbox function mapminmax). If you want to change this, you'll have to make some custom changes. I don't recall whether nprtool uses newff - this can be verified by generating and viewing the code.
This is all from memory, as I don't have access to the toolbox anymore - so take my comments as general guidelines, not as absolute.
Good luck.
John on 12 Jan 2012
Thank you
Greg Heath on 13 Jan 2012
Standardization means zero-mean/unit-variance.
My preferences:
1. TANSIG in hidden layers
2. Standardize reals and mixtures of reals and binary.
3. {-1, 1} for binary variables and for reals that have bounds imposed by math or physics.
Hope this helps.
Greg

Sign in to comment.

Greg Heath on 14 Jan 2012

1 vote

In general, if you decide to standardize or normalize, each ROW is treated SEPARATELY.
If you do this, either use MAPSTD, MAPMINMAX, or the following:
[I N] = size(p);
% STANDARDIZATION (each row to zero mean / unit variance)
meanp = repmat(mean(p,2),1,N);
stdp  = repmat(std(p,0,2),1,N);
pstd  = (p-meanp)./stdp;
% NORMALIZATION (each row to the target range [minpn, maxpn], e.g. [-1, 1])
minpn = -1;
maxpn = 1;
minp = repmat(min(p,[],2),1,N);
maxp = repmat(max(p,[],2),1,N);
pn = minpn + (maxpn-minpn).*(p-minp)./(maxp-minp);
Hope this helps
Greg
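As a sanity check on these row-wise formulas, here is a NumPy sketch of the same computation (assuming a target range of [minpn, maxpn] = [-1, 1], which the MATLAB code above leaves for you to choose, and the same sample matrix p as in the accepted answer):

```python
import numpy as np

p = np.array([[4., 4., 3., 3., 4.],
              [2., 1., 2., 1., 1.],
              [2., 2., 2., 4., 2.]])

# Standardization: each ROW to zero mean / unit variance
# (ddof=1 matches MATLAB's std(p,0,2), which divides by N-1)
pstd = (p - p.mean(axis=1, keepdims=True)) / p.std(axis=1, ddof=1, keepdims=True)

# Normalization: each ROW rescaled to [minpn, maxpn] = [-1, 1]
minpn, maxpn = -1.0, 1.0
minp = p.min(axis=1, keepdims=True)
maxp = p.max(axis=1, keepdims=True)
pn = minpn + (maxpn - minpn) * (p - minp) / (maxp - minp)
```

Each row of pstd should come out with mean 0 and sample standard deviation 1, and each row of pn should span exactly [-1, 1].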

4 Comments

John on 16 Jan 2012
Many thanks
fehmi zarzoum on 24 May 2017
Hi, I get: Undefined function or variable 'pmin'.
Greg Heath on 31 May 2017
Yeah, it should be minp.
electronx engr on 4 Nov 2017
Please, can you help me with this: after training with normalized data, how can I get the network (using the GENSIM command) to work on unnormalized input, since I created and trained the network using normalized inputs and outputs?

Sign in to comment.

Sarillee on 25 Mar 2013

0 votes

Try this:
y = (x - min(x)) / (max(x) - min(x))
where x is your input vector and y is the output, normalized to [0, 1].
Imran Babar on 8 May 2013

0 votes

mu_input = mean(trainingInput);
std_input = std(trainingInput);
trainingInput = (trainingInput(:,:) - mu_input(:,1)) / std_input(:,1);
I hope this will serve your purpose

2 Comments

Greg Heath on 10 May 2013
Not valid for matrix inputs
Abul Fujail on 12 Dec 2013
In the case of matrix data, does the min and max value correspond to one column or to the whole dataset? E.g., I have 5 input columns of data; in this case, should I choose the min/max for each column and normalize column by column, or take the min/max over the whole matrix?
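Both options are easy to write down; here is a NumPy sketch contrasting the two (the small matrix d, with samples in rows and input variables in columns, is made up for illustration). Per-column scaling treats each input variable independently, which is the usual choice when the columns are different variables:

```python
import numpy as np

d = np.array([[1., 10.],
              [2., 20.],
              [3., 30.]])   # 3 samples x 2 input columns

# Per-column: each input variable scaled to [0, 1] independently
col = (d - d.min(axis=0)) / (d.max(axis=0) - d.min(axis=0))

# Whole-matrix: one global min/max applied to every entry
glob = (d - d.min()) / (d.max() - d.min())
```

With per-column scaling, both columns span the full [0, 1] range; with the global version, the column whose raw values are small stays squashed near 0, which is usually not what you want when the columns have different units or scales.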

Sign in to comment.


Asked: 10 Jan 2012
Last comment: 22 Mar 2019
