I have always been a little fuzzy on normalization of data characteristics, so while reviewing algorithms today I am summarizing the purpose of normalization and the common ways to do it.
Concept: normalizing feature values eliminates the effects caused by features having different orders of magnitude. Normalization takes the data you need to process and, via some algorithm, confines it to a required range. This first of all makes later data processing easier, and secondly speeds up the convergence of the program when it runs.
Methods:
1. Linear function conversion (min-max normalization)
y = (x - MinValue) / (MaxValue - MinValue)
Description: x and y are the values before and after conversion, respectively; MaxValue and MinValue are the maximum and minimum of the sample.
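A minimal sketch of this linear conversion in Python (the function name `min_max_normalize` is my own choice, not from the original text):

```python
def min_max_normalize(values):
    """Linear (min-max) scaling: maps each value into [0, 1]."""
    min_v, max_v = min(values), max(values)
    # Each value becomes its distance from the minimum,
    # divided by the full range of the sample.
    return [(x - min_v) / (max_v - min_v) for x in values]

print(min_max_normalize([2.0, 4.0, 10.0]))  # -> [0.0, 0.25, 1.0]
```

Note that the minimum of the sample always maps to 0 and the maximum to 1; if all values are equal, the range is zero and the division fails, so a real implementation should guard against that case.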
2. Logarithmic conversion function, with the following expression:
y = log10(x)
Description: conversion using the base-10 logarithm.
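The log conversion can be sketched as follows; it compresses large positive values, and only applies to x > 0:

```python
import math

def log10_normalize(values):
    """Base-10 logarithmic conversion: y = log10(x), for x > 0."""
    return [math.log10(x) for x in values]

print(log10_normalize([1.0, 10.0, 100.0]))  # -> [0.0, 1.0, 2.0]
```

As the example shows, the raw log10 values are not confined to [0, 1]; to land in that interval one would additionally divide by log10 of the sample maximum, a detail the formula above leaves out.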
3. Arctangent conversion function, with the following expression:
y = arctan(x) * 2 / PI
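A sketch of the arctangent conversion; since arctan maps the real line into (-PI/2, PI/2), multiplying by 2/PI maps any real x into (-1, 1):

```python
import math

def arctan_normalize(values):
    """Arctangent scaling: y = arctan(x) * 2 / pi, maps x into (-1, 1)."""
    return [math.atan(x) * 2.0 / math.pi for x in values]

print(arctan_normalize([0.0, 1.0, -1.0]))  # -> [0.0, 0.5, -0.5]
```

Unlike min-max scaling, this transform needs no knowledge of the sample's minimum or maximum, so it also works on streaming data, but values far from zero all get squashed close to 1 or -1.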
In statistics, the specific role of normalization is to summarize the statistical distribution of a unified sample: normalizing to the range 0 to 1 gives a statistical probability distribution, while normalizing to the range -1 to +1 gives a statistical coordinate distribution.
Data normalization is necessary: it brings data with different features onto a comparable scale, which makes the data easier to process.