Weight analysis - entropy weight method

1. Function

Weight analysis uses the entropy weight method to output weights reflecting the importance of questionnaire survey indicators. According to the definition of information entropy, the entropy value of an indicator can be used to judge its degree of dispersion. The smaller the information entropy, the greater the dispersion of the indicator, and the greater its impact (i.e., weight) on the comprehensive evaluation. If all values of an indicator are equal, that indicator plays no role in the comprehensive evaluation. Information entropy can therefore be used to calculate the weight of each indicator and provide a basis for multi-indicator comprehensive evaluation.

2. Input and output description

Input: at least two quantitative variables (positive and/or negative indicators); the data are generally required to be scale (continuous) data
Output: the weight value corresponding to each input quantitative variable

3. Case example

For example, a company carries out weight analysis on the 8 assessment indicators of its 6 departments to obtain the weight ratio of each assessment indicator.

4. Modeling steps

Entropy is a concept in information theory and a measure of uncertainty. The greater the amount of information, the smaller the uncertainty and the smaller the entropy; the smaller the amount of information, the greater the uncertainty and the greater the entropy. According to the definition of information entropy, the entropy value of an indicator can be used to judge its degree of dispersion: the smaller the entropy value, the greater the dispersion of the indicator, and the greater its impact (i.e., weight) on the comprehensive evaluation.
1. Normalize each element under each indicator (min-max normalization). Let x_{ij} be the value of the i-th element under the j-th indicator.
For positive indicators:

x'_{ij} = \frac{x_{ij} - \min_i x_{ij}}{\max_i x_{ij} - \min_i x_{ij}}

For negative indicators:

x'_{ij} = \frac{\max_i x_{ij} - x_{ij}}{\max_i x_{ij} - \min_i x_{ij}}

The ratio y_{ij} of the i-th element under the j-th indicator is:

y_{ij} = \frac{x'_{ij}}{\sum_{i=1}^{m} x'_{ij}}

In the above formulas, m is the number of elements considered.
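The normalization and ratio steps above can be sketched in Python (a minimal illustration; the sample values and function names are hypothetical, not from the original article):

```python
# Min-max normalization and ratio computation for the entropy weight method.
# One list = one indicator's values over the m elements (samples).

def normalize_column(values, positive=True):
    """Min-max normalize one indicator's values.

    Positive indicators: (x - min) / (max - min)
    Negative indicators: (max - x) / (max - min)
    """
    lo, hi = min(values), max(values)
    if hi == lo:                      # constant column: no dispersion at all
        return [0.0 for _ in values]
    if positive:
        return [(x - lo) / (hi - lo) for x in values]
    return [(hi - x) / (hi - lo) for x in values]

def ratios(column):
    """Ratio y_ij = x'_ij / sum_i x'_ij for one normalized indicator."""
    total = sum(column)
    return [x / total for x in column] if total else [0.0 for _ in column]

# Example: one positive indicator measured over m = 4 elements
norm = normalize_column([2.0, 4.0, 6.0, 10.0], positive=True)
y = ratios(norm)
```

The ratios y_ij sum to 1 for each indicator, which is what allows them to be treated as a probability distribution in the entropy step.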

2. The information entropy of the j-th indicator is:

e_j = -K \sum_{i=1}^{m} y_{ij} \ln y_{ij}

where K = 1 / \ln m is a constant that guarantees 0 ≤ e_j ≤ 1 (terms with y_{ij} = 0 are taken as 0 by convention).

The information entropy redundancy of the j-th indicator is: d_j = 1 - e_j.
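The entropy and redundancy formulas can be sketched as follows, assuming the ratios y_ij for one indicator have already been computed (the function names are illustrative):

```python
import math

def entropy(y):
    """Information entropy e_j = -K * sum_i y_ij * ln(y_ij), with K = 1 / ln(m).

    m is the number of elements; terms with y_ij == 0 contribute 0 by convention.
    """
    m = len(y)
    k = 1.0 / math.log(m)
    return -k * sum(p * math.log(p) for p in y if p > 0)

def redundancy(y):
    """Information entropy redundancy d_j = 1 - e_j."""
    return 1.0 - entropy(y)

# A uniform distribution has maximum entropy (e_j = 1), hence zero redundancy:
e_uniform = entropy([0.25, 0.25, 0.25, 0.25])
d_uniform = redundancy([0.25, 0.25, 0.25, 0.25])
```

The constant K = 1/ln m is exactly what scales e_j into [0, 1]: a perfectly uniform indicator reaches e_j = 1 and so contributes nothing (d_j = 0) to the weights.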
3. The weight of the j-th indicator is:

w_j = \frac{d_j}{\sum_{j=1}^{n} d_j}

where n is the number of indicators.
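Putting the three steps together, a minimal end-to-end sketch (the data matrix is hypothetical; rows are elements, columns are indicators, all assumed already normalized and positive):

```python
import math

def entropy_weights(matrix):
    """Compute entropy weights for an m x n matrix of normalized,
    non-negative data. Rows = m elements, columns = n indicators."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)                      # K = 1 / ln(m)
    d = []
    for j in range(n):
        col = [matrix[i][j] for i in range(m)]
        total = sum(col)
        y = [x / total for x in col] if total else [0.0] * m
        e = -k * sum(p * math.log(p) for p in y if p > 0)
        d.append(1.0 - e)                      # redundancy d_j = 1 - e_j
    total_d = sum(d)
    return [dj / total_d for dj in d]          # w_j = d_j / sum_j d_j

# Two indicators over four elements: the first is constant, the second varies,
# so all the weight should go to the second indicator.
data = [
    [0.5, 0.0],
    [0.5, 0.1],
    [0.5, 0.4],
    [0.5, 1.0],
]
w = entropy_weights(data)
```

The constant first column gets weight 0, matching the earlier observation that an indicator whose values are all equal plays no role in the comprehensive evaluation.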
Origin blog.csdn.net/weixin_60466670/article/details/125950498