SPSS Multiple Linear Regression and Stepwise Regression Tutorial

Click Analyze -> Regression -> Linear, and the Linear Regression dialog appears as shown in the figure. [screenshot: Linear Regression dialog]

Select the independent and dependent variables: click a variable in the list on the left, then click the arrow button to move it into the Independent(s) or Dependent box.

 

Click Statistics and additionally check Collinearity diagnostics and Durbin-Watson, then click Continue. Next, click Plots. [screenshots: Statistics dialog, Plots button]

Set the Plots dialog as shown in the figure. [screenshot: Plots dialog]

Explanation: ----------------------------------------------------------------------------

Durbin-Watson: tests whether the residuals are independent of one another.

Collinearity diagnostics: tests for collinearity among the independent variables.

Plots: plots the residuals, with the standardized predicted values on the x-axis and the residuals on the y-axis.

--------------------------------------------------------------------------------------------------------------

Click OK, and the output appears.

Note: the highlighted value is the Durbin-Watson statistic; the closer it is to 2, the more independent the residuals are.
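SPSS reports the Durbin-Watson statistic for you; for intuition, the statistic itself is easy to compute by hand. A minimal NumPy sketch on simulated residuals (not the tutorial's data):

```python
import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson statistic: sum of squared successive differences
    divided by the sum of squared residuals. Values near 2 suggest
    the residuals are uncorrelated (independent)."""
    diff = np.diff(residuals)
    return np.sum(diff ** 2) / np.sum(residuals ** 2)

# Simulated residuals with no serial correlation -> statistic near 2
rng = np.random.default_rng(0)
resid = rng.standard_normal(200)
dw = durbin_watson(resid)
print(round(dw, 2))
```

With serially correlated residuals the statistic drifts toward 0 (positive autocorrelation) or 4 (negative autocorrelation).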

[histogram of residuals] The residuals are independent and approximately normally distributed.

 

 

[normal P-P plot] The closer the points fall to the straight line, the closer the distribution is to normal.

 

[scatter plot of residuals vs. standardized predicted values] The points are randomly scattered above and below 0, with few outliers and no trend, so the residuals are stable.

 

[ANOVA table] Sig. (significance) < 0.001 indicates the model is statistically significant.

 

[Model Summary] The closer R Square is to 1, the better the model fits.

 

[Coefficients table] The unstandardized coefficients give the multiple linear regression equation: y = -33.960 + 6.199 × cap thickness + ...
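The Coefficients table is where SPSS reports the intercept and slopes it found by least squares. The same fit can be sketched in NumPy; the data below are simulated, with the tutorial's intercept (-33.960) and cap-thickness slope (6.199) reused as the true values of an invented two-predictor model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.uniform(0, 10, n)   # stands in for "cap thickness" (illustrative)
x2 = rng.uniform(0, 5, n)    # a second, invented predictor
y = -33.960 + 6.199 * x1 + 2.0 * x2 + rng.normal(0, 0.1, n)

# Design matrix with a leading column of ones for the constant term
X = np.column_stack([np.ones(n), x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # close to [-33.960, 6.199, 2.0]
```

The first element of `coef` is the constant, the rest are the slopes, exactly as laid out in SPSS's Coefficients table.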

 

[Standardized Coefficients] Indicates the degree of influence of each independent variable on the dependent variable; the larger the value, the greater the influence.

 

How to judge multicollinearity:

A.

[Tolerance and VIF columns] A Tolerance < 0.2 indicates multicollinearity. The VIF (variance inflation factor) is 1/Tolerance; a VIF > 5 indicates multicollinearity. These cutoffs vary by discipline.
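Tolerance and VIF come from regressing each independent variable on all the others: tolerance = 1 − R², VIF = 1/tolerance. A NumPy sketch on simulated data, where x2 is deliberately built to be nearly collinear with x1:

```python
import numpy as np

def vif(X):
    """VIF for each column of X: regress that column on the others;
    tolerance = 1 - R^2, VIF = 1 / tolerance."""
    vifs = []
    for j in range(X.shape[1]):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])  # add constant
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1 - resid.var() / y.var()
        vifs.append(1.0 / (1.0 - r2))
    return vifs

rng = np.random.default_rng(2)
x1 = rng.standard_normal(200)
x2 = x1 + 0.1 * rng.standard_normal(200)   # nearly collinear with x1
x3 = rng.standard_normal(200)              # independent of both
X = np.column_stack([x1, x2, x3])
v = vif(X)
print([round(f, 1) for f in v])  # x1 and x2 get large VIFs, x3 stays near 1
```

The collinear pair blows past the VIF > 5 cutoff while the independent variable does not, which is exactly the pattern to look for in SPSS's Coefficients table.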

 

B.

[Collinearity Diagnostics table]

In the Eigenvalue column, if the eigenvalues are concentrated on one or a few principal components while the remaining eigenvalues are close to 0, multicollinearity is present. As a rule of thumb, a Condition Index > 30 indicates multicollinearity.

If a single principal component has a large Variance Proportion on several independent variables at once (for example, variables 3, 4, and 5), those variables are multicollinear.

A multiple linear regression fitted in the presence of multicollinearity is unreliable.
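The eigenvalue/condition-index diagnostic can also be sketched: scale each column of the design matrix (including the constant) to unit length, take the eigenvalues of X′X, and form condition indices as √(λ_max/λ_j). This only approximates SPSS's exact procedure, and the data are simulated:

```python
import numpy as np

rng = np.random.default_rng(3)
x1 = rng.standard_normal(200)
x2 = x1 + 0.01 * rng.standard_normal(200)   # nearly identical to x1
X = np.column_stack([np.ones(200), x1, x2])  # constant + 2 predictors

# Scale each column to unit length, then take eigenvalues of X'X
Xs = X / np.linalg.norm(X, axis=0)
eig = np.sort(np.linalg.eigvalsh(Xs.T @ Xs))[::-1]   # descending
cond_index = np.sqrt(eig[0] / eig)
print(cond_index)  # largest condition index far above 30
```

The near-duplicate column drives the smallest eigenvalue toward 0, which is what makes the largest condition index explode past the rule-of-thumb threshold of 30.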

 

 Solution: Stepwise regression analysis

[screenshot: Linear Regression dialog]

Change the Method drop-down to Stepwise [screenshot: Method drop-down] and leave everything else unchanged.

The entry and removal significance levels are set here: [screenshot: Options dialog]

Note: [screenshot: Coefficients table of the stepwise model]

Look carefully: the Constant row should be excluded; multicollinearity is judged only on the independent variables.

[screenshot: Tolerance and VIF values in the stepwise output]

So there is no multicollinearity in this model.
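Stepwise selection itself is just a greedy loop over candidate variables. The sketch below implements forward selection only, using |t| > 2 as a rough stand-in for the 0.05 entry p-value (SPSS also removes variables at the 0.10 level, which this sketch omits); the data are simulated, with x2 a near-copy of x1:

```python
import numpy as np

def tstats(X, y):
    """OLS coefficients and their t-statistics (X includes the constant)."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (n - k)                 # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)            # coefficient covariance
    return beta, beta / np.sqrt(np.diag(cov))

def forward_stepwise(predictors, y, t_enter=2.0):
    """Greedy forward selection: each round, try every unused predictor
    and admit the one with the largest |t| above the entry threshold."""
    n = len(y)
    selected = []
    while True:
        best, best_t = None, t_enter
        for name, x in predictors.items():
            if name in selected:
                continue
            cols = [np.ones(n)] + [predictors[s] for s in selected] + [x]
            _, t = tstats(np.column_stack(cols), y)
            if abs(t[-1]) > best_t:
                best, best_t = name, abs(t[-1])
        if best is None:
            return selected
        selected.append(best)

rng = np.random.default_rng(4)
x1 = rng.standard_normal(300)
x2 = x1 + 0.05 * rng.standard_normal(300)    # collinear copy of x1
x3 = rng.standard_normal(300)
y = 3 * x1 + 2 * x3 + rng.standard_normal(300)
chosen = forward_stepwise({"x1": x1, "x2": x2, "x3": x3}, y)
print(chosen)  # x3 plus one of the collinear pair
```

Because x1 and x2 carry the same information, once one of them enters, the other's partial t-statistic collapses, which is how stepwise selection sidesteps multicollinearity.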

  Original learning video: multiple linear regression + stepwise regression

 

[Standardized Coefficients] The standardized coefficient is the direct path coefficient; comparisons should use absolute values.
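A standardized coefficient is the raw slope rescaled to unitless form, beta_j = b_j · sd(x_j)/sd(y), which is why influences are compared by absolute value rather than by the raw slopes. A NumPy sketch on simulated data where the variable with the much smaller raw slope is actually the more influential one:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
x1 = rng.normal(0, 10, n)    # large-scale variable, small raw slope
x2 = rng.normal(0, 0.1, n)   # small-scale variable, large raw slope
y = 0.5 * x1 + 40 * x2 + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Standardized (beta) coefficient: raw slope times sd(x)/sd(y);
# removes units so the two influences are directly comparable
beta1 = b[1] * x1.std() / y.std()
beta2 = b[2] * x2.std() / y.std()
print(round(beta1, 2), round(beta2, 2))
```

Here the raw slopes (0.5 vs. 40) would suggest x2 dominates, but the standardized coefficients show x1 contributes more, which is the comparison SPSS's Standardized Coefficients column supports.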

Origin blog.csdn.net/mengzhilv11/article/details/125746289