Click Analyze -> Regression -> Linear; the dialog shown in the figure appears.
Select the dependent and independent variables: click a variable in the list on the left, then click the arrow button to move it into the Dependent or Independent(s) box.
Click Statistics, additionally check Collinearity diagnostics, then click Continue. Next, click Plots and set it up as shown in the figure.
Explanation: ----------------------------------------------------------------------------
Durbin-Watson: tests whether the residuals are independent.
Collinearity diagnostics: tests for collinearity among the independent variables.
Residual plot: the X axis is the standardized predicted value (*ZPRED), the Y axis is the residual (*ZRESID).
--------------------------------------------------------------------------------------------------------------
Click OK and the output appears.
Note: this is the Durbin-Watson statistic; the closer the value is to 2, the more independent the residuals are.
Here the residuals are independent and approximately normal.
In the P-P plot, the closer the points fall to the straight line, the closer the residuals are to a normal distribution.
In the residual scatter plot, the points are randomly distributed above and below 0, there are few outliers, and there is no trend, so the residual variance is stable.
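The residual checks described above can be reproduced outside SPSS. A minimal sketch on simulated data (the variables here are made up for illustration): a Q-Q/P-P style straightness check for normality, and the scatter of residuals against standardized predicted values.

```python
# Minimal residual-assumption checks on simulated data (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 300)
y = 3.0 + 1.5 * x + rng.normal(0, 1, 300)

# Fit by least squares, then compute residuals and standardized fitted values.
slope, intercept = np.polyfit(x, y, 1)
fitted = intercept + slope * x
resid = y - fitted
z_fitted = (fitted - fitted.mean()) / fitted.std()   # X axis of the SPSS plot

# Q-Q straight-line fit: correlation r close to 1 suggests normal residuals.
(osm, osr), (qq_slope, qq_intercept, r) = stats.probplot(resid, dist="norm")
print(round(r, 3))

# No trend: correlation between standardized fitted values and residuals ~ 0.
print(round(float(np.corrcoef(z_fitted, resid)[0, 1]), 3))
```

If the residuals were skewed or fanned out, r would drop below 1 and the scatter would show a visible pattern instead of a band around 0.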
Sig. (significance) < 0.001 indicates the overall regression is significant, which is a good result.
The closer R Square is to 1, the better the fit.
This gives the multiple linear regression equation: y = -33.960 + 6.199 × (cap thickness) + .....
The standardized coefficients indicate the degree of influence of each independent variable on the dependent variable; the larger the absolute value, the greater the influence.
Multicollinearity judgment method:
A.
Tolerance < 0.2 indicates multicollinearity. VIF (variance inflation factor) = 1 / tolerance; VIF > 5 indicates multicollinearity, though the cutoff varies across disciplines.
B.
In the eigenvalue column, if the extracted eigenvalues are concentrated on one or a few principal components while the remaining eigenvalues tend toward 0, multicollinearity is present. Generally, a condition index > 30 indicates multicollinearity.
If a single principal component carries a large variance proportion on several independent variables at once (for example, variables 3, 4, and 5), those variables are multicollinear.
A multiple linear regression fitted in the presence of multicollinearity gives inaccurate (unreliable) coefficient estimates.
Solution: stepwise regression analysis.
In the Method dropdown, change the method to Stepwise; leave everything else unchanged.
The entry/removal significance levels are set here (in the Options dialog):
Notice:
Read the table carefully: the constant row should be excluded; multicollinearity is judged only among the independent variables.
So there is no multicollinearity in this model.
Original learning video: multiple linear regression + stepwise regression
The standardized coefficient is the direct path coefficient; comparisons between variables should use its absolute value.