Machine learning: constrained linear regression

Consider a linear regression model:
$$y = X\beta + \varepsilon$$

where $\varepsilon$ denotes the residuals. In the linear regression model with no constraints, the least squares method chooses $\beta$ to minimize the residual sum of squares:

$$\min_\beta \sum_i (y_i - x_i'\beta)^2 = \min_\beta (y - X\beta)'(y - X\beta)$$
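As a quick illustration, here is a minimal NumPy sketch of the unconstrained fit via the normal equations $\beta = (X'X)^{-1}X'y$; the data and variable names are made up for the example:

```python
import numpy as np

# Toy data (made up for illustration): 100 samples, 3 coefficients.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.1, size=100)

# Unconstrained least squares: solve X'X beta = X'y.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_ols)  # close to [1.0, 2.0, -0.5]
```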

On this basis, we add a set of linear constraints on the coefficients, each of the form

$$r_{j1}\beta_1 + r_{j2}\beta_2 + \cdots + r_{jk}\beta_k = q_j, \quad j = 1, \dots, m$$

written in matrix form:

$$R\beta = q$$
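For example, with three coefficients, the single constraint $\beta_1 + \beta_2 = 1$ corresponds to $R = (1, 1, 0)$ and $q = 1$; each row of $R$ encodes one constraint.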

The linear regression problem with constraints can then be described as follows:

$$\min_\beta (y - X\beta)'(y - X\beta) \quad \text{s.t.} \quad R\beta = q$$
Using the Lagrange multiplier method, form the Lagrangian (the factor of 2 on the multiplier term is a convention that keeps the resulting formulas clean):

$$L(\beta, \lambda) = (y - X\beta)'(y - X\beta) + 2\lambda'(R\beta - q)$$
Taking derivatives with respect to $\beta$ and $\lambda$ and setting them to zero gives the first-order conditions $-2X'(y - X\beta) + 2R'\lambda = 0$ and $R\beta = q$; solving them yields $\beta^*$ and $\lambda$ as follows:

$$\beta^* = \beta - (X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}(R\beta - q)$$

$$\lambda = [R(X'X)^{-1}R']^{-1}(R\beta - q)$$
where $\beta$ is the unconstrained least-squares estimate, $\beta = (X'X)^{-1}X'y$.
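
Below is a minimal NumPy sketch of this closed-form restricted estimator, continuing the toy data from the earlier snippet; the example constraint $\beta_1 + \beta_2 = 1$ and all variable names are my own choices for illustration:

```python
import numpy as np

# Toy data (made up): 100 samples, 3 coefficients.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=0.1, size=100)

# Example constraint R beta = q: beta_1 + beta_2 = 1.
R = np.array([[1.0, 1.0, 0.0]])
q = np.array([1.0])

# Unconstrained OLS estimate: beta = (X'X)^{-1} X'y.
XtX_inv = np.linalg.inv(X.T @ X)
beta_ols = XtX_inv @ X.T @ y

# Lagrange multiplier: lambda = [R (X'X)^{-1} R']^{-1} (R beta - q).
M = R @ XtX_inv @ R.T
lam = np.linalg.solve(M, R @ beta_ols - q)

# Restricted estimate:
# beta* = beta - (X'X)^{-1} R' [R (X'X)^{-1} R']^{-1} (R beta - q).
beta_star = beta_ols - XtX_inv @ R.T @ lam

print(beta_star)
print(R @ beta_star)  # ≈ q: the constraint holds
```

Using `np.linalg.solve` for the $[R(X'X)^{-1}R']^{-1}$ term, rather than forming that inverse explicitly, is the usual numerically safer choice.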

Source: https://blog.csdn.net/yao09605/article/details/92795743