Support Vector Machine Regression (SVR)

Support vector regression (SVR) is the application of the support vector machine to regression problems. Many SVR variants exist, based on different loss functions; this article describes only the models built on the ϵ-insensitive loss function.

main idea

Find a regression hyperplane (or hypersurface) that minimizes the expected risk.

ϵ-SVR

ϵ-insensitive loss function

With the ϵ-insensitive loss, errors smaller than ϵ are ignored entirely; beyond that, a deviation |y − f(x)| is charged only for the excess |y − f(x)| − ϵ. As shown in the figure:
(Figure: the ϵ-insensitive loss function)
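As a sketch, the loss can be written directly in code (a minimal illustration; the function name is my own):

```python
# epsilon-insensitive loss: deviations smaller than epsilon cost nothing;
# larger deviations are charged only for the part exceeding epsilon.
def eps_insensitive_loss(y_true: float, y_pred: float, epsilon: float) -> float:
    return max(0.0, abs(y_true - y_pred) - epsilon)

print(eps_insensitive_loss(1.0, 1.05, 0.1))  # inside the tube: loss 0.0
print(eps_insensitive_loss(1.0, 1.50, 0.1))  # outside: 0.5 - 0.1 = 0.4
```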
The SVR based on the ϵ-insensitive loss function is called ϵ-SVR.
The optimization problem is the standard ϵ-SVR primal:

$$\min_{w,b,\xi,\xi^*} \; \frac{1}{2} w^T w + C\sum_{i=1}^{l}(\xi_i + \xi_i^*)$$

$$\text{s.t.}\quad y_i - w^T\phi(x_i) - b \le \epsilon + \xi_i,\quad w^T\phi(x_i) + b - y_i \le \epsilon + \xi_i^*,\quad \xi_i,\xi_i^* \ge 0,\; i=1,\dots,l$$
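In practice this is the formulation scikit-learn's `SVR` (a LIBSVM wrapper) solves; a minimal sketch on synthetic data (the data set and parameter values are illustrative):

```python
import numpy as np
from sklearn.svm import SVR

# Illustrative noisy samples of a sine curve.
rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0.0, 5.0, 80)).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.randn(80)

# C is the regularization trade-off, epsilon the half-width of the tube.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
model.fit(X, y)
print(model.predict([[1.5]]))  # close to sin(1.5)
```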

ν-SVR

As in ν-support vector classification, an additional parameter ν is introduced to control the number of support vectors.
The optimization problem is as follows:

$$\min_{w,b,\xi,\xi^*,\epsilon} \; \frac{1}{2} w^T w + C\left(\nu\epsilon + \frac{1}{l}\sum_{i=1}^{l}(\xi_i + \xi_i^*)\right)$$

subject to the same constraints as ϵ-SVR, with the additional requirement ϵ ≥ 0.
Since C and ν are both set freely by the user, we can simply rename C/l as C and Cν as ν; the objective is then equivalent to:

$$\min_{w,b,\xi,\xi^*,\epsilon} \; \frac{1}{2} w^T w + \nu\epsilon + C\sum_{i=1}^{l}(\xi_i + \xi_i^*)$$
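scikit-learn exposes this formulation as `NuSVR`, where ν lower-bounds the fraction of support vectors; a small sketch showing the effect (the data set and parameter values are illustrative):

```python
import numpy as np
from sklearn.svm import NuSVR

# Illustrative noisy samples of a sine curve.
rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0.0, 5.0, 100)).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.randn(100)

# nu lower-bounds the fraction of support vectors:
# a larger nu forces the model to keep more of them.
few = NuSVR(kernel="rbf", C=10.0, nu=0.1).fit(X, y)
many = NuSVR(kernel="rbf", C=10.0, nu=0.9).fit(X, y)
print(len(few.support_), len(many.support_))
```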

support vector

Intuitively, the support vectors are the samples that contribute to the final w and b (those with α ≠ 0). In the picture of the ϵ-insensitive loss, the insensitive zone forms a "tube": samples inside the tube have α = 0 and are non-support vectors; samples on the tube wall have 0 < α < C and are boundary support vectors; samples outside the tube have α = C and are non-boundary support vectors (in outlier detection, outliers are often picked from the non-boundary support vectors).
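This tube picture can be checked empirically with scikit-learn's `SVR`, whose `support_` attribute lists the support-vector indices (a sketch; the data set and parameter values are illustrative):

```python
import numpy as np
from sklearn.svm import SVR

# Illustrative noisy samples of a sine curve.
rng = np.random.RandomState(0)
X = np.sort(rng.uniform(0.0, 5.0, 60)).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.randn(60)

eps = 0.15
model = SVR(kernel="rbf", C=10.0, epsilon=eps).fit(X, y)

residual = np.abs(y - model.predict(X))
non_sv = np.setdiff1d(np.arange(len(y)), model.support_)

# Support vectors sit on or outside the tube wall (residual >= epsilon,
# up to solver tolerance); non-support vectors lie strictly inside it.
print(residual[model.support_].min(), residual[non_sv].max())
```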
Note: the figures are from the LIBSVM guide. Please correct me if I am wrong.
