Machine Learning - Vectorized Simple Linear Regression (code implementation)

Least Squares Method:

  See the previous post: Machine Learning - Simple Linear Regression (principle derivation + algorithm description + code implementation)
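
  For reference, the closed-form least squares solution derived in that post, and implemented by the code below (with $\bar{x}$, $\bar{y}$ the means of the training data), is:

  $$a = \frac{\sum_{i=1}^{m}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{m}(x_i - \bar{x})^2}, \qquad b = \bar{y} - a\,\bar{x}$$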

 

Vectorized code implementation:

  Simply put, the numerator and denominator of the least squares formula are computed with vector dot products instead of element-by-element loops, which greatly improves performance.

  The only difference from the previous post is in the two lines that compute num and d, which now use dot products (see the sketch below).
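
  As a rough sketch of that change (the loop version is paraphrased from the previous post, so its exact wording there may differ):

    # loop version (previous post, paraphrased): accumulate the sums element by element
    num, d = 0.0, 0.0
    for x_i, y_i in zip(x_train, y_train):
        num += (x_i - x_mean) * (y_i - y_mean)
        d += (x_i - x_mean) ** 2

    # vectorized version (this post): two dot products over the whole arrays
    num = (x_train - x_mean).dot(y_train - y_mean)
    d = (x_train - x_mean).dot(x_train - x_mean)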

import numpy as np


class SimpleLinearRegression2:

    def __init__(self):
        """Initialize the Simple Linear Regression model"""
        self.a_ = None
        self.b_ = None
        # a_ and b_ are not supplied by the user; they are learned from the data.
        # x_train and y_train are only needed for training; once the parameters
        # have been obtained, the training data is no longer used.

    def fit(self, x_train, y_train):
        """Train the Simple Linear Regression model on x_train and y_train"""
        assert x_train.ndim == 1, \
            "Simple Linear Regressor can only solve single feature training data"
        assert len(x_train) == len(y_train), \
            "the size of x_train must be equal to the size of y_train"

        # algorithm implementation
        x_mean = np.mean(x_train)
        y_mean = np.mean(y_train)

        # vectorized numerator and denominator of the least squares solution
        num = (x_train - x_mean).dot(y_train - y_mean)
        d = (x_train - x_mean).dot(x_train - x_mean)

        self.a_ = num / d
        self.b_ = y_mean - self.a_ * x_mean

        return self

    def predict(self, x_predict):
        """Given a dataset x_predict to be predicted, return a vector of predictions"""
        assert x_predict.ndim == 1, \
            "Simple Linear Regressor can only solve single feature training data"
        assert self.a_ is not None and self.b_ is not None, \
            "must fit before predict"

        return np.array([self._predict(x) for x in x_predict])

    def _predict(self, x_single):
        """Given a single sample x_single, return its predicted value"""
        return self.a_ * x_single + self.b_

    def __repr__(self):
        return "SimpleLinearRegression2()"
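
A minimal usage sketch (with made-up data, purely to show the fit/predict interface of the class above):

    import numpy as np

    x = np.array([1., 2., 3., 4., 5.])
    y = np.array([1., 3., 2., 3., 5.])

    reg = SimpleLinearRegression2()
    reg.fit(x, y)                       # learns a_ (slope) and b_ (intercept)
    print(reg.a_, reg.b_)               # fitted parameters
    print(reg.predict(np.array([6.])))  # prediction for a new x value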


Verifying the performance difference in a Jupyter notebook:

  reg1 uses the loop-based linear regression implementation

  reg2 uses the vectorized linear regression implementation

  From the timing results we can see that the vectorized approach greatly improves performance.
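
  The original timing output is not reproduced here; a sketch of how such a comparison might be run in a notebook (assuming SimpleLinearRegression1 is the loop-based class from the previous post) is:

    import numpy as np

    m = 1000000
    big_x = np.random.random(size=m)
    big_y = big_x * 2.0 + 3.0 + np.random.normal(size=m)  # synthetic data with noise

    reg1 = SimpleLinearRegression1()  # loop-based implementation (previous post)
    reg2 = SimpleLinearRegression2()  # vectorized implementation (this post)

    %timeit reg1.fit(big_x, big_y)
    %timeit reg2.fit(big_x, big_y)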

  

 

Origin www.cnblogs.com/miaoqianling/p/11423594.html