Note: This is a hands-on machine learning project (data, code, documentation, and a video walkthrough are provided). To obtain the data, code, documentation, and video walkthrough, go directly to the end of the article.
1. Project Background
In 2019, Heidari et al. proposed Harris Hawks Optimization (HHO), a metaheuristic with strong global search capability that has the advantage of requiring few parameters to tune.
This project uses the HHO algorithm to search for the hyperparameter values that optimize a LightGBM regression model.
2. Data Acquisition
The modeling data comes from the Internet (compiled by the author of this project); the data items are summarized as follows:
The data details are as follows (partial display):
3. Data Preprocessing
3.1 View data with Pandas tools
Use the head() method of the Pandas tool to view the first five rows of data:
The key code is as follows:
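Since the snippet itself is not reproduced here, the following is a minimal, self-contained sketch using a synthetic stand-in for the project's dataset (1,000 rows, eight features plus the target; the column names x1..x8 and y are assumptions):

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the project's dataset: 1000 rows,
# 8 feature columns x1..x8 plus the target y (names assumed)
rng = np.random.default_rng(42)
df = pd.DataFrame(rng.normal(size=(1000, 8)),
                  columns=[f"x{i}" for i in range(1, 9)])
df["y"] = df.sum(axis=1) * 100 + rng.normal(scale=10, size=1000)

# View the first five rows of data
print(df.head())
```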
3.2 Data missing view
Use the info() method of the Pandas tool to view data information:
As can be seen from the output above, there are 9 variables and 1,000 records in total, with no missing values in the data.
The key code is as follows:
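A minimal sketch of the missing-value check, again on a synthetic stand-in DataFrame (column names assumed):

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the project's data (column names assumed)
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(1000, 8)),
                  columns=[f"x{i}" for i in range(1, 9)])
df["y"] = rng.normal(size=1000)

# Prints column dtypes, non-null counts, and memory usage,
# which confirms whether any values are missing
df.info()
```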
3.3 Data descriptive statistics
Use the describe() method of the Pandas tool to view the mean, standard deviation, minimum, quartiles, and maximum of the data.
The key code is as follows:
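A minimal sketch of the descriptive-statistics step on the same kind of synthetic stand-in data:

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the project's data (column names assumed)
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(1000, 8)),
                  columns=[f"x{i}" for i in range(1, 9)])
df["y"] = rng.normal(size=1000)

# count, mean, std, min, quartiles (25%/50%/75%), and max per column
stats = df.describe()
print(stats.round(2))
```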
4. Exploratory Data Analysis
4.1 Histogram of the y variable
Use the hist() method of the Matplotlib tool to draw a histogram:
As can be seen from the figure above, the y variable is mainly concentrated between -300 and 300.
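A minimal sketch of the histogram step; the target values here are drawn from a normal distribution with standard deviation 100 purely as a stand-in (so they fall mostly between -300 and 300, consistent with the figure described):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, no display needed
import matplotlib.pyplot as plt
import numpy as np

# Stand-in for the target variable (assumption: N(0, 100))
rng = np.random.default_rng(0)
y = rng.normal(0, 100, size=1000)

# Draw and save the histogram of y
counts, bins, _ = plt.hist(y, bins=20)
plt.xlabel("y")
plt.ylabel("frequency")
plt.title("Distribution of y")
plt.savefig("y_hist.png")
```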
4.2 Correlation analysis
As can be seen from the figure above, the larger the absolute value of a coefficient, the stronger the correlation; positive values indicate positive correlation and negative values indicate negative correlation.
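A minimal sketch of how such a correlation matrix and heatmap can be produced with pandas and Matplotlib (the synthetic data and column names are assumptions):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

# Synthetic stand-in: y is a linear mix of the features, so it correlates
# with each of them (column names assumed)
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(1000, 8)),
                  columns=[f"x{i}" for i in range(1, 9)])
df["y"] = df.sum(axis=1) + rng.normal(scale=0.5, size=1000)

# Pearson correlation matrix, rendered as a heatmap
corr = df.corr()
fig, ax = plt.subplots()
im = ax.imshow(corr, cmap="coolwarm", vmin=-1, vmax=1)
plt.xticks(range(len(corr)), corr.columns, rotation=90)
plt.yticks(range(len(corr)), corr.columns)
fig.colorbar(im)
plt.savefig("corr_heatmap.png", bbox_inches="tight")
```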
5. Feature Engineering
5.1 Establish feature data and label data
The key code is as follows:
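A minimal sketch of separating the feature matrix from the label column, assuming the target column is named y:

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the project's data (column names assumed)
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(1000, 8)),
                  columns=[f"x{i}" for i in range(1, 9)])
df["y"] = rng.normal(size=1000)

X = df.drop(columns=["y"])  # feature data: every column except the target
y = df["y"]                 # label data: the target column
```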
5.2 Dataset splitting
Use the train_test_split() method to split the data into an 80% training set and a 20% test set. The key code is as follows:
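A minimal sketch of the split, on synthetic stand-in arrays (a fixed random_state is an assumption added here for reproducibility):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the feature matrix and labels
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))
y = rng.normal(size=1000)

# 80% training set, 20% test set
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)
```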
6. Constructing the HHO-Optimized LightGBM Regression Model
The Harris Hawks Optimization (HHO) algorithm is used to tune the LightGBM regression algorithm for the target regression task.
6.1 Optimal parameters found by the HHO algorithm
The key code is as follows:
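As the key code is not reproduced here, below is a condensed, pure-NumPy sketch of the HHO search loop. It keeps only the exploration phase and the soft-besiege exploitation step (omitting hard besiege and the Lévy-flight dives of the full algorithm), and a simple sphere function stands in for the real fitness, which in the project would train a LightGBM model and return its validation error:

```python
import numpy as np

def fitness(x):
    # Stand-in objective; in the project this would be the model's error
    return float(np.sum(x ** 2))

def hho(fitness, dim=2, n_hawks=20, n_iter=50, lb=-10.0, ub=10.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(n_hawks, dim))   # hawk positions
    fit = np.array([fitness(x) for x in X])
    best = X[fit.argmin()].copy()                  # the "rabbit": best solution
    best_fit = float(fit.min())
    for t in range(n_iter):
        for i in range(n_hawks):
            E0 = 2 * rng.random() - 1              # initial energy in [-1, 1]
            E = 2 * E0 * (1 - t / n_iter)          # escaping energy decays
            if abs(E) >= 1:                        # exploration phase
                j = int(rng.integers(n_hawks))     # perch near a random hawk
                X[i] = X[j] - rng.random() * np.abs(
                    X[j] - 2 * rng.random() * X[i])
            else:                                  # exploitation: soft besiege
                X[i] = best - E * np.abs(best - X[i])
            X[i] = np.clip(X[i], lb, ub)
            f = fitness(X[i])
            if f < best_fit:                       # keep the best found so far
                best_fit, best = f, X[i].copy()
    return best, best_fit

best, best_fit = hho(fitness)
print("best position:", best, "best fitness:", best_fit)
```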
Optimization progress at each iteration:
Optimal parameters:
6.2 Building the model with the optimal parameter values
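A hedged sketch of constructing the final model from the parameters the search returned; the synthetic data and the parameter values shown are assumptions (the fallback import is only for environments without lightgbm, using a scikit-learn model with the same key hyperparameters):

```python
import numpy as np
try:
    from lightgbm import LGBMRegressor as Model
except ImportError:
    # Stand-in with the same key hyperparameters if lightgbm is absent
    from sklearn.ensemble import GradientBoostingRegressor as Model

# Synthetic stand-in data: a linear signal plus noise (assumption)
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 8))
y = X @ np.arange(1, 9) + rng.normal(scale=0.1, size=1000)

# Hypothetical optimal values returned by the HHO search
model = Model(n_estimators=200, learning_rate=0.1)
model.fit(X[:800], y[:800])        # train on the 80% split
pred = model.predict(X[800:])      # predict on the 20% split
```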
7. Model Evaluation
7.1 Evaluation indicators and results
The evaluation indicators mainly include the explained variance, mean absolute error (MAE), mean squared error (MSE), and R² (coefficient of determination).
As can be seen from the table above, the R² value is 0.9597, indicating that the model fits well.
The key code is as follows:
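A minimal sketch of computing these four metrics with scikit-learn; the two toy arrays stand in for the model's test-set results (assumption):

```python
import numpy as np
from sklearn.metrics import (explained_variance_score, mean_absolute_error,
                             mean_squared_error, r2_score)

# Toy arrays standing in for the test-set actuals and predictions
y_true = np.array([3.0, -0.5, 2.0, 7.0, 4.2])
y_pred = np.array([2.5, 0.0, 2.1, 7.8, 3.9])

ev  = explained_variance_score(y_true, y_pred)  # explained variance
mae = mean_absolute_error(y_true, y_pred)       # mean absolute error
mse = mean_squared_error(y_true, y_pred)        # mean squared error
r2  = r2_score(y_true, y_pred)                  # coefficient of determination
print(f"EV={ev:.4f} MAE={mae:.4f} MSE={mse:.4f} R2={r2:.4f}")
```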
7.2 Comparison chart of actual and predicted values
As can be seen from the figure above, the actual and predicted values fluctuate largely in step, indicating that the model fits well.
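A minimal sketch of how such a comparison chart can be drawn with Matplotlib; the actual/predicted arrays here are synthetic stand-ins:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend
import matplotlib.pyplot as plt
import numpy as np

# Synthetic stand-ins for the test-set actuals and model predictions
rng = np.random.default_rng(7)
actual = rng.normal(0, 100, size=50)
predicted = actual + rng.normal(0, 15, size=50)

# Overlay the two series by sample index
plt.figure(figsize=(8, 4))
plt.plot(actual, label="actual", marker="o", markersize=3)
plt.plot(predicted, label="predicted", marker="x", markersize=3)
plt.legend()
plt.xlabel("sample index")
plt.ylabel("y")
plt.title("Actual vs. predicted values")
plt.savefig("actual_vs_predicted.png")
```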
8. Conclusion and Outlook
To sum up, this paper uses the Harris Hawks Optimization (HHO) algorithm to find the optimal parameter values of a LightGBM regression model, builds the regression model with them, and shows that the resulting model performs well. This model can be used for forecasting of everyday products.
import numpy as np

# Convert continuous hawk positions to a binary matrix via a threshold
def binary_conversion(X, thres, N, dim):
    Xbin = np.zeros([N, dim], dtype='int')  # initialize all positions to 0
    for i in range(N):            # loop over individuals
        for d in range(dim):      # loop over dimensions
            if X[i, d] > thres:   # above the threshold
                Xbin[i, d] = 1
            else:                 # at or below the threshold
                Xbin[i, d] = 0
    return Xbin  # return the binary matrix
# ******************************************************************************
# Resources for this machine learning project (data, code, documentation):
# Link: https://pan.baidu.com/s/1c6mQ_1YaDINFEttQymp2UQ
# Extraction code: thgk
# ******************************************************************************
# Error-rate (fitness) function: maps an HHO position vector x to
# LightGBM hyperparameters and returns the model's error on the test set.
# (The original was truncated after the parameter mapping and had duplicated
# if/else branches; the training and scoring lines below are a plausible
# completion that assumes lightgbm and scikit-learn are available.)
from lightgbm import LGBMRegressor
from sklearn.metrics import mean_squared_error

def error_rate(X_train, y_train, X_test, y_test, x, opts):
    n_estimators = int(abs(x[0])) + 100        # number of trees, at least 100
    learning_rate = (int(abs(x[1])) + 1) / 10  # learning rate in steps of 0.1
    model = LGBMRegressor(n_estimators=n_estimators, learning_rate=learning_rate)
    model.fit(X_train, y_train)                # train on the training set
    return mean_squared_error(y_test, model.predict(X_test))  # fitness = MSE