Building a regression model with tf.keras.Sequential

First, we should understand the essential difference between a classification model and a regression model; keeping it in mind is handy when building the model.

  • Classification model: the prediction is a category, and the model's output is a probability distribution over the categories. Therefore, the final layer of a classification model has multiple output values, one per category.
  • Regression model: the prediction is a value, and the model's output is a single real number. Therefore, the final layer of a regression model has exactly one output value.

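To make the contrast concrete, here is a minimal sketch of the two kinds of output layers (the layer sizes, class count, and input shape are illustrative assumptions, and it relies on the keras import from step one below):

# classification: output layer has one unit per class and a softmax activation
clf_model = keras.models.Sequential([
    keras.layers.Dense(30, activation='relu', input_shape=(8,)),  # hidden layer; sizes are illustrative
    keras.layers.Dense(10, activation='softmax')                  # e.g. 10 classes -> 10 probabilities
])

# regression: output layer has a single unit and no activation
reg_model = keras.models.Sequential([
    keras.layers.Dense(30, activation='relu', input_shape=(8,)),
    keras.layers.Dense(1)                                         # one real-valued output
])
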
Once the distinction between classification and regression models is clear, let's go through an example of building a regression model. This example is a house price prediction model, and the dataset is fetch_california_housing from sklearn.datasets. As before, we divide the model-building process into seven steps:

One: Import packages

import numpy as np
import pandas as pd
import matplotlib as mpl
import matplotlib.pyplot as plt
import sklearn

import os
import sys
import time

import tensorflow as tf
import tensorflow.keras as keras

for module in np, pd, mpl, sklearn, tf, keras:
    print(module.__name__, module.__version__)

Two: Load the data

# load the data
from sklearn.datasets import fetch_california_housing
housing = fetch_california_housing(data_home='C:/Users/rencm/scikit_learn_data', download_if_missing=True)
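Before splitting the data, it can help to take a quick look at what was loaded; a small sanity check (an optional addition, not part of the original steps) might look like this:

# inspect the loaded dataset: feature matrix, target vector, and feature names
print(housing.data.shape)      # (20640, 8): samples x features
print(housing.target.shape)    # (20640,): median house values
print(housing.feature_names)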

Three: Split the training and validation sets

# split into training and validation sets
from sklearn.model_selection import train_test_split
x_train_all, x_test, y_train_all, y_test = train_test_split(housing.data, housing.target, random_state = 7)
x_train, x_valid, y_train, y_valid = train_test_split(x_train_all, y_train_all, random_state = 11)
print(x_train.shape, x_valid.shape, x_test.shape)

Four: Normalize the input data

# normalize the input data
from sklearn.preprocessing import StandardScaler
scaler = StandardScaler()
x_train_scaled = scaler.fit_transform(x_train)
x_valid_scaled = scaler.transform(x_valid)
x_test_scaled = scaler.transform(x_test)
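Note that the scaler is fitted only on the training set and then reused on the validation and test sets, so no information from those sets leaks into the statistics. A quick check of the result (an optional addition, not part of the original steps):

# after scaling, the training features should have roughly zero mean and unit variance
print(x_train_scaled.mean(axis=0).round(2))
print(x_train_scaled.std(axis=0).round(2))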

Five: Build the model

# build the model
model = keras.models.Sequential([
    keras.layers.Dense(30, activation = 'relu', input_shape = x_train.shape[1:]),
    keras.layers.Dense(1)
])
model.compile(
    loss = 'mean_squared_error',
    optimizer = 'sgd'
)
print(model.summary())

A small note here: the last layer, i.e. our output layer, has only one output. This is the only difference from a classification model.

Six: Train the model

# train the model
history = model.fit(
    x_train_scaled, y_train,
    validation_data=(x_valid_scaled, y_valid),
    epochs = 100,
    callbacks=[
        keras.callbacks.EarlyStopping(min_delta = 1e-2, patience = 5)
    ]
)
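Since model.fit returns a History object, it is often useful to plot the learning curves and see how the training and validation loss evolve. This plotting step is an optional addition that reuses the pandas and matplotlib imports from step one:

# plot the training and validation loss recorded in history.history
pd.DataFrame(history.history).plot(figsize=(8, 5))
plt.grid(True)
plt.show()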

Seven: Test the model

# test the model
model.evaluate(x_test_scaled, y_test)
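After evaluation, the trained model can also be used to make predictions on new (scaled) inputs. As a small illustration (an optional addition, not part of the original steps), here are predictions for the first few test samples compared with the true values:

# predict median house values for the first five scaled test samples
print(model.predict(x_test_scaled[:5]))
print(y_test[:5])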

To sum up:

As the above process shows, the overall procedure for classification and regression models is the same; the two differ only slightly when the model is built. A classification model's output layer has multiple outputs and uses the softmax activation function, while a regression model's output layer has a single output and no activation function.
