Keras functional API introduction

Reference: Géron, Aurélien. Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems. O'Reilly Media, 2019.

Keras's Sequential model can quickly build a simple neural network; for more varied, custom network structures, Keras also provides the functional API (Functional API).

Concatenate

When building a Wide & Deep neural network, layer concatenation is required. The merge layer in the figure concatenates the Input layer with the last hidden layer.

from tensorflow import keras  # assumed import; the examples use tf.keras

input_ = keras.layers.Input(shape=X_train.shape[1:])
hidden1 = keras.layers.Dense(30, activation="relu")(input_)
hidden2 = keras.layers.Dense(30, activation="relu")(hidden1)
concat = keras.layers.concatenate([input_, hidden2])  # wide path + deep path
output = keras.layers.Dense(1)(concat)
model = keras.models.Model(inputs=[input_], outputs=[output])
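To see what the concatenation step actually does, here is a minimal sketch (with made-up toy arrays, not the post's data): `keras.layers.concatenate` joins its inputs along the last axis, so the feature dimensions add up rather than the values being summed.

```python
import numpy as np
from tensorflow import keras

# Two toy "feature" tensors: 2 wide features and 3 deep features per sample.
a = np.ones((1, 2), dtype="float32")
b = np.zeros((1, 3), dtype="float32")

# concatenate joins along the last axis: 2 + 3 = 5 features.
c = keras.layers.concatenate([a, b])
print(tuple(c.shape))  # (1, 5)
```

This is why the concat layer in the Wide & Deep model above receives both the raw inputs and the deep path's output: the final Dense layer sees all of them side by side.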

Multi-inputs

The input features can be divided into multiple groups (which may overlap), and each group can take a different path through the network.

input_A = keras.layers.Input(shape=[5], name="wide_input")
input_B = keras.layers.Input(shape=[6], name="deep_input")
hidden1 = keras.layers.Dense(30, activation="relu")(input_B)
hidden2 = keras.layers.Dense(30, activation="relu")(hidden1)
concat = keras.layers.concatenate([input_A, hidden2])
output = keras.layers.Dense(1, name="output")(concat)
model = keras.models.Model(inputs=[input_A, input_B], outputs=[output])
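A multi-input model is trained by passing one array per input, as a list in the same order as `inputs` (or a dict keyed by the input names). A minimal sketch, using synthetic data and an illustrative overlapping split (columns 2-4 appear in both groups; none of these arrays come from the original post):

```python
import numpy as np
from tensorflow import keras

# Synthetic data: 8 features split into a 5-feature "wide" group
# and a 6-feature "deep" group with an overlap (illustrative only).
X = np.random.rand(100, 8).astype("float32")
X_A, X_B = X[:, :5], X[:, 2:]
y = np.random.rand(100, 1).astype("float32")

input_A = keras.layers.Input(shape=[5], name="wide_input")
input_B = keras.layers.Input(shape=[6], name="deep_input")
hidden1 = keras.layers.Dense(30, activation="relu")(input_B)
hidden2 = keras.layers.Dense(30, activation="relu")(hidden1)
concat = keras.layers.concatenate([input_A, hidden2])
output = keras.layers.Dense(1, name="output")(concat)
model = keras.models.Model(inputs=[input_A, input_B], outputs=[output])

model.compile(loss="mse", optimizer="sgd")
# One array per input, in the same order as `inputs`.
model.fit([X_A, X_B], y, epochs=1, verbose=0)
print(model.predict([X_A[:2], X_B[:2]], verbose=0).shape)  # (2, 1)
```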

Multi-outputs

input_A = keras.layers.Input(shape=[5], name="wide_input")
input_B = keras.layers.Input(shape=[6], name="deep_input")
hidden1 = keras.layers.Dense(30, activation="relu")(input_B)
hidden2 = keras.layers.Dense(30, activation="relu")(hidden1)
concat = keras.layers.concatenate([input_A, hidden2])
output = keras.layers.Dense(1, name="main_output")(concat)
aux_output = keras.layers.Dense(1, name="aux_output")(hidden2)
model = keras.models.Model(inputs=[input_A, input_B],
                           outputs=[output, aux_output])

A separate loss function can be set for each output:

model.compile(loss=["mse", "mse"], loss_weights=[0.9, 0.1], optimizer="sgd")

If per-output losses are not set, Keras applies the same loss function to every output by default. During training, Keras computes each output's loss separately and sums them (weighted by loss_weights, if given) to obtain the final loss value.
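Training a multi-output model requires one target per output. A minimal sketch with synthetic data (here both outputs predict the same label, which is common when the auxiliary output is used for regularization; the data and epoch count are illustrative):

```python
import numpy as np
from tensorflow import keras

# Illustrative synthetic data for the two input groups and one label.
X_A = np.random.rand(64, 5).astype("float32")
X_B = np.random.rand(64, 6).astype("float32")
y = np.random.rand(64, 1).astype("float32")

input_A = keras.layers.Input(shape=[5], name="wide_input")
input_B = keras.layers.Input(shape=[6], name="deep_input")
hidden1 = keras.layers.Dense(30, activation="relu")(input_B)
hidden2 = keras.layers.Dense(30, activation="relu")(hidden1)
concat = keras.layers.concatenate([input_A, hidden2])
output = keras.layers.Dense(1, name="main_output")(concat)
aux_output = keras.layers.Dense(1, name="aux_output")(hidden2)
model = keras.models.Model(inputs=[input_A, input_B],
                           outputs=[output, aux_output])

model.compile(loss=["mse", "mse"], loss_weights=[0.9, 0.1], optimizer="sgd")
# One target per output; both outputs predict the same label here.
history = model.fit([X_A, X_B], [y, y], epochs=1, verbose=0)
# history.history holds the total loss plus a loss entry per named output.
print(sorted(history.history.keys()))
```

The reported total loss is the weighted sum 0.9 * main_output loss + 0.1 * aux_output loss, matching the loss_weights passed to compile.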

Origin www.cnblogs.com/yaos/p/12739939.html