Building a Wide & Deep Neural Network with Keras

Wide & Deep Neural Network

In 2016, the paper Wide & Deep Learning for Recommender Systems, published by Cheng et al. at Google, introduced a new architecture: the Wide & Deep neural network.

By connecting some or all of the input features directly to the output layer, the network can learn simple patterns through the short (wide) path while learning complex patterns through the deep path. Compared with a typical multilayer perceptron (MLP), this architecture prevents simple features in the data set from being over-processed and distorted by the successive transformations of the deep path.

Keras implementation

Implementing a Wide & Deep neural network in Keras is very simple. Aurélien Géron's classic textbook Hands-On Machine Learning with Scikit-Learn and TensorFlow provides sample code, but the version printed in the book contains an error; the corrected code is given here.

from tensorflow import keras

# Input layer; the shape is taken from the training data (all features).
# `input_` avoids shadowing Python's built-in `input`.
input_ = keras.layers.Input(shape=X_train.shape[1:])
# Deep path: two hidden layers.
hidden1 = keras.layers.Dense(30, activation="relu")(input_)
hidden2 = keras.layers.Dense(30, activation="relu")(hidden1)
# Wide path: concatenate the raw input with the deep path's output.
concat = keras.layers.concatenate([input_, hidden2])
output = keras.layers.Dense(1)(concat)
model = keras.Model(inputs=[input_], outputs=[output])
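To actually train the model, it still has to be compiled and fitted. Below is a minimal end-to-end sketch using randomly generated stand-in data; the choice of 8 features and a regression target is only an assumption for illustration (Géron's book uses the California housing data), not something specified in this post.

```python
import numpy as np
from tensorflow import keras

# Hypothetical stand-in data: 8 features, scalar regression target.
X_train = np.random.rand(200, 8).astype("float32")
y_train = np.random.rand(200).astype("float32")

# Wide & Deep model, as defined above.
input_ = keras.layers.Input(shape=X_train.shape[1:])
hidden1 = keras.layers.Dense(30, activation="relu")(input_)
hidden2 = keras.layers.Dense(30, activation="relu")(hidden1)
concat = keras.layers.concatenate([input_, hidden2])
output = keras.layers.Dense(1)(concat)
model = keras.Model(inputs=[input_], outputs=[output])

# Compile for regression and train briefly.
model.compile(loss="mse", optimizer="sgd")
model.fit(X_train, y_train, epochs=2, verbose=0)

print(model.output_shape)  # (None, 1)
```

Because the wide path feeds the raw features straight into the final Dense layer, that layer effectively learns a linear model on the inputs plus a correction from the deep path.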

Origin www.cnblogs.com/yaos/p/12739949.html