Using Convolution1D in Keras

This article covers two things: an introduction to Convolution1D, and how to use model.summary().

First, model.summary(): this method prints a summary of the model, so readers can inspect the output shape of each layer.

Next, Convolution1D. It performs one-dimensional convolution and is mainly used to filter adjacent elements of a one-dimensional input. The official documentation gives its signature like this:

keras.layers.convolutional.Convolution1D(nb_filter, filter_length, init='glorot_uniform', activation=None, weights=None, border_mode='valid', subsample_length=1, W_regularizer=None, b_regularizer=None, activity_regularizer=None, W_constraint=None, b_constraint=None, bias=True, input_dim=None, input_length=None) 

The official example is as follows:

# apply a convolution 1d of length 3 to a sequence with 10 timesteps,
# with 64 output filters
from keras.models import Sequential
from keras.layers import Convolution1D

model = Sequential()
model.add(Convolution1D(64, 3, border_mode='same', input_shape=(10, 32)))
# now model.output_shape == (None, 10, 64)

# add a new conv1d on top
model.add(Convolution1D(32, 3, border_mode='same'))
# now model.output_shape == (None, 10, 32)
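A side note on border_mode, which the example relies on: 'same' pads the input so the output keeps all 10 timesteps, while 'valid' does no padding and shrinks the sequence. The length arithmetic can be sketched in plain Python (no Keras required; the helper name is my own, not a Keras API):

```python
def conv1d_output_length(n_steps, filter_length, border_mode):
    """Output length of a stride-1 1D convolution."""
    if border_mode == 'same':
        return n_steps  # padded so the length is preserved
    elif border_mode == 'valid':
        return n_steps - filter_length + 1  # no padding, window must fit
    raise ValueError(border_mode)

print(conv1d_output_length(10, 3, 'same'))   # -> 10, matching (None, 10, 64)
print(conv1d_output_length(10, 3, 'valid'))  # -> 8
```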

print(model.summary()) then outputs a table listing each layer, its output shape, and its parameter count.

(The original post's screenshot of the summary output is not reproduced here.)

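Even without the screenshot, the parameter counts that model.summary() would report can be worked out by hand: with one bias per filter, a Convolution1D layer has (filter_length × input_dim + 1) × nb_filter trainable weights. A quick sketch (the helper is for illustration, not a Keras function):

```python
def conv1d_params(nb_filter, filter_length, input_dim, bias=True):
    """Trainable parameter count of a 1D convolution layer."""
    per_filter = filter_length * input_dim + (1 if bias else 0)
    return per_filter * nb_filter

# first layer: 64 filters of length 3 over a 32-dim input
print(conv1d_params(64, 3, 32))  # -> 6208
# second layer: 32 filters of length 3 over the 64 channels above
print(conv1d_params(32, 3, 64))  # -> 6176
```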
A brief walk-through of the code above. When this layer is used as the first layer of a model, input_shape must be specified: input_shape=(10, 32) means a sequence of 10 timesteps, each a 32-dimensional vector. nb_filter is the number of convolution kernels and determines the output dimension; filter_length is the length of each filter.

Look at the first convolution layer: its output shape is easy to understand, because there are 64 convolution kernels, so the last output dimension is 64. As for how it works: applying a 1D convolution to our (10, 32) input is equivalent to sliding a kernel of shape (filter_length, 32) along the time axis.
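The equivalence just described can be checked with a small NumPy sketch (my own naive implementation, assuming zero-padded 'same' convolution with an odd filter length): each of the 64 kernels has shape (filter_length, 32), and sliding it over the padded time axis yields one output channel per kernel.

```python
import numpy as np

def conv1d_same(x, kernels, biases):
    """Naive 'same' 1D convolution.
    x: (steps, in_dim); kernels: (nb_filter, filter_length, in_dim)."""
    nb_filter, filter_length, in_dim = kernels.shape
    pad = filter_length // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))  # zero-pad the time axis only
    out = np.empty((x.shape[0], nb_filter))
    for t in range(x.shape[0]):
        window = xp[t:t + filter_length]             # (filter_length, in_dim)
        out[t] = (window * kernels).sum(axis=(1, 2)) + biases
    return out

x = np.random.randn(10, 32)   # 10 timesteps of 32-dim vectors
k = np.random.randn(64, 3, 32)  # 64 kernels of shape (3, 32)
b = np.zeros(64)
print(conv1d_same(x, k, b).shape)  # -> (10, 64), matching model.output_shape
```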

Well, that's about it.

Reproduced from: https://www.cnblogs.com/qianboping/p/6516639.html

Origin www.cnblogs.com/zb-ml/p/12668950.html