1. Custom Layer
For simple, stateless custom operations, you may be able to implement them with layers.core.Lambda layers. But for custom layers that contain trainable weights, you should implement the layer yourself.
Here is the skeleton of a Keras 2.0 layer (if you are using an older version, please upgrade). You only need to implement three methods:
build(input_shape): This is where you define the weights. This method must set self.built = True, which can be done by calling super([Layer], self).build() at the end.
call(x): This is where the layer's logic lives. Unless you want your layer to support masking, you only need to care about the first argument passed to call: the input tensor.
compute_output_shape(input_shape): If your layer changes the shape of its input tensor, you should define the shape-transformation logic here, which allows Keras to do automatic shape inference.
from keras import backend as K
from keras.engine.topology import Layer

class MyLayer(Layer):

    def __init__(self, output_dim, **kwargs):
        self.output_dim = output_dim
        super(MyLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        # Create a trainable weight for this layer
        self.kernel = self.add_weight(name='kernel',
                                      shape=(input_shape[1], self.output_dim),
                                      initializer='uniform',
                                      trainable=True)
        super(MyLayer, self).build(input_shape)  # Be sure to call this at the end

    def call(self, x):
        return K.dot(x, self.kernel)

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.output_dim)
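As a rough sketch of what this layer computes, the forward pass in call is just a matrix product of the input batch with the kernel, mapping (batch, input_dim) to (batch, output_dim). The pure-Python helper below is illustrative only (no Keras required); the function name and sample values are invented for the demonstration:

```python
# Pure-Python sketch of MyLayer's call(): x dot kernel.
# x has shape (batch, input_dim); kernel has shape (input_dim, output_dim).

def my_layer_forward(x, kernel):
    output_dim = len(kernel[0])
    return [[sum(row[i] * kernel[i][j] for i in range(len(row)))
             for j in range(output_dim)]
            for row in x]

x = [[1.0, 2.0], [3.0, 4.0]]          # batch of 2, input_dim = 2
kernel = [[1.0, 0.0, 1.0],
          [0.0, 1.0, 1.0]]            # input_dim = 2, output_dim = 3
y = my_layer_forward(x, kernel)
print(y)  # [[1.0, 2.0, 3.0], [3.0, 4.0, 7.0]]
```

Note how the result has shape (2, 3), exactly what compute_output_shape reports: (input_shape[0], self.output_dim).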
It is also possible to define Keras layers that have multiple input tensors and multiple output tensors. To do this, you should assume that the inputs and outputs of build(input_shape), call(x), and compute_output_shape(input_shape) are lists. Here is an example, similar to the one above:
from keras import backend as K
from keras.engine.topology import Layer

class MyLayer(Layer):

    def __init__(self, output_dim, **kwargs):
        self.output_dim = output_dim
        super(MyLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        assert isinstance(input_shape, list)
        # Create a trainable weight for this layer
        self.kernel = self.add_weight(name='kernel',
                                      shape=(input_shape[0][1], self.output_dim),
                                      initializer='uniform',
                                      trainable=True)
        super(MyLayer, self).build(input_shape)  # Be sure to call this at the end

    def call(self, x):
        assert isinstance(x, list)
        a, b = x
        return [K.dot(a, self.kernel) + b, K.mean(b, axis=-1)]

    def compute_output_shape(self, input_shape):
        assert isinstance(input_shape, list)
        shape_a, shape_b = input_shape
        return [(shape_a[0], self.output_dim), shape_b[:-1]]
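The two outputs this layer returns can be sketched in pure Python (Keras not required; the helper names and sample values below are invented for illustration): the first output is K.dot(a, kernel) + b, the second is K.mean(b, axis=-1).

```python
# Pure-Python sketch of the multi-input layer's call([a, b]).

def matmul(x, k):
    return [[sum(row[i] * k[i][j] for i in range(len(row)))
             for j in range(len(k[0]))]
            for row in x]

def multi_input_forward(a, b, kernel):
    # First output: a dot kernel, plus b element-wise
    out1 = [[v + bv for v, bv in zip(row, brow)]
            for row, brow in zip(matmul(a, kernel), b)]
    # Second output: mean of b over its last axis
    out2 = [sum(brow) / len(brow) for brow in b]
    return [out1, out2]

a = [[1.0, 2.0]]                      # shape (1, 2)
b = [[10.0, 20.0]]                    # shape (1, 2)
kernel = [[1.0, 0.0], [0.0, 1.0]]     # identity kernel, output_dim = 2
out1, out2 = multi_input_forward(a, b, kernel)
print(out1)  # [[11.0, 22.0]]
print(out2)  # [15.0]
```

The two output shapes, (1, 2) and (1,), match the list that compute_output_shape returns.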
Existing Keras layers are good examples of how to implement almost any kind of layer. Don't hesitate to read the source code!
2. Custom evaluation function
A custom evaluation function (metric) should be passed in at compile time. It takes (y_true, y_pred) as input and returns a tensor as output.
import keras.backend as K

def mean_pred(y_true, y_pred):
    return K.mean(y_pred)

model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['accuracy', mean_pred])
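To make concrete what this metric reports, the pure-Python sketch below (sample values invented) computes the same quantity as K.mean(y_pred): the average predicted value over the batch. Note that y_true is accepted but unused, as in the Keras version:

```python
# Pure-Python sketch of the mean_pred metric: mean of all predictions.

def mean_pred(y_true, y_pred):
    flat = [v for row in y_pred for v in row]
    return sum(flat) / len(flat)

y_true = [[1.0], [0.0]]
y_pred = [[0.5], [0.75]]
result = mean_pred(y_true, y_pred)
print(result)  # 0.625
```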
3. Custom loss function
A custom loss function should also be passed in at compile time. It takes (y_true, y_pred) as input and returns a tensor as output.
import keras.backend as K

def my_loss(y_true, y_pred):
    return K.mean(K.square(y_pred - y_true))  # squared error, as an example

model.compile(optimizer='rmsprop',
              loss=my_loss,
              metrics=['accuracy'])
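The arithmetic behind K.mean(K.square(y_pred - y_true)) is just mean squared error. A pure-Python sketch (sample values invented) makes the computation explicit:

```python
# Pure-Python sketch of the my_loss function: mean squared error.

def my_loss(y_true, y_pred):
    diffs = [(p - t) ** 2 for t, p in zip(y_true, y_pred)]
    return sum(diffs) / len(diffs)

y_true = [1.0, 2.0, 3.0]
y_pred = [1.5, 2.0, 2.0]
result = my_loss(y_true, y_pred)
print(result)  # (0.25 + 0.0 + 1.0) / 3
```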
4. Handle custom layers (or other custom objects) in saved models
If the model to be loaded contains custom layers or other custom classes or functions, they can be passed to the loading mechanism via the custom_objects parameter:
from keras.models import load_model

# Assuming your model contains an instance of the AttentionLayer class
model = load_model('my_model.h5', custom_objects={'AttentionLayer': AttentionLayer})
Alternatively, you can use a custom object scope:
from keras.utils import CustomObjectScope

with CustomObjectScope({'AttentionLayer': AttentionLayer}):
    model = load_model('my_model.h5')
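For loading to be able to reconstruct a custom layer, the layer generally also needs a get_config method that returns its constructor arguments; Keras saves this dict with the model and feeds it back when rebuilding the layer. The sketch below uses a plain stand-in class rather than the real Keras Layer base, purely to illustrate the round-trip pattern:

```python
# Sketch of the get_config pattern that makes a custom layer reloadable.
# MyLayer here is a stand-in, not a real Keras layer.

class MyLayer:
    def __init__(self, output_dim, **kwargs):
        self.output_dim = output_dim

    def get_config(self):
        # Keras calls this when saving; the dict is passed back to the
        # constructor when the model is loaded with custom_objects.
        return {'output_dim': self.output_dim}

layer = MyLayer(32)
restored = MyLayer(**layer.get_config())
print(restored.output_dim)  # 32
```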