Caffe source code analysis: InputLayer

Copyright notice: this is the author's original article; reproduction without permission is prohibited. https://blog.csdn.net/haluoluo211/article/details/82956418

Among the input layers, we start with the simplest one, InputLayer, which commonly serves as the network's input at inference time. A minimal MNIST example:

```protobuf
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param { shape: { dim: 1 dim: 1 dim: 28 dim: 28 } }
}
```

The relevant parameters in the proto are:

```protobuf
// Specifies the shape (dimensions) of a Blob.
message BlobShape {
  repeated int64 dim = 1 [packed = true];
}

message InputParameter {
  // This layer produces N >= 1 top blob(s) to be assigned manually.
  // Define N shapes to set a shape for each top.
  // Define 1 shape to set the same shape for every top.
  // Define no shape to defer to reshaping manually.
  repeated BlobShape shape = 1;
}

message LayerParameter {
  // .....
  optional InputParameter input_param = 143;
}
```
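As the comments in `InputParameter` state, an Input layer with several tops can either give one shape per top, share a single shape across all tops, or omit shape entirely and defer reshaping to the caller. A hypothetical two-top example of the per-top variant (layer and top names are illustrative):

```protobuf
layer {
  name: "pair_data"
  type: "Input"
  top: "data_a"
  top: "data_b"
  # Two tops, two shapes: each top gets its own shape. Writing a single
  # shape block instead would apply it to both tops.
  input_param {
    shape: { dim: 1 dim: 1 dim: 28 dim: 28 }
    shape: { dim: 1 dim: 10 }
  }
}
```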

InputLayer inherits from Layer:


```cpp
template <typename Dtype>
class InputLayer : public Layer<Dtype> {
 public:
  explicit InputLayer(const LayerParameter& param)
      : Layer<Dtype>(param) {}
  // ... (remaining members are shown below)
};
```

InputLayer has no bottom blobs, so:


```cpp
// Data layers have no bottoms, so reshaping is trivial.
virtual void Reshape(const vector<Blob<Dtype>*>& bottom,
                     const vector<Blob<Dtype>*>& top) {}

virtual inline const char* type() const { return "Input"; }
virtual inline int ExactNumBottomBlobs() const { return 0; }
virtual inline int MinTopBlobs() const { return 1; }
```

InputLayer clearly needs neither a forward nor a backward pass (its top blobs are assigned manually by the caller), so the corresponding functions are empty:


```cpp
virtual void Forward_cpu(const vector<Blob<Dtype>*>& bottom,
                         const vector<Blob<Dtype>*>& top) {}
virtual void Backward_cpu(const vector<Blob<Dtype>*>& top,
                          const vector<bool>& propagate_down,
                          const vector<Blob<Dtype>*>& bottom) {}
```

Next, LayerSetUp initializes the shapes of the top blobs:


```cpp
template <typename Dtype>
void InputLayer<Dtype>::LayerSetUp(const vector<Blob<Dtype>*>& bottom,
                                   const vector<Blob<Dtype>*>& top) {
  const int num_top = top.size();
  const InputParameter& param = this->layer_param_.input_param();
  const int num_shape = param.shape_size();
  CHECK(num_shape == 0 || num_shape == 1 || num_shape == num_top)
      << "Must specify 'shape' once, once per top blob, or not at all: "
      << num_top << " tops vs. " << num_shape << " shapes.";
  if (num_shape > 0) {
    for (int i = 0; i < num_top; ++i) {
      const int shape_index = (param.shape_size() == 1) ? 0 : i;
      top[i]->Reshape(param.shape(shape_index));
    }
  }
}
```
