Study Notes: A Brief Walkthrough of the CIFAR-10 Model

The overall architecture consists of three convolution layers, three pooling layers, and two fully connected layers.

Each layer contains multiple feature maps. Each feature map extracts one kind of feature from its input through a convolution filter, and each feature map is made up of multiple neurons.

First comes the data layer. The test data is processed 100 images at a time (batch_size = 100); the number in parentheses after each shape is the total element count, e.g. 100*32*32*3 = 307200.

 Top shape: 100 3 32 32 (307200)  

 Top shape: 100 1 1 1 (100)  
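The element count in parentheses is simply batch × channels × height × width. A quick sanity check in plain Python (the helper name is my own, not a Caffe function):

```python
# Element count of a Caffe blob with shape N x C x H x W
def blob_count(n, c, h, w):
    return n * c * h * w

print(blob_count(100, 3, 32, 32))  # data blob  -> 307200
print(blob_count(100, 1, 1, 1))    # label blob -> 100
```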


conv1 (the layer that produces the C1 data in the diagram) is a convolution layer made up of 32 feature maps. The kernel size is 5*5, and pad is 2, i.e. two units of border are added on each side. After convolution, the spatial size becomes (32+2*2-5+1)*(32+2*2-5+1) = 32*32.

  layers {
    name: "conv1"
    type: CONVOLUTION
    bottom: "data"
    top: "conv1"
    blobs_lr: 1
    blobs_lr: 2
    convolution_param {
      num_output: 32
      pad: 2
      kernel_size: 5
      stride: 1
      weight_filler {
        type: "gaussian"
        std: 0.0001
      }
      bias_filler {
        type: "constant"
      }
    }
  }

Top shape: 100 32 32 32 (3276800)  
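The (32+2*2-5+1) arithmetic above is the standard convolution output-size formula. A minimal sketch (function name is my own):

```python
def conv_out(in_size, kernel, pad=0, stride=1):
    # output = (input + 2*pad - kernel) / stride + 1
    return (in_size + 2 * pad - kernel) // stride + 1

# conv1: 32x32 input, 5x5 kernel, pad 2, stride 1 -> stays 32x32
print(conv_out(32, kernel=5, pad=2, stride=1))  # prints 32
```

With pad = 2 the 5*5 kernel leaves the spatial size unchanged, which is why conv1's top shape is still 32*32.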


pool1 is a downsampling (pooling) layer with 32 feature maps of size 16*16. The pooling kernel is 2*2, so the spatial size becomes 16*16.

Top shape: 100 32 16 16 (819200)  


This is followed by ReLU1.

Top shape: 100 32 16 16 (819200)  


conv2 is a convolution layer; the kernel is again 5*5 and pad is again 2.

Top shape: 100 32 16 16 (819200)   

This is followed by ReLU2.

Top shape: 100 32 16 16 (819200)   


pool2 is a downsampling layer with a 2*2 pooling kernel, so the data becomes 8*8.

Top shape: 100 32 8 8 (204800)  


conv3 is a convolution layer; the kernel is again 5*5, pad is again 2, and there are 64 feature maps.

 Top shape: 100 64 8 8 (409600)  

This is followed by ReLU3.

Top shape: 100 64 8 8 (409600)

pool3 is a downsampling layer with a 2*2 pooling kernel, so the data becomes 4*4.

Top shape: 100 64 4 4 (102400)  
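The whole conv/pool pipeline so far can be traced with the two size formulas; the sketch below reproduces every spatial size in the log (the functions are my own naming, and the pooling formula assumes Caffe's ceiling division, which with these even sizes is just size/2):

```python
def conv_out(size, kernel=5, pad=2, stride=1):
    return (size + 2 * pad - kernel) // stride + 1

def pool_out(size, kernel=2, stride=2):
    # ceil((size - kernel) / stride) + 1
    return (size - kernel + stride - 1) // stride + 1

h = 32  # CIFAR-10 images are 32x32
for name, channels in [("conv1/pool1", 32), ("conv2/pool2", 32), ("conv3/pool3", 64)]:
    h = pool_out(conv_out(h))
    print(name, "-> Top shape:", 100, channels, h, h)
```

Running this prints the 16, 8, and 4 spatial sizes seen in the pool1, pool2, and pool3 top shapes above.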


ip1 is a fully connected layer, which to some extent can be viewed as a convolution layer; it has 64 outputs. In the original model, 5*5 input data passes through a 5*5 convolution to produce 1*1 data. In the current model the data is 4*4 and the result is likewise 1*1, which is what makes the layer fully connected.

Top shape: 100 64 1 1 (6400)  
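Concretely, ip1 flattens the 64*4*4 = 1024 pool3 values of each image and multiplies them by a 64×1024 weight matrix. A numpy sketch with zero-filled placeholder weights, for shape illustration only:

```python
import numpy as np

batch = 100
pool3 = np.zeros((batch, 64, 4, 4))   # pool3 output: 100 64 4 4
flat = pool3.reshape(batch, -1)       # (100, 1024)
W = np.zeros((64, flat.shape[1]))     # ip1 weights: 64 outputs x 1024 inputs
b = np.zeros(64)                      # ip1 biases
ip1 = flat @ W.T + b                  # (100, 64), matching "Top shape: 100 64 1 1"
print(ip1.shape)
```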


ip2 is the second fully connected layer, with 10 outputs. It directly produces the result; the classification decision is made at this layer.

Top shape: 100 10 1 1 (1000)



Input: a picture of a cat.

The output is:

['deer' 'airplane' 'cat' 'frog' 'bird']
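This bracketed list is the CIFAR-10 class labels ranked by predicted score, top five first. A hedged numpy sketch of how such a ranking is produced; the scores here are made up for illustration, not the actual model output:

```python
import numpy as np

labels = np.array(['airplane', 'automobile', 'bird', 'cat', 'deer',
                   'dog', 'frog', 'horse', 'ship', 'truck'])
# Hypothetical per-class scores for one image (not real model output)
scores = np.array([0.20, 0.01, 0.10, 0.15, 0.30, 0.02, 0.12, 0.03, 0.04, 0.03])
top5 = labels[np.argsort(scores)[::-1][:5]]  # indices sorted by descending score
print(top5)  # -> ['deer' 'airplane' 'cat' 'frog' 'bird']
```

Note that the cat is only the third guess here, which is exactly the kind of ranking shown above.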


Below is the full log from network setup through the start of solving:

I0313 00:40:24.471560 13825 net.cpp:67] Creating Layer cifar  
I0313 00:40:24.471570 13825 net.cpp:356] cifar -> data  
I0313 00:40:24.471585 13825 net.cpp:356] cifar -> label  
I0313 00:40:24.471596 13825 net.cpp:96] Setting up cifar  
I0313 00:40:24.471602 13825 data_layer.cpp:45] Opening leveldb examples/cifar10/cifar10_test_leveldb  
I0313 00:40:24.549324 13825 data_layer.cpp:128] output data size: 100,3,32,32  
I0313 00:40:24.549372 13825 base_data_layer.cpp:36] Loading mean file fromexamples/cifar10/mean.binaryproto  
I0313 00:40:24.550582 13825 base_data_layer.cpp:64] Initializing prefetch  
I0313 00:40:24.550639 13825 base_data_layer.cpp:66] Prefetch initialized.  
I0313 00:40:24.550683 13825 net.cpp:103] Top shape: 100 3 32 32 (307200)  
I0313 00:40:24.550698 13825 net.cpp:103] Top shape: 100 1 1 1 (100)  
I0313 00:40:24.550709 13825 net.cpp:113] Memory required for data: 1229200  
I0313 00:40:24.550734 13825 net.cpp:67] Creating Layer label_cifar_1_split  
I0313 00:40:24.550750 13825 net.cpp:394] label_cifar_1_split <- label  
I0313 00:40:24.550775 13825 net.cpp:356] label_cifar_1_split -> label_cifar_1_split_0  
I0313 00:40:24.550802 13825 net.cpp:356] label_cifar_1_split -> label_cifar_1_split_1  
I0313 00:40:24.550824 13825 net.cpp:96] Setting up label_cifar_1_split  
I0313 00:40:24.550843 13825 net.cpp:103] Top shape: 100 1 1 1 (100)  
I0313 00:40:24.550855 13825 net.cpp:103] Top shape: 100 1 1 1 (100)  
I0313 00:40:24.550866 13825 net.cpp:113] Memory required for data: 1230000  
I0313 00:40:24.550889 13825 net.cpp:67] Creating Layer conv1  
I0313 00:40:24.550902 13825 net.cpp:394] conv1 <- data  
I0313 00:40:24.550926 13825 net.cpp:356] conv1 -> conv1  
I0313 00:40:24.550951 13825 net.cpp:96] Setting up conv1  
I0313 00:40:24.551573 13825 net.cpp:103] Top shape: 100 32 32 32 (3276800)  
I0313 00:40:24.551583 13825 net.cpp:113] Memory required for data: 14337200  
I0313 00:40:24.551599 13825 net.cpp:67] Creating Layer pool1  
I0313 00:40:24.551605 13825 net.cpp:394] pool1 <- conv1  
I0313 00:40:24.551615 13825 net.cpp:356] pool1 -> pool1  
I0313 00:40:24.551625 13825 net.cpp:96] Setting up pool1  
I0313 00:40:24.551633 13825 net.cpp:103] Top shape: 100 32 16 16 (819200)  
I0313 00:40:24.551638 13825 net.cpp:113] Memory required for data: 17614000  
I0313 00:40:24.551652 13825 net.cpp:67] Creating Layer relu1  
I0313 00:40:24.551658 13825 net.cpp:394] relu1 <- pool1  
I0313 00:40:24.551667 13825 net.cpp:345] relu1 -> pool1 (in-place)  
I0313 00:40:24.551676 13825 net.cpp:96] Setting up relu1  
I0313 00:40:24.551682 13825 net.cpp:103] Top shape: 100 32 16 16 (819200)  
I0313 00:40:24.551687 13825 net.cpp:113] Memory required for data: 20890800  
I0313 00:40:24.551695 13825 net.cpp:67] Creating Layer conv2  
I0313 00:40:24.551700 13825 net.cpp:394] conv2 <- pool1  
I0313 00:40:24.551710 13825 net.cpp:356] conv2 -> conv2  
I0313 00:40:24.551720 13825 net.cpp:96] Setting up conv2  
I0313 00:40:24.554986 13825 net.cpp:103] Top shape: 100 32 16 16 (819200)  
I0313 00:40:24.554996 13825 net.cpp:113] Memory required for data: 24167600  
I0313 00:40:24.555009 13825 net.cpp:67] Creating Layer relu2  
I0313 00:40:24.555024 13825 net.cpp:394] relu2 <- conv2  
I0313 00:40:24.555034 13825 net.cpp:345] relu2 -> conv2 (in-place)  
I0313 00:40:24.555043 13825 net.cpp:96] Setting up relu2  
I0313 00:40:24.555049 13825 net.cpp:103] Top shape: 100 32 16 16 (819200)  
I0313 00:40:24.555054 13825 net.cpp:113] Memory required for data: 27444400  
I0313 00:40:24.555061 13825 net.cpp:67] Creating Layer pool2  
I0313 00:40:24.555068 13825 net.cpp:394] pool2 <- conv2  
I0313 00:40:24.555076 13825 net.cpp:356] pool2 -> pool2  
I0313 00:40:24.555085 13825 net.cpp:96] Setting up pool2  
I0313 00:40:24.555094 13825 net.cpp:103] Top shape: 100 32 8 8 (204800)  
I0313 00:40:24.555099 13825 net.cpp:113] Memory required for data: 28263600  
I0313 00:40:24.555109 13825 net.cpp:67] Creating Layer conv3  
I0313 00:40:24.555114 13825 net.cpp:394] conv3 <- pool2  
I0313 00:40:24.555124 13825 net.cpp:356] conv3 -> conv3  
I0313 00:40:24.555135 13825 net.cpp:96] Setting up conv3  
I0313 00:40:24.561589 13825 net.cpp:103] Top shape: 100 64 8 8 (409600)  
I0313 00:40:24.561599 13825 net.cpp:113] Memory required for data: 29902000  
I0313 00:40:24.561611 13825 net.cpp:67] Creating Layer relu3  
I0313 00:40:24.561619 13825 net.cpp:394] relu3 <- conv3  
I0313 00:40:24.561627 13825 net.cpp:345] relu3 -> conv3 (in-place)  
I0313 00:40:24.561636 13825 net.cpp:96] Setting up relu3  
I0313 00:40:24.561642 13825 net.cpp:103] Top shape: 100 64 8 8 (409600)  
I0313 00:40:24.561646 13825 net.cpp:113] Memory required for data: 31540400  
I0313 00:40:24.561655 13825 net.cpp:67] Creating Layer pool3  
I0313 00:40:24.561661 13825 net.cpp:394] pool3 <- conv3  
I0313 00:40:24.561669 13825 net.cpp:356] pool3 -> pool3  
I0313 00:40:24.561678 13825 net.cpp:96] Setting up pool3  
I0313 00:40:24.561686 13825 net.cpp:103] Top shape: 100 64 4 4 (102400)  
I0313 00:40:24.561691 13825 net.cpp:113] Memory required for data: 31950000  
I0313 00:40:24.561699 13825 net.cpp:67] Creating Layer ip1  
I0313 00:40:24.561704 13825 net.cpp:394] ip1 <- pool3  
I0313 00:40:24.561714 13825 net.cpp:356] ip1 -> ip1  
I0313 00:40:24.561724 13825 net.cpp:96] Setting up ip1  
I0313 00:40:24.569967 13825 net.cpp:103] Top shape: 100 64 1 1 (6400)  
I0313 00:40:24.569975 13825 net.cpp:113] Memory required for data: 31975600  
I0313 00:40:24.569988 13825 net.cpp:67] Creating Layer ip2  
I0313 00:40:24.569993 13825 net.cpp:394] ip2 <- ip1  
I0313 00:40:24.570004 13825 net.cpp:356] ip2 -> ip2  
I0313 00:40:24.570014 13825 net.cpp:96] Setting up ip2  
I0313 00:40:24.570108 13825 net.cpp:103] Top shape: 100 10 1 1 (1000)  
I0313 00:40:24.570114 13825 net.cpp:113] Memory required for data: 31979600  
I0313 00:40:24.570127 13825 net.cpp:67] Creating Layer ip2_ip2_0_split  
I0313 00:40:24.570134 13825 net.cpp:394] ip2_ip2_0_split <- ip2  
I0313 00:40:24.570143 13825 net.cpp:356] ip2_ip2_0_split -> ip2_ip2_0_split_0  
I0313 00:40:24.570154 13825 net.cpp:356] ip2_ip2_0_split -> ip2_ip2_0_split_1  
I0313 00:40:24.570163 13825 net.cpp:96] Setting up ip2_ip2_0_split  
I0313 00:40:24.570171 13825 net.cpp:103] Top shape: 100 10 1 1 (1000)  
I0313 00:40:24.570176 13825 net.cpp:103] Top shape: 100 10 1 1 (1000)  
I0313 00:40:24.570181 13825 net.cpp:113] Memory required for data: 31987600  
I0313 00:40:24.570189 13825 net.cpp:67] Creating Layer accuracy  
I0313 00:40:24.570194 13825 net.cpp:394] accuracy <- ip2_ip2_0_split_0  
I0313 00:40:24.570202 13825 net.cpp:394] accuracy <- label_cifar_1_split_0  
I0313 00:40:24.570214 13825 net.cpp:356] accuracy -> accuracy  
I0313 00:40:24.570222 13825 net.cpp:96] Setting up accuracy  
I0313 00:40:24.570230 13825 net.cpp:103] Top shape: 1 1 1 1 (1)  
I0313 00:40:24.570235 13825 net.cpp:113] Memory required for data: 31987604  
I0313 00:40:24.570245 13825 net.cpp:67] Creating Layer loss  
I0313 00:40:24.570250 13825 net.cpp:394] loss <- ip2_ip2_0_split_1  
I0313 00:40:24.570257 13825 net.cpp:394] loss <- label_cifar_1_split_1  
I0313 00:40:24.570266 13825 net.cpp:356] loss -> loss  
I0313 00:40:24.570274 13825 net.cpp:96] Setting up loss  
I0313 00:40:24.570286 13825 net.cpp:103] Top shape: 1 1 1 1 (1)  
I0313 00:40:24.570291 13825 net.cpp:109]     with loss weight 1  
I0313 00:40:24.570305 13825 net.cpp:113] Memory required for data: 31987608  
I0313 00:40:24.570312 13825 net.cpp:170] loss needs backward computation.  
I0313 00:40:24.570317 13825 net.cpp:172] accuracy does not need backward computation.  
I0313 00:40:24.570322 13825 net.cpp:170] ip2_ip2_0_split needs backward computation.  
I0313 00:40:24.570338 13825 net.cpp:170] ip2 needs backward computation.  
I0313 00:40:24.570349 13825 net.cpp:170] ip1 needs backward computation.  
I0313 00:40:24.570359 13825 net.cpp:170] pool3 needs backward computation.  
I0313 00:40:24.570372 13825 net.cpp:170] relu3 needs backward computation.  
I0313 00:40:24.570384 13825 net.cpp:170] conv3 needs backward computation.  
I0313 00:40:24.570396 13825 net.cpp:170] pool2 needs backward computation.  
I0313 00:40:24.570406 13825 net.cpp:170] relu2 needs backward computation.  
I0313 00:40:24.570420 13825 net.cpp:170] conv2 needs backward computation.  
I0313 00:40:24.570432 13825 net.cpp:170] relu1 needs backward computation.  
I0313 00:40:24.570442 13825 net.cpp:170] pool1 needs backward computation.  
I0313 00:40:24.570456 13825 net.cpp:170] conv1 needs backward computation.  
I0313 00:40:24.570471 13825 net.cpp:172] label_cifar_1_split does not need backward computation.  
I0313 00:40:24.570482 13825 net.cpp:172] cifar does not need backward computation.  
I0313 00:40:24.570494 13825 net.cpp:208] This network produces output accuracy  
I0313 00:40:24.570505 13825 net.cpp:208] This network produces output loss  
I0313 00:40:24.570536 13825 net.cpp:467] Collecting Learning Rate and Weight Decay.  
I0313 00:40:24.570549 13825 net.cpp:219] Network initialization done.  
I0313 00:40:24.570554 13825 net.cpp:220] Memory required for data: 31987608  
I0313 00:40:24.570590 13825 solver.cpp:41] Solver scaffolding done.  
I0313 00:40:24.570595 13825 solver.cpp:160] Solving CIFAR10_quick  
I0313 00:40:24.570600 13825 solver.cpp:161] Learning Rate Policy: fixed  



Additional note:
When using the CIFAR-10 model, you may need to convert the binaryproto mean file into .npy format. This is done by calling blobproto_to_array in Caffe's Python io module, which can generate the npy file. Sometimes, however, it fails with "axes don't match array"; in that case the function needs to be modified, see https://github.com/BVLC/caffe/issues/420
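The conversion described above typically looks like the following sketch. It requires pycaffe to be installed, and the file paths are placeholders; this is a sketch of the usual recipe, not code from this model's distribution:

```python
import numpy as np
import caffe

# Read the mean file produced by Caffe's compute_image_mean tool
blob = caffe.proto.caffe_pb2.BlobProto()
with open('mean.binaryproto', 'rb') as f:
    blob.ParseFromString(f.read())

# blobproto_to_array returns a 4-D array (N, C, H, W); keep the first item.
# If this raises "axes don't match array", see the GitHub issue linked above.
arr = caffe.io.blobproto_to_array(blob)
np.save('mean.npy', arr[0])
```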


Reposted from blog.csdn.net/lynnandwei/article/details/44302175