Tang Yudi's Deep Learning Framework Caffe Series - 11

deploy.prototxt

This file is very similar to the training .prototxt, but it is the configuration used to test the model once training has produced it.

Content-wise, every layer except the data layer is identical: in deploy.prototxt the data layer is replaced by an Input layer with a fixed input shape (visible in the log below), and training-only layers are dropped, so the SoftmaxWithLoss loss layer becomes a plain Softmax that outputs probabilities.

Caffe ships an example of this file at: /home/apple/caffe/models/bvlc_reference_caffenet/deploy.prototxt

Key points:

If mean subtraction was applied to the images during training, then the same mean subtraction must be applied to the test data when the model is used.

When testing the model, we can use the mean file computed from our own dataset, or the mean file that Caffe provides. (Why is Caffe's mean file also usable? Because it was computed over a very large dataset, ImageNet, so the per-channel means are fairly universal and can serve as a good approximation.)
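
If you trained with a mean file produced by Caffe's compute_image_mean tool, it will be in .binaryproto format, while the Python Transformer used below expects a NumPy array. A minimal conversion sketch, assuming hypothetical file names mean.binaryproto and mean.npy:

import numpy as np
import caffe

blob = caffe.proto.caffe_pb2.BlobProto()
with open('mean.binaryproto', 'rb') as f:    # output of compute_image_mean (hypothetical path)
    blob.ParseFromString(f.read())
mean = caffe.io.blobproto_to_array(blob)[0]  # C x H x W mean image
np.save('mean.npy', mean)                    # load with np.load(); reduce to per-channel means via .mean(1).mean(1)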

Whatever preprocessing was applied during training must also be applied during testing.

Libraries such as cv2 read images in as H*W*C.


Caffe takes input as batch * C * H * W, so an image loaded with cv2 must be transposed before it is fed to the network.
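
A minimal sketch of that conversion with NumPy ('cat.jpg' is a placeholder path):

import numpy as np
import cv2

img = cv2.imread('cat.jpg')                     # H x W x C, BGR channel order, uint8
blob = img.transpose(2, 0, 1)[np.newaxis, ...]  # now 1 x C x H x W, as Caffe expects
print blob.shape                                # e.g. (1, 3, 360, 480)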

import numpy as np
import sys
caffe_root = '/home/apple/caffe/'  # adjust this to your own Caffe installation root
sys.path.insert(0, caffe_root + 'python')

import caffe

caffe.set_mode_cpu()

model_def = '/home/apple/caffe/models/bvlc_reference_caffenet/deploy.prototxt'
model_weights = '/home/apple/caffe_case/bvlc_reference_caffenet.caffemodel'

net = caffe.Net(model_def,      # defines the structure of the model
                model_weights,  # contains the trained weights
                caffe.TEST)     # use test mode (e.g., don't perform dropout)
mu = np.load(caffe_root + 'python/caffe/imagenet/ilsvrc_2012_mean.npy')
mu = mu.mean(1).mean(1)  # average over pixels to obtain the mean (BGR) pixel values
print 'mean-subtracted values:', zip('BGR', mu)

# create transformer for the input called 'data'
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})

transformer.set_transpose('data', (2,0,1))  # move image channels to outermost dimension
transformer.set_mean('data', mu)            # subtract the dataset-mean value in each channel
transformer.set_raw_scale('data', 255)      # rescale from [0, 1] to [0, 255]
transformer.set_channel_swap('data', (2,1,0))  # swap channels from RGB to BGR

net.blobs['data'].reshape(1,        # batch size
                          3,         # 3-channel (BGR) images
                          227, 227)  # image size is 227x227

image = caffe.io.load_image(caffe_root + 'examples/images/cat.jpg')
transformed_image = transformer.preprocess('data', image)

# copy the image data into the memory allocated for the net
net.blobs['data'].data[...] = transformed_image

### perform classification
output = net.forward()

output_prob = output['prob'][0]  # the output probability vector for the first image in the batch

print 'predicted class is:', output_prob.argmax()
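
The script stops at the argmax class. As an optional extension appended to the script above (following the pattern of Caffe's official classification example), you can sort the probability vector to inspect the top-5 predictions:

top_inds = output_prob.argsort()[::-1][:5]  # indices of the five largest probabilities
print 'top-5 probabilities and classes:', zip(output_prob[top_inds], top_inds)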

Run output:

apple@apple:~/caffe_case$ python test_model.py 
WARNING: Logging before InitGoogleLogging() is written to STDERR
W1216 19:10:03.724741  7088 _caffe.cpp:139] DEPRECATION WARNING - deprecated use of Python interface
W1216 19:10:03.725069  7088 _caffe.cpp:140] Use this instead (with the named "weights" parameter):
W1216 19:10:03.725086  7088 _caffe.cpp:142] Net('/home/apple/caffe/models/bvlc_reference_caffenet/deploy.prototxt', 1, weights='/home/apple/caffe_case/bvlc_reference_caffenet.caffemodel')
I1216 19:10:03.767719  7088 net.cpp:53] Initializing net from parameters: 
name: "CaffeNet"
state {
  phase: TEST
  level: 0
}
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param {
    shape {
      dim: 10
      dim: 3
      dim: 227
      dim: 227
    }
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "pool1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "norm1"
  top: "conv2"
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "pool2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "norm2"
  top: "conv3"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "pool5"
  top: "fc6"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  inner_product_param {
    num_output: 4096
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc8"
  type: "InnerProduct"
  bottom: "fc7"
  top: "fc8"
  inner_product_param {
    num_output: 1000
  }
}
layer {
  name: "prob"
  type: "Softmax"
  bottom: "fc8"
  top: "prob"
}
I1216 19:10:03.767989  7088 layer_factory.hpp:77] Creating layer data
I1216 19:10:03.768115  7088 net.cpp:86] Creating Layer data
I1216 19:10:03.768229  7088 net.cpp:382] data -> data
I1216 19:10:03.768355  7088 net.cpp:124] Setting up data
I1216 19:10:03.768426  7088 net.cpp:131] Top shape: 10 3 227 227 (1545870)
I1216 19:10:03.768441  7088 net.cpp:139] Memory required for data: 6183480
I1216 19:10:03.768450  7088 layer_factory.hpp:77] Creating layer conv1
I1216 19:10:03.768465  7088 net.cpp:86] Creating Layer conv1
I1216 19:10:03.768476  7088 net.cpp:408] conv1 <- data
I1216 19:10:03.768486  7088 net.cpp:382] conv1 -> conv1
I1216 19:10:03.769486  7088 net.cpp:124] Setting up conv1
I1216 19:10:03.769539  7088 net.cpp:131] Top shape: 10 96 55 55 (2904000)
I1216 19:10:03.769547  7088 net.cpp:139] Memory required for data: 17799480
I1216 19:10:03.769565  7088 layer_factory.hpp:77] Creating layer relu1
I1216 19:10:03.769577  7088 net.cpp:86] Creating Layer relu1
I1216 19:10:03.769584  7088 net.cpp:408] relu1 <- conv1
I1216 19:10:03.769596  7088 net.cpp:369] relu1 -> conv1 (in-place)
I1216 19:10:03.769608  7088 net.cpp:124] Setting up relu1
I1216 19:10:03.769615  7088 net.cpp:131] Top shape: 10 96 55 55 (2904000)
I1216 19:10:03.769619  7088 net.cpp:139] Memory required for data: 29415480
I1216 19:10:03.769624  7088 layer_factory.hpp:77] Creating layer pool1
I1216 19:10:03.769632  7088 net.cpp:86] Creating Layer pool1
I1216 19:10:03.769637  7088 net.cpp:408] pool1 <- conv1
I1216 19:10:03.769644  7088 net.cpp:382] pool1 -> pool1
I1216 19:10:03.769721  7088 net.cpp:124] Setting up pool1
I1216 19:10:03.769767  7088 net.cpp:131] Top shape: 10 96 27 27 (699840)
I1216 19:10:03.769774  7088 net.cpp:139] Memory required for data: 32214840
I1216 19:10:03.769780  7088 layer_factory.hpp:77] Creating layer norm1
I1216 19:10:03.769799  7088 net.cpp:86] Creating Layer norm1
I1216 19:10:03.769806  7088 net.cpp:408] norm1 <- pool1
I1216 19:10:03.769815  7088 net.cpp:382] norm1 -> norm1
I1216 19:10:03.769888  7088 net.cpp:124] Setting up norm1
I1216 19:10:03.769904  7088 net.cpp:131] Top shape: 10 96 27 27 (699840)
I1216 19:10:03.769909  7088 net.cpp:139] Memory required for data: 35014200
I1216 19:10:03.769914  7088 layer_factory.hpp:77] Creating layer conv2
I1216 19:10:03.769927  7088 net.cpp:86] Creating Layer conv2
I1216 19:10:03.769932  7088 net.cpp:408] conv2 <- norm1
I1216 19:10:03.769942  7088 net.cpp:382] conv2 -> conv2
I1216 19:10:03.884491  7088 net.cpp:124] Setting up conv2
I1216 19:10:03.884582  7088 net.cpp:131] Top shape: 10 256 27 27 (1866240)
I1216 19:10:03.884591  7088 net.cpp:139] Memory required for data: 42479160
I1216 19:10:03.884613  7088 layer_factory.hpp:77] Creating layer relu2
I1216 19:10:03.884627  7088 net.cpp:86] Creating Layer relu2
I1216 19:10:03.884634  7088 net.cpp:408] relu2 <- conv2
I1216 19:10:03.884649  7088 net.cpp:369] relu2 -> conv2 (in-place)
I1216 19:10:03.884701  7088 net.cpp:124] Setting up relu2
I1216 19:10:03.884718  7088 net.cpp:131] Top shape: 10 256 27 27 (1866240)
I1216 19:10:03.884723  7088 net.cpp:139] Memory required for data: 49944120
I1216 19:10:03.884728  7088 layer_factory.hpp:77] Creating layer pool2
I1216 19:10:03.884739  7088 net.cpp:86] Creating Layer pool2
I1216 19:10:03.884744  7088 net.cpp:408] pool2 <- conv2
I1216 19:10:03.884753  7088 net.cpp:382] pool2 -> pool2
I1216 19:10:03.884773  7088 net.cpp:124] Setting up pool2
I1216 19:10:03.884835  7088 net.cpp:131] Top shape: 10 256 13 13 (432640)
I1216 19:10:03.884845  7088 net.cpp:139] Memory required for data: 51674680
I1216 19:10:03.884855  7088 layer_factory.hpp:77] Creating layer norm2
I1216 19:10:03.884877  7088 net.cpp:86] Creating Layer norm2
I1216 19:10:03.884889  7088 net.cpp:408] norm2 <- pool2
I1216 19:10:03.884898  7088 net.cpp:382] norm2 -> norm2
I1216 19:10:03.884912  7088 net.cpp:124] Setting up norm2
I1216 19:10:03.884953  7088 net.cpp:131] Top shape: 10 256 13 13 (432640)
I1216 19:10:03.884960  7088 net.cpp:139] Memory required for data: 53405240
I1216 19:10:03.884995  7088 layer_factory.hpp:77] Creating layer conv3
I1216 19:10:03.885018  7088 net.cpp:86] Creating Layer conv3
I1216 19:10:03.885025  7088 net.cpp:408] conv3 <- norm2
I1216 19:10:03.885103  7088 net.cpp:382] conv3 -> conv3
I1216 19:10:04.065801  7088 net.cpp:124] Setting up conv3
I1216 19:10:04.065944  7088 net.cpp:131] Top shape: 10 384 13 13 (648960)
I1216 19:10:04.065963  7088 net.cpp:139] Memory required for data: 56001080
I1216 19:10:04.065999  7088 layer_factory.hpp:77] Creating layer relu3
I1216 19:10:04.066082  7088 net.cpp:86] Creating Layer relu3
I1216 19:10:04.066152  7088 net.cpp:408] relu3 <- conv3
I1216 19:10:04.066187  7088 net.cpp:369] relu3 -> conv3 (in-place)
I1216 19:10:04.066215  7088 net.cpp:124] Setting up relu3
I1216 19:10:04.066319  7088 net.cpp:131] Top shape: 10 384 13 13 (648960)
I1216 19:10:04.066345  7088 net.cpp:139] Memory required for data: 58596920
I1216 19:10:04.066365  7088 layer_factory.hpp:77] Creating layer conv4
I1216 19:10:04.066390  7088 net.cpp:86] Creating Layer conv4
I1216 19:10:04.066459  7088 net.cpp:408] conv4 <- conv3
I1216 19:10:04.066601  7088 net.cpp:382] conv4 -> conv4
I1216 19:10:04.228473  7088 net.cpp:124] Setting up conv4
I1216 19:10:04.228603  7088 net.cpp:131] Top shape: 10 384 13 13 (648960)
I1216 19:10:04.228621  7088 net.cpp:139] Memory required for data: 61192760
I1216 19:10:04.228643  7088 layer_factory.hpp:77] Creating layer relu4
I1216 19:10:04.228667  7088 net.cpp:86] Creating Layer relu4
I1216 19:10:04.228679  7088 net.cpp:408] relu4 <- conv4
I1216 19:10:04.228781  7088 net.cpp:369] relu4 -> conv4 (in-place)
I1216 19:10:04.228868  7088 net.cpp:124] Setting up relu4
I1216 19:10:04.228936  7088 net.cpp:131] Top shape: 10 384 13 13 (648960)
I1216 19:10:04.229012  7088 net.cpp:139] Memory required for data: 63788600
I1216 19:10:04.229075  7088 layer_factory.hpp:77] Creating layer conv5
I1216 19:10:04.229113  7088 net.cpp:86] Creating Layer conv5
I1216 19:10:04.229135  7088 net.cpp:408] conv5 <- conv4
I1216 19:10:04.229151  7088 net.cpp:382] conv5 -> conv5
I1216 19:10:04.337782  7088 net.cpp:124] Setting up conv5
I1216 19:10:04.337918  7088 net.cpp:131] Top shape: 10 256 13 13 (432640)
I1216 19:10:04.337935  7088 net.cpp:139] Memory required for data: 65519160
I1216 19:10:04.337975  7088 layer_factory.hpp:77] Creating layer relu5
I1216 19:10:04.338062  7088 net.cpp:86] Creating Layer relu5
I1216 19:10:04.338130  7088 net.cpp:408] relu5 <- conv5
I1216 19:10:04.338157  7088 net.cpp:369] relu5 -> conv5 (in-place)
I1216 19:10:04.338183  7088 net.cpp:124] Setting up relu5
I1216 19:10:04.338197  7088 net.cpp:131] Top shape: 10 256 13 13 (432640)
I1216 19:10:04.338207  7088 net.cpp:139] Memory required for data: 67249720
I1216 19:10:04.338215  7088 layer_factory.hpp:77] Creating layer pool5
I1216 19:10:04.338232  7088 net.cpp:86] Creating Layer pool5
I1216 19:10:04.338241  7088 net.cpp:408] pool5 <- conv5
I1216 19:10:04.338266  7088 net.cpp:382] pool5 -> pool5
I1216 19:10:04.338296  7088 net.cpp:124] Setting up pool5
I1216 19:10:04.338310  7088 net.cpp:131] Top shape: 10 256 6 6 (92160)
I1216 19:10:04.338320  7088 net.cpp:139] Memory required for data: 67618360
I1216 19:10:04.338327  7088 layer_factory.hpp:77] Creating layer fc6
I1216 19:10:04.338404  7088 net.cpp:86] Creating Layer fc6
I1216 19:10:04.338421  7088 net.cpp:408] fc6 <- pool5
I1216 19:10:04.338438  7088 net.cpp:382] fc6 -> fc6
I1216 19:10:10.817589  7088 net.cpp:124] Setting up fc6
I1216 19:10:10.817629  7088 net.cpp:131] Top shape: 10 4096 (40960)
I1216 19:10:10.817632  7088 net.cpp:139] Memory required for data: 67782200
I1216 19:10:10.817641  7088 layer_factory.hpp:77] Creating layer relu6
I1216 19:10:10.817648  7088 net.cpp:86] Creating Layer relu6
I1216 19:10:10.817651  7088 net.cpp:408] relu6 <- fc6
I1216 19:10:10.817656  7088 net.cpp:369] relu6 -> fc6 (in-place)
I1216 19:10:10.817662  7088 net.cpp:124] Setting up relu6
I1216 19:10:10.817677  7088 net.cpp:131] Top shape: 10 4096 (40960)
I1216 19:10:10.817690  7088 net.cpp:139] Memory required for data: 67946040
I1216 19:10:10.817693  7088 layer_factory.hpp:77] Creating layer drop6
I1216 19:10:10.817706  7088 net.cpp:86] Creating Layer drop6
I1216 19:10:10.817709  7088 net.cpp:408] drop6 <- fc6
I1216 19:10:10.817713  7088 net.cpp:369] drop6 -> fc6 (in-place)
I1216 19:10:10.817718  7088 net.cpp:124] Setting up drop6
I1216 19:10:10.817731  7088 net.cpp:131] Top shape: 10 4096 (40960)
I1216 19:10:10.817734  7088 net.cpp:139] Memory required for data: 68109880
I1216 19:10:10.817736  7088 layer_factory.hpp:77] Creating layer fc7
I1216 19:10:10.817744  7088 net.cpp:86] Creating Layer fc7
I1216 19:10:10.817745  7088 net.cpp:408] fc7 <- fc6
I1216 19:10:10.817772  7088 net.cpp:382] fc7 -> fc7
I1216 19:10:12.145171  7088 net.cpp:124] Setting up fc7
I1216 19:10:12.145304  7088 net.cpp:131] Top shape: 10 4096 (40960)
I1216 19:10:12.145321  7088 net.cpp:139] Memory required for data: 68273720
I1216 19:10:12.145349  7088 layer_factory.hpp:77] Creating layer relu7
I1216 19:10:12.145380  7088 net.cpp:86] Creating Layer relu7
I1216 19:10:12.145444  7088 net.cpp:408] relu7 <- fc7
I1216 19:10:12.145470  7088 net.cpp:369] relu7 -> fc7 (in-place)
I1216 19:10:12.145493  7088 net.cpp:124] Setting up relu7
I1216 19:10:12.145506  7088 net.cpp:131] Top shape: 10 4096 (40960)
I1216 19:10:12.145514  7088 net.cpp:139] Memory required for data: 68437560
I1216 19:10:12.145521  7088 layer_factory.hpp:77] Creating layer drop7
I1216 19:10:12.145536  7088 net.cpp:86] Creating Layer drop7
I1216 19:10:12.145545  7088 net.cpp:408] drop7 <- fc7
I1216 19:10:12.145668  7088 net.cpp:369] drop7 -> fc7 (in-place)
I1216 19:10:12.145766  7088 net.cpp:124] Setting up drop7
I1216 19:10:12.145831  7088 net.cpp:131] Top shape: 10 4096 (40960)
I1216 19:10:12.145843  7088 net.cpp:139] Memory required for data: 68601400
I1216 19:10:12.145853  7088 layer_factory.hpp:77] Creating layer fc8
I1216 19:10:12.145871  7088 net.cpp:86] Creating Layer fc8
I1216 19:10:12.145880  7088 net.cpp:408] fc8 <- fc7
I1216 19:10:12.145895  7088 net.cpp:382] fc8 -> fc8
I1216 19:10:12.464874  7088 net.cpp:124] Setting up fc8
I1216 19:10:12.464916  7088 net.cpp:131] Top shape: 10 1000 (10000)
I1216 19:10:12.464920  7088 net.cpp:139] Memory required for data: 68641400
I1216 19:10:12.464928  7088 layer_factory.hpp:77] Creating layer prob
I1216 19:10:12.464936  7088 net.cpp:86] Creating Layer prob
I1216 19:10:12.464939  7088 net.cpp:408] prob <- fc8
I1216 19:10:12.464944  7088 net.cpp:382] prob -> prob
I1216 19:10:12.464979  7088 net.cpp:124] Setting up prob
I1216 19:10:12.464996  7088 net.cpp:131] Top shape: 10 1000 (10000)
I1216 19:10:12.465001  7088 net.cpp:139] Memory required for data: 68681400
I1216 19:10:12.465003  7088 net.cpp:202] prob does not need backward computation.
I1216 19:10:12.465006  7088 net.cpp:202] fc8 does not need backward computation.
I1216 19:10:12.465008  7088 net.cpp:202] drop7 does not need backward computation.
I1216 19:10:12.465013  7088 net.cpp:202] relu7 does not need backward computation.
I1216 19:10:12.465014  7088 net.cpp:202] fc7 does not need backward computation.
I1216 19:10:12.465016  7088 net.cpp:202] drop6 does not need backward computation.
I1216 19:10:12.465019  7088 net.cpp:202] relu6 does not need backward computation.
I1216 19:10:12.465023  7088 net.cpp:202] fc6 does not need backward computation.
I1216 19:10:12.465025  7088 net.cpp:202] pool5 does not need backward computation.
I1216 19:10:12.465028  7088 net.cpp:202] relu5 does not need backward computation.
I1216 19:10:12.465041  7088 net.cpp:202] conv5 does not need backward computation.
I1216 19:10:12.465059  7088 net.cpp:202] relu4 does not need backward computation.
I1216 19:10:12.465070  7088 net.cpp:202] conv4 does not need backward computation.
I1216 19:10:12.465072  7088 net.cpp:202] relu3 does not need backward computation.
I1216 19:10:12.465075  7088 net.cpp:202] conv3 does not need backward computation.
I1216 19:10:12.465077  7088 net.cpp:202] norm2 does not need backward computation.
I1216 19:10:12.465080  7088 net.cpp:202] pool2 does not need backward computation.
I1216 19:10:12.465083  7088 net.cpp:202] relu2 does not need backward computation.
I1216 19:10:12.465086  7088 net.cpp:202] conv2 does not need backward computation.
I1216 19:10:12.465088  7088 net.cpp:202] norm1 does not need backward computation.
I1216 19:10:12.465090  7088 net.cpp:202] pool1 does not need backward computation.
I1216 19:10:12.465095  7088 net.cpp:202] relu1 does not need backward computation.
I1216 19:10:12.465096  7088 net.cpp:202] conv1 does not need backward computation.
I1216 19:10:12.465098  7088 net.cpp:202] data does not need backward computation.
I1216 19:10:12.465101  7088 net.cpp:244] This network produces output prob
I1216 19:10:12.465112  7088 net.cpp:257] Network initialization done.
I1216 19:10:18.484563  7088 upgrade_proto.cpp:46] Attempting to upgrade input file specified using deprecated transformation parameters: /home/apple/caffe_case/bvlc_reference_caffenet.caffemodel
I1216 19:10:18.484611  7088 upgrade_proto.cpp:49] Successfully upgraded file specified using deprecated data transformation parameters.
W1216 19:10:18.484616  7088 upgrade_proto.cpp:51] Note that future Caffe releases will only support transform_param messages for transformation fields.
I1216 19:10:18.484617  7088 upgrade_proto.cpp:55] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/apple/caffe_case/bvlc_reference_caffenet.caffemodel
I1216 19:10:56.732518  7088 upgrade_proto.cpp:63] Successfully upgraded file specified using deprecated V1LayerParameter
I1216 19:10:56.795819  7088 net.cpp:746] Ignoring source layer loss
mean-subtracted values: [('B', 104.0069879317889), ('G', 116.66876761696767), ('R', 122.6789143406786)]
/usr/local/lib/python2.7/dist-packages/skimage/io/_io.py:49: UserWarning: `as_grey` has been deprecated in favor of `as_gray`
  warn('`as_grey` has been deprecated in favor of `as_gray`')
/usr/local/lib/python2.7/dist-packages/skimage/transform/_warps.py:110: UserWarning: Anti-aliasing will be enabled by default in skimage 0.15 to avoid aliasing artifacts when down-sampling images.
  warn("Anti-aliasing will be enabled by default in skimage 0.15 to "
predicted class is: 281
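
Class index 281 corresponds to 'tabby, tabby cat' in the ILSVRC2012 label set. To print a human-readable label instead of a raw index, append a lookup in synset_words.txt to test_model.py; this sketch assumes the label file has been fetched with Caffe's data/ilsvrc12/get_ilsvrc_aux.sh script:

labels_file = caffe_root + 'data/ilsvrc12/synset_words.txt'
labels = np.loadtxt(labels_file, str, delimiter='\t')
print 'output label:', labels[output_prob.argmax()]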

Reprinted from blog.csdn.net/baidu_40840693/article/details/85028848