My adventure in machine learning started about a year ago with caffe. I use the python interface, and I learned a lot from the ipython notebooks provided in the caffe/examples/ directory.

After successfully running the notebook 01-learning-lenet.ipynb and being able to load the caffe model I had created (the .caffemodel file), I ventured to apply the same LeNet network to my own data for character recognition.

So I used the convert_imageset tool, as detailed here, to create my own lmdb files for train and test. My images are .png, 60x60 in size, 1 channel. I followed exactly the same approach as in the notebook above, adjusting the relevant parameters, paths and file names. I can plot accuracy and loss (accuracy > 90%), and with a small additional script I wrote I can output which images lead to mispredicted labels. Everything seems to be in good shape.
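For reference, that small script is essentially the following sketch. It assumes the solver / test-net setup from the 01-learning-lenet notebook; the solver prototxt path, the number of test batches and the 'score' / 'label' blob names are from my setup and may differ in others.

import numpy as np
import caffe

caffe.set_mode_cpu()
# load the solver and copy the trained weights into its test net
solver = caffe.SGDSolver('examples/svi/lenet_auto_solver.prototxt')
solver.test_nets[0].copy_from('examples/svi/custom_iter_100.caffemodel')

mispredicted = []
for batch in range(10):                                    # 10 test batches; batch size comes from the prototxt
    solver.test_nets[0].forward()                          # pulls the next batch from the test lmdb
    preds = solver.test_nets[0].blobs['score'].data.argmax(axis=1)
    labels = solver.test_nets[0].blobs['label'].data.astype(int)
    for i in np.where(preds != labels)[0]:
        mispredicted.append((batch, int(i), int(preds[i]), int(labels[i])))

print('%d mispredicted images' % len(mispredicted))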

But when I want to load my caffe model, it fails and my kernel dies.

The causes I have found so far that can make a caffemodel fail to load:

  • Wrong path and/or file names for model_def and model_weights (easy to identify from the error message; in this case the kernel does not die)

  • deploy.prototxt not defined correctly.

  • It should follow the guidelines on the BVLC caffe wiki, i.e. remove any layers containing data and labels from train.prototxt and add a final layer of type "Softmax".

  • First layer dimensions set incorrectly: the dimensions of the input images should match those declared in the first layer ( input_param { shape: { dim: 1 dim: 1 dim: 60 dim: 60 } } )

  • Wrong size of the caffemodel (e.g. only a few KB); its content can be checked with e.g. this script (a rough check is also sketched after this list)

  • Incorrect lmdb files (wrong size or not created); see the options of convert_imageset here (a quick check is sketched after this list as well)

  • I've set the resize flag to false (it's a bit counter-intuitive: resize set to 0 means no resizing, but ...)

  • I tried adding the --gray flag to force the images to 1 channel and adjusted the corresponding dimension in deploy.prototxt
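To check the caffemodel content, the rough idea is to parse it as a NetParameter and print the layer names and blob shapes. This is only a sketch, not the script linked above, and it assumes the snapshot stores the newer blob.shape field (older snapshots use num/channels/height/width instead).

from caffe.proto import caffe_pb2

# parse the snapshot as a NetParameter protobuf message
net_param = caffe_pb2.NetParameter()
with open('/home/Prog/caffe/examples/svi/custom_iter_100.caffemodel', 'rb') as f:
    net_param.ParseFromString(f.read())

# print each layer together with the shapes of its learned blobs
for layer in net_param.layer:
    shapes = [tuple(blob.shape.dim) for blob in layer.blobs]
    print('%s (%s): %s' % (layer.name, layer.type, shapes))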
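And to verify the lmdb itself, I read back one entry and compare its dimensions with the input shape declared in deploy.prototxt. Again only a sketch: it needs the lmdb python package, and the lmdb path here is a placeholder for my own.

import lmdb
from caffe.proto import caffe_pb2

env = lmdb.open('/home/Prog/caffe/examples/svi/train_lmdb', readonly=True)  # placeholder path
with env.begin() as txn:
    key, value = next(txn.cursor().iternext())
    datum = caffe_pb2.Datum()
    datum.ParseFromString(value)
    # should match the deploy input shape: 1 channel, 60 x 60
    print('%s: %d x %d x %d, label %d' % (key, datum.channels, datum.height, datum.width, datum.label))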

Apart from these causes, why might a caffemodel fail to load when using net = caffe.Net(model_def,model_weights,caffe.TEST) with pycaffe (here from an ipython notebook)?

Edit

After training the model, getting some encouraging test results with good accuracy, and having the snapshot .caffemodel file, the only message I get when executing the following 2 lines is shown below.

model_def = '/home/Prog/caffe/examples/svi/lenet_auto_deploy_2.prototxt'
model_weights = '/home/Prog/caffe/examples/svi/custom_iter_100.caffemodel'

net = caffe.Net(model_def,model_weights,caffe.TEST)

[screenshot of the output]

Edit 2:

Some more information is printed in the terminal window from which I started the jupyter notebook. Here it is, again after running net = caffe.Net(model_def,model_weights,caffe.TEST) as above.

W0927 21:01:04.047416  4685 _caffe.cpp:139] DEPRECATION WARNING - deprecated use of Python interface
W0927 21:01:04.047456  4685 _caffe.cpp:140] Use this instead (with the named "weights" parameter):
W0927 21:01:04.047472  4685 _caffe.cpp:142] Net('/home/Prog/caffe/examples/svi/lenet_auto_deploy_2.prototxt', 1, weights='/home/Prog/caffe/examples/svi/custom_iter_100.caffemodel')
I0927 21:01:04.047893  4685 net.cpp:51] Initializing net from parameters: 
state {
  phase: TEST
  level: 0
}
layer {
  name: "data"
  type: "Input"
  top: "data"
  input_param {
    shape {
      dim: 1
      dim: 1
      dim: 60
      dim: 60
    }
  }
}

[... to save space, only the first and last layers of the LeNet network are shown ...]

layer {
  name: "prob"
  type: "SoftMax"
  bottom: "score"
  top: "prob"
}
I0927 21:01:04.048302  4685 layer_factory.hpp:77] Creating layer data
I0927 21:01:04.048334  4685 net.cpp:84] Creating Layer data
I0927 21:01:04.048352  4685 net.cpp:380] data -> data
I0927 21:01:04.048384  4685 net.cpp:122] Setting up data
I0927 21:01:04.048406  4685 net.cpp:129] Top shape: 1 1 60 60 (3600)
I0927 21:01:04.048418  4685 net.cpp:137] Memory required for data: 14400
I0927 21:01:04.048429  4685 layer_factory.hpp:77] Creating layer conv1
I0927 21:01:04.048451  4685 net.cpp:84] Creating Layer conv1
I0927 21:01:04.048465  4685 net.cpp:406] conv1 <- data
I0927 21:01:04.048480  4685 net.cpp:380] conv1 -> conv1
I0927 21:01:04.048539  4685 net.cpp:122] Setting up conv1
I0927 21:01:04.048558  4685 net.cpp:129] Top shape: 1 20 56 56 (62720)
I0927 21:01:04.048568  4685 net.cpp:137] Memory required for data: 265280
I0927 21:01:04.048591  4685 layer_factory.hpp:77] Creating layer pool1
I0927 21:01:04.048610  4685 net.cpp:84] Creating Layer pool1
I0927 21:01:04.048622  4685 net.cpp:406] pool1 <- conv1
I0927 21:01:04.048637  4685 net.cpp:380] pool1 -> pool1
I0927 21:01:04.048660  4685 net.cpp:122] Setting up pool1
I0927 21:01:04.048676  4685 net.cpp:129] Top shape: 1 20 28 28 (15680)
I0927 21:01:04.048686  4685 net.cpp:137] Memory required for data: 328000
I0927 21:01:04.048696  4685 layer_factory.hpp:77] Creating layer conv2
I0927 21:01:04.048713  4685 net.cpp:84] Creating Layer conv2
I0927 21:01:04.048724  4685 net.cpp:406] conv2 <- pool1
I0927 21:01:04.048738  4685 net.cpp:380] conv2 -> conv2
I0927 21:01:04.049101  4685 net.cpp:122] Setting up conv2
I0927 21:01:04.049123  4685 net.cpp:129] Top shape: 1 50 24 24 (28800)
I0927 21:01:04.049135  4685 net.cpp:137] Memory required for data: 443200
I0927 21:01:04.049156  4685 layer_factory.hpp:77] Creating layer pool2
I0927 21:01:04.049175  4685 net.cpp:84] Creating Layer pool2
I0927 21:01:04.049187  4685 net.cpp:406] pool2 <- conv2
I0927 21:01:04.049201  4685 net.cpp:380] pool2 -> pool2
I0927 21:01:04.049224  4685 net.cpp:122] Setting up pool2
I0927 21:01:04.049242  4685 net.cpp:129] Top shape: 1 50 12 12 (7200)
I0927 21:01:04.049253  4685 net.cpp:137] Memory required for data: 472000
I0927 21:01:04.049264  4685 layer_factory.hpp:77] Creating layer fc1
I0927 21:01:04.049280  4685 net.cpp:84] Creating Layer fc1
I0927 21:01:04.049293  4685 net.cpp:406] fc1 <- pool2
I0927 21:01:04.049309  4685 net.cpp:380] fc1 -> fc1
I0927 21:01:04.096449  4685 net.cpp:122] Setting up fc1
I0927 21:01:04.096500  4685 net.cpp:129] Top shape: 1 500 (500)
I0927 21:01:04.096515  4685 net.cpp:137] Memory required for data: 474000
I0927 21:01:04.096545  4685 layer_factory.hpp:77] Creating layer relu1
I0927 21:01:04.096570  4685 net.cpp:84] Creating Layer relu1
I0927 21:01:04.096585  4685 net.cpp:406] relu1 <- fc1
I0927 21:01:04.096602  4685 net.cpp:367] relu1 -> fc1 (in-place)
I0927 21:01:04.096624  4685 net.cpp:122] Setting up relu1
I0927 21:01:04.096640  4685 net.cpp:129] Top shape: 1 500 (500)
I0927 21:01:04.096652  4685 net.cpp:137] Memory required for data: 476000
I0927 21:01:04.096664  4685 layer_factory.hpp:77] Creating layer score
I0927 21:01:04.096683  4685 net.cpp:84] Creating Layer score
I0927 21:01:04.096694  4685 net.cpp:406] score <- fc1
I0927 21:01:04.096714  4685 net.cpp:380] score -> score
I0927 21:01:04.096935  4685 net.cpp:122] Setting up score
I0927 21:01:04.096953  4685 net.cpp:129] Top shape: 1 26 (26)
I0927 21:01:04.096967  4685 net.cpp:137] Memory required for data: 476104
I0927 21:01:04.096987  4685 layer_factory.hpp:77] Creating layer prob
F0927 21:01:04.097034  4685 layer_factory.hpp:81] Check failed: registry.count(type) == 1 (0 vs. 1) Unknown layer type: SoftMax (known types: AbsVal, Accuracy, ArgMax, BNLL, BatchNorm, BatchReindex, Bias, Concat, ContrastiveLoss, Convolution, Crop, Data, Deconvolution, Dropout, DummyData, ELU, Eltwise, Embed, EuclideanLoss, Exp, Filter, Flatten, HDF5Data, HDF5Output, HingeLoss, Im2col, ImageData, InfogainLoss, InnerProduct, Input, LRN, LSTM, LSTMUnit, Log, MVN, MemoryData, MultinomialLogisticLoss, PReLU, Parameter, Pooling, Power, Python, RNN, ReLU, Reduction, Reshape, SPP, Scale, Sigmoid, SigmoidCrossEntropyLoss, Silence, Slice, Softmax, SoftmaxWithLoss, Split, TanH, Threshold, Tile, WindowData)
*** Check failure stack trace: ***
[I 21:01:04.851 NotebookApp] KernelRestarter: restarting kernel (1/5)
WARNING:root:kernel d4f64b91-60d4-4fed-bf20-6ccce2018c10 restarted