
Unable to convert from Caffe to PyTorch or TensorFlow #916

Open
edmundang1994 opened this issue Feb 24, 2021 · 0 comments
edmundang1994 commented Feb 24, 2021

Platform (like ubuntu 16.04/win10): Ubuntu 20.04.1

Python version: 3.8.5

Source framework with version (like Tensorflow 1.4.1 with GPU): Caffe

Destination framework with version (like CNTK 2.3 with GPU): Pytorch or Tensorflow

Pre-trained model path (webpath or webdisk path):
trafficlightrecognition.zip

Running scripts:

I have tried to convert the model from Caffe to both PyTorch and TensorFlow, and both attempts fail with the same error. The full output is shown below. Please help!

zatch123129@ubuntu:~/Desktop/adversarial-av/MMdnn$ mmconvert -sf caffe -in /home/zatch123129/Desktop/adversarial-av/caffe_models/production/traffic_light_recognition/vertical/deploy.prototxt -iw /home/zatch123129/Desktop/adversarial-av/caffe_models/production/traffic_light_recognition/vertical/baidu_iter_250000.caffemodel -df pytorch -om caffe_baidu_iter_250000.dnn
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0224 00:38:06.458184 37356 net.cpp:58] Initializing net from parameters:
state {
phase: TEST
level: 0
}
layer {
name: "input"
type: "Input"
top: "data_org"
input_param {
shape {
dim: 1
dim: 3
dim: 32
dim: 96
}
}
}
layer {
name: "permute"
type: "Permute"
bottom: "data_org"
top: "data"
permute_param {
order: 0
order: 1
order: 3
order: 2
}
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data"
top: "conv1"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 32
pad: 1
kernel_size: 3
stride: 1
weight_filler {
type: "msra"
}
bias_filler {
type: "constant"
value: 0
}
dilation: 1
}
}
layer {
name: "conv1_bn"
type: "BatchNorm"
bottom: "conv1"
top: "conv1"
batch_norm_param {
use_global_stats: true
}
}
layer {
name: "conv1_bn_scale"
type: "Scale"
bottom: "conv1"
top: "conv1"
scale_param {
axis: 1
num_axes: 1
bias_term: false
}
}
layer {
name: "conv1_relu"
type: "ReLU"
bottom: "conv1"
top: "conv1"
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1"
top: "pool1"
pooling_param {
pool: MAX
kernel_h: 3
kernel_w: 3
stride_h: 2
stride_w: 2
pad_h: 1
pad_w: 1
round_mode: FLOOR
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "pool1"
top: "conv2"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 64
pad: 1
kernel_size: 3
stride: 1
weight_filler {
type: "msra"
}
bias_filler {
type: "constant"
value: 0
}
dilation: 1
}
}
layer {
name: "conv2_bn"
type: "BatchNorm"
bottom: "conv2"
top: "conv2"
batch_norm_param {
use_global_stats: true
}
}
layer {
name: "conv2_bn_scale"
type: "Scale"
bottom: "conv2"
top: "conv2"
scale_param {
axis: 1
num_axes: 1
bias_term: false
}
}
layer {
name: "conv2_relu"
type: "ReLU"
bottom: "conv2"
top: "conv2"
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2"
top: "pool2"
pooling_param {
pool: MAX
kernel_h: 3
kernel_w: 3
stride_h: 2
stride_w: 2
pad_h: 1
pad_w: 1
round_mode: FLOOR
}
}
layer {
name: "conv3"
type: "Convolution"
bottom: "pool2"
top: "conv3"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 128
pad: 1
kernel_size: 3
stride: 1
weight_filler {
type: "msra"
}
bias_filler {
type: "constant"
value: 0
}
dilation: 1
}
}
layer {
name: "conv3_bn"
type: "BatchNorm"
bottom: "conv3"
top: "conv3"
batch_norm_param {
use_global_stats: true
}
}
layer {
name: "conv3_bn_scale"
type: "Scale"
bottom: "conv3"
top: "conv3"
scale_param {
axis: 1
num_axes: 1
bias_term: false
}
}
layer {
name: "conv3_relu"
type: "ReLU"
bottom: "conv3"
top: "conv3"
}
layer {
name: "pool3"
type: "Pooling"
bottom: "conv3"
top: "pool3"
pooling_param {
pool: MAX
kernel_h: 3
kernel_w: 3
stride_h: 2
stride_w: 2
pad_h: 1
pad_w: 1
round_mode: FLOOR
}
}
layer {
name: "conv4"
type: "Convolution"
bottom: "pool3"
top: "conv4"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 128
pad: 1
kernel_size: 3
stride: 1
weight_filler {
type: "msra"
}
bias_filler {
type: "constant"
value: 0
}
dilation: 1
}
}
layer {
name: "conv4_bn"
type: "BatchNorm"
bottom: "conv4"
top: "conv4"
batch_norm_param {
use_global_stats: true
}
}
layer {
name: "conv4_bn_scale"
type: "Scale"
bottom: "conv4"
top: "conv4"
scale_param {
axis: 1
num_axes: 1
bias_term: false
}
}
layer {
name: "conv4_relu"
type: "ReLU"
bottom: "conv4"
top: "conv4"
}
layer {
name: "pool4"
type: "Pooling"
bottom: "conv4"
top: "pool4"
pooling_param {
pool: MAX
kernel_h: 3
kernel_w: 3
stride_h: 2
stride_w: 2
pad_h: 1
pad_w: 1
round_mode: FLOOR
}
}
layer {
name: "conv5"
type: "Convolution"
bottom: "pool4"
top: "conv5"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 128
pad: 1
kernel_size: 3
stride: 1
weight_filler {
type: "msra"
}
bias_filler {
type: "constant"
value: 0
}
dilation: 1
}
}
layer {
name: "conv5_bn"
type: "BatchNorm"
bottom: "conv5"
top: "conv5"
batch_norm_param {
use_global_stats: true
}
}
layer {
name: "conv5_bn_scale"
type: "Scale"
bottom: "conv5"
top: "conv5"
scale_param {
axis: 1
num_axes: 1
bias_term: false
}
}
layer {
name: "conv5_relu"
type: "ReLU"
bottom: "conv5"
top: "conv5"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv5"
top: "pool5"
pooling_param {
pool: AVE
kernel_h: 6
kernel_w: 2
stride_h: 6
stride_w: 2
round_mode: FLOOR
}
}
layer {
name: "ft"
type: "InnerProduct"
bottom: "pool5"
top: "ft"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
inner_product_param {
num_output: 128
weight_filler {
type: "msra"
}
bias_filler {
type: "constant"
value: 0
}
}
}
layer {
name: "ft_bn"
type: "BatchNorm"
bottom: "ft"
top: "ft"
batch_norm_param {
use_global_stats: true
}
}
layer {
name: "ft_bn_scale"
type: "Scale"
bottom: "ft"
top: "ft"
scale_param {
axis: 1
num_axes: 1
bias_term: false
}
}
layer {
name: "ft_relu"
type: "ReLU"
bottom: "ft"
top: "ft"
}
layer {
name: "logits"
type: "InnerProduct"
bottom: "ft"
top: "logits"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
inner_product_param {
num_output: 4
weight_filler {
type: "msra"
}
bias_filler {
type: "constant"
value: 0
}
}
}
layer {
name: "prob"
type: "Softmax"
bottom: "logits"
top: "prob"
}
I0224 00:38:06.458712 37356 layer_factory.hpp:77] Creating layer input
I0224 00:38:06.458739 37356 net.cpp:100] Creating Layer input
I0224 00:38:06.458748 37356 net.cpp:408] input -> data_org
I0224 00:38:06.458777 37356 net.cpp:150] Setting up input
I0224 00:38:06.458786 37356 net.cpp:157] Top shape: 1 3 32 96 (9216)
I0224 00:38:06.458796 37356 net.cpp:165] Memory required for data: 36864
I0224 00:38:06.458803 37356 layer_factory.hpp:77] Creating layer permute
I0224 00:38:06.458819 37356 net.cpp:100] Creating Layer permute
I0224 00:38:06.458827 37356 net.cpp:434] permute <- data_org
I0224 00:38:06.458835 37356 net.cpp:408] permute -> data
I0224 00:38:06.458853 37356 net.cpp:150] Setting up permute
I0224 00:38:06.458861 37356 net.cpp:157] Top shape: 1 3 96 32 (9216)
I0224 00:38:06.458870 37356 net.cpp:165] Memory required for data: 73728
I0224 00:38:06.458878 37356 layer_factory.hpp:77] Creating layer conv1
I0224 00:38:06.458889 37356 net.cpp:100] Creating Layer conv1
I0224 00:38:06.458896 37356 net.cpp:434] conv1 <- data
I0224 00:38:06.458905 37356 net.cpp:408] conv1 -> conv1
I0224 00:38:06.459002 37356 net.cpp:150] Setting up conv1
I0224 00:38:06.459033 37356 net.cpp:157] Top shape: 1 32 96 32 (98304)
I0224 00:38:06.459044 37356 net.cpp:165] Memory required for data: 466944
I0224 00:38:06.459056 37356 layer_factory.hpp:77] Creating layer conv1_bn
I0224 00:38:06.459086 37356 net.cpp:100] Creating Layer conv1_bn
I0224 00:38:06.459098 37356 net.cpp:434] conv1_bn <- conv1
I0224 00:38:06.459107 37356 net.cpp:395] conv1_bn -> conv1 (in-place)
I0224 00:38:06.459129 37356 net.cpp:150] Setting up conv1_bn
I0224 00:38:06.459137 37356 net.cpp:157] Top shape: 1 32 96 32 (98304)
I0224 00:38:06.459146 37356 net.cpp:165] Memory required for data: 860160
I0224 00:38:06.459156 37356 layer_factory.hpp:77] Creating layer conv1_bn_scale
I0224 00:38:06.459172 37356 net.cpp:100] Creating Layer conv1_bn_scale
I0224 00:38:06.459179 37356 net.cpp:434] conv1_bn_scale <- conv1
I0224 00:38:06.459187 37356 net.cpp:395] conv1_bn_scale -> conv1 (in-place)
I0224 00:38:06.459218 37356 net.cpp:150] Setting up conv1_bn_scale
I0224 00:38:06.459226 37356 net.cpp:157] Top shape: 1 32 96 32 (98304)
I0224 00:38:06.459234 37356 net.cpp:165] Memory required for data: 1253376
I0224 00:38:06.459242 37356 layer_factory.hpp:77] Creating layer conv1_relu
I0224 00:38:06.459251 37356 net.cpp:100] Creating Layer conv1_relu
I0224 00:38:06.459259 37356 net.cpp:434] conv1_relu <- conv1
I0224 00:38:06.459265 37356 net.cpp:395] conv1_relu -> conv1 (in-place)
I0224 00:38:06.459278 37356 net.cpp:150] Setting up conv1_relu
I0224 00:38:06.459285 37356 net.cpp:157] Top shape: 1 32 96 32 (98304)
I0224 00:38:06.459293 37356 net.cpp:165] Memory required for data: 1646592
I0224 00:38:06.459300 37356 layer_factory.hpp:77] Creating layer pool1
I0224 00:38:06.459308 37356 net.cpp:100] Creating Layer pool1
I0224 00:38:06.459316 37356 net.cpp:434] pool1 <- conv1
I0224 00:38:06.459323 37356 net.cpp:408] pool1 -> pool1
I0224 00:38:06.459334 37356 net.cpp:150] Setting up pool1
I0224 00:38:06.459342 37356 net.cpp:157] Top shape: 1 32 48 16 (24576)
I0224 00:38:06.459350 37356 net.cpp:165] Memory required for data: 1744896
I0224 00:38:06.459357 37356 layer_factory.hpp:77] Creating layer conv2
I0224 00:38:06.459367 37356 net.cpp:100] Creating Layer conv2
I0224 00:38:06.459374 37356 net.cpp:434] conv2 <- pool1
I0224 00:38:06.459383 37356 net.cpp:408] conv2 -> conv2
I0224 00:38:06.459568 37356 net.cpp:150] Setting up conv2
I0224 00:38:06.459595 37356 net.cpp:157] Top shape: 1 64 48 16 (49152)
I0224 00:38:06.459606 37356 net.cpp:165] Memory required for data: 1941504
I0224 00:38:06.459615 37356 layer_factory.hpp:77] Creating layer conv2_bn
I0224 00:38:06.459627 37356 net.cpp:100] Creating Layer conv2_bn
I0224 00:38:06.459635 37356 net.cpp:434] conv2_bn <- conv2
I0224 00:38:06.459643 37356 net.cpp:395] conv2_bn -> conv2 (in-place)
I0224 00:38:06.459662 37356 net.cpp:150] Setting up conv2_bn
I0224 00:38:06.459671 37356 net.cpp:157] Top shape: 1 64 48 16 (49152)
I0224 00:38:06.459679 37356 net.cpp:165] Memory required for data: 2138112
I0224 00:38:06.459689 37356 layer_factory.hpp:77] Creating layer conv2_bn_scale
I0224 00:38:06.459699 37356 net.cpp:100] Creating Layer conv2_bn_scale
I0224 00:38:06.459707 37356 net.cpp:434] conv2_bn_scale <- conv2
I0224 00:38:06.459714 37356 net.cpp:395] conv2_bn_scale -> conv2 (in-place)
I0224 00:38:06.459729 37356 net.cpp:150] Setting up conv2_bn_scale
I0224 00:38:06.459736 37356 net.cpp:157] Top shape: 1 64 48 16 (49152)
I0224 00:38:06.459744 37356 net.cpp:165] Memory required for data: 2334720
I0224 00:38:06.459751 37356 layer_factory.hpp:77] Creating layer conv2_relu
I0224 00:38:06.459761 37356 net.cpp:100] Creating Layer conv2_relu
I0224 00:38:06.459769 37356 net.cpp:434] conv2_relu <- conv2
I0224 00:38:06.459776 37356 net.cpp:395] conv2_relu -> conv2 (in-place)
I0224 00:38:06.459784 37356 net.cpp:150] Setting up conv2_relu
I0224 00:38:06.459791 37356 net.cpp:157] Top shape: 1 64 48 16 (49152)
I0224 00:38:06.459798 37356 net.cpp:165] Memory required for data: 2531328
I0224 00:38:06.459805 37356 layer_factory.hpp:77] Creating layer pool2
I0224 00:38:06.459813 37356 net.cpp:100] Creating Layer pool2
I0224 00:38:06.459820 37356 net.cpp:434] pool2 <- conv2
I0224 00:38:06.459827 37356 net.cpp:408] pool2 -> pool2
I0224 00:38:06.459837 37356 net.cpp:150] Setting up pool2
I0224 00:38:06.459844 37356 net.cpp:157] Top shape: 1 64 24 8 (12288)
I0224 00:38:06.459852 37356 net.cpp:165] Memory required for data: 2580480
I0224 00:38:06.459858 37356 layer_factory.hpp:77] Creating layer conv3
I0224 00:38:06.459868 37356 net.cpp:100] Creating Layer conv3
I0224 00:38:06.459875 37356 net.cpp:434] conv3 <- pool2
I0224 00:38:06.459884 37356 net.cpp:408] conv3 -> conv3
I0224 00:38:06.460860 37356 net.cpp:150] Setting up conv3
I0224 00:38:06.460899 37356 net.cpp:157] Top shape: 1 128 24 8 (24576)
I0224 00:38:06.460918 37356 net.cpp:165] Memory required for data: 2678784
I0224 00:38:06.460937 37356 layer_factory.hpp:77] Creating layer conv3_bn
I0224 00:38:06.460955 37356 net.cpp:100] Creating Layer conv3_bn
I0224 00:38:06.460968 37356 net.cpp:434] conv3_bn <- conv3
I0224 00:38:06.460983 37356 net.cpp:395] conv3_bn -> conv3 (in-place)
I0224 00:38:06.461019 37356 net.cpp:150] Setting up conv3_bn
I0224 00:38:06.461033 37356 net.cpp:157] Top shape: 1 128 24 8 (24576)
I0224 00:38:06.461048 37356 net.cpp:165] Memory required for data: 2777088
I0224 00:38:06.461079 37356 layer_factory.hpp:77] Creating layer conv3_bn_scale
I0224 00:38:06.461129 37356 net.cpp:100] Creating Layer conv3_bn_scale
I0224 00:38:06.461151 37356 net.cpp:434] conv3_bn_scale <- conv3
I0224 00:38:06.461174 37356 net.cpp:395] conv3_bn_scale -> conv3 (in-place)
I0224 00:38:06.461195 37356 net.cpp:150] Setting up conv3_bn_scale
I0224 00:38:06.461203 37356 net.cpp:157] Top shape: 1 128 24 8 (24576)
I0224 00:38:06.461213 37356 net.cpp:165] Memory required for data: 2875392
I0224 00:38:06.461221 37356 layer_factory.hpp:77] Creating layer conv3_relu
I0224 00:38:06.461236 37356 net.cpp:100] Creating Layer conv3_relu
I0224 00:38:06.461243 37356 net.cpp:434] conv3_relu <- conv3
I0224 00:38:06.461251 37356 net.cpp:395] conv3_relu -> conv3 (in-place)
I0224 00:38:06.461261 37356 net.cpp:150] Setting up conv3_relu
I0224 00:38:06.461268 37356 net.cpp:157] Top shape: 1 128 24 8 (24576)
I0224 00:38:06.461277 37356 net.cpp:165] Memory required for data: 2973696
I0224 00:38:06.461283 37356 layer_factory.hpp:77] Creating layer pool3
I0224 00:38:06.461293 37356 net.cpp:100] Creating Layer pool3
I0224 00:38:06.461302 37356 net.cpp:434] pool3 <- conv3
I0224 00:38:06.461310 37356 net.cpp:408] pool3 -> pool3
I0224 00:38:06.461323 37356 net.cpp:150] Setting up pool3
I0224 00:38:06.461331 37356 net.cpp:157] Top shape: 1 128 12 4 (6144)
I0224 00:38:06.461339 37356 net.cpp:165] Memory required for data: 2998272
I0224 00:38:06.461347 37356 layer_factory.hpp:77] Creating layer conv4
I0224 00:38:06.461362 37356 net.cpp:100] Creating Layer conv4
I0224 00:38:06.461371 37356 net.cpp:434] conv4 <- pool3
I0224 00:38:06.461380 37356 net.cpp:408] conv4 -> conv4
I0224 00:38:06.463476 37356 net.cpp:150] Setting up conv4
I0224 00:38:06.463543 37356 net.cpp:157] Top shape: 1 128 12 4 (6144)
I0224 00:38:06.463569 37356 net.cpp:165] Memory required for data: 3022848
I0224 00:38:06.463596 37356 layer_factory.hpp:77] Creating layer conv4_bn
I0224 00:38:06.463631 37356 net.cpp:100] Creating Layer conv4_bn
I0224 00:38:06.463650 37356 net.cpp:434] conv4_bn <- conv4
I0224 00:38:06.463671 37356 net.cpp:395] conv4_bn -> conv4 (in-place)
I0224 00:38:06.463719 37356 net.cpp:150] Setting up conv4_bn
I0224 00:38:06.463735 37356 net.cpp:157] Top shape: 1 128 12 4 (6144)
I0224 00:38:06.463754 37356 net.cpp:165] Memory required for data: 3047424
I0224 00:38:06.463774 37356 layer_factory.hpp:77] Creating layer conv4_bn_scale
I0224 00:38:06.463801 37356 net.cpp:100] Creating Layer conv4_bn_scale
I0224 00:38:06.463819 37356 net.cpp:434] conv4_bn_scale <- conv4
I0224 00:38:06.463838 37356 net.cpp:395] conv4_bn_scale -> conv4 (in-place)
I0224 00:38:06.463871 37356 net.cpp:150] Setting up conv4_bn_scale
I0224 00:38:06.463887 37356 net.cpp:157] Top shape: 1 128 12 4 (6144)
I0224 00:38:06.463907 37356 net.cpp:165] Memory required for data: 3072000
I0224 00:38:06.463925 37356 layer_factory.hpp:77] Creating layer conv4_relu
I0224 00:38:06.463948 37356 net.cpp:100] Creating Layer conv4_relu
I0224 00:38:06.463965 37356 net.cpp:434] conv4_relu <- conv4
I0224 00:38:06.463984 37356 net.cpp:395] conv4_relu -> conv4 (in-place)
I0224 00:38:06.464004 37356 net.cpp:150] Setting up conv4_relu
I0224 00:38:06.464020 37356 net.cpp:157] Top shape: 1 128 12 4 (6144)
I0224 00:38:06.464038 37356 net.cpp:165] Memory required for data: 3096576
I0224 00:38:06.464058 37356 layer_factory.hpp:77] Creating layer pool4
I0224 00:38:06.464078 37356 net.cpp:100] Creating Layer pool4
I0224 00:38:06.464145 37356 net.cpp:434] pool4 <- conv4
I0224 00:38:06.464185 37356 net.cpp:408] pool4 -> pool4
I0224 00:38:06.464308 37356 net.cpp:150] Setting up pool4
I0224 00:38:06.464329 37356 net.cpp:157] Top shape: 1 128 6 2 (1536)
I0224 00:38:06.464339 37356 net.cpp:165] Memory required for data: 3102720
I0224 00:38:06.464347 37356 layer_factory.hpp:77] Creating layer conv5
I0224 00:38:06.464388 37356 net.cpp:100] Creating Layer conv5
I0224 00:38:06.464397 37356 net.cpp:434] conv5 <- pool4
I0224 00:38:06.464408 37356 net.cpp:408] conv5 -> conv5
I0224 00:38:06.465965 37356 net.cpp:150] Setting up conv5
I0224 00:38:06.465982 37356 net.cpp:157] Top shape: 1 128 6 2 (1536)
I0224 00:38:06.465991 37356 net.cpp:165] Memory required for data: 3108864
I0224 00:38:06.466001 37356 layer_factory.hpp:77] Creating layer conv5_bn
I0224 00:38:06.466014 37356 net.cpp:100] Creating Layer conv5_bn
I0224 00:38:06.466022 37356 net.cpp:434] conv5_bn <- conv5
I0224 00:38:06.466030 37356 net.cpp:395] conv5_bn -> conv5 (in-place)
I0224 00:38:06.466050 37356 net.cpp:150] Setting up conv5_bn
I0224 00:38:06.466058 37356 net.cpp:157] Top shape: 1 128 6 2 (1536)
I0224 00:38:06.466065 37356 net.cpp:165] Memory required for data: 3115008
I0224 00:38:06.466074 37356 layer_factory.hpp:77] Creating layer conv5_bn_scale
I0224 00:38:06.466131 37356 net.cpp:100] Creating Layer conv5_bn_scale
I0224 00:38:06.466145 37356 net.cpp:434] conv5_bn_scale <- conv5
I0224 00:38:06.466154 37356 net.cpp:395] conv5_bn_scale -> conv5 (in-place)
I0224 00:38:06.466171 37356 net.cpp:150] Setting up conv5_bn_scale
I0224 00:38:06.466178 37356 net.cpp:157] Top shape: 1 128 6 2 (1536)
I0224 00:38:06.466187 37356 net.cpp:165] Memory required for data: 3121152
I0224 00:38:06.466194 37356 layer_factory.hpp:77] Creating layer conv5_relu
I0224 00:38:06.466205 37356 net.cpp:100] Creating Layer conv5_relu
I0224 00:38:06.466212 37356 net.cpp:434] conv5_relu <- conv5
I0224 00:38:06.466219 37356 net.cpp:395] conv5_relu -> conv5 (in-place)
I0224 00:38:06.466228 37356 net.cpp:150] Setting up conv5_relu
I0224 00:38:06.466234 37356 net.cpp:157] Top shape: 1 128 6 2 (1536)
I0224 00:38:06.466243 37356 net.cpp:165] Memory required for data: 3127296
I0224 00:38:06.466248 37356 layer_factory.hpp:77] Creating layer pool5
I0224 00:38:06.466256 37356 net.cpp:100] Creating Layer pool5
I0224 00:38:06.466264 37356 net.cpp:434] pool5 <- conv5
I0224 00:38:06.466271 37356 net.cpp:408] pool5 -> pool5
I0224 00:38:06.466281 37356 net.cpp:150] Setting up pool5
I0224 00:38:06.466289 37356 net.cpp:157] Top shape: 1 128 1 1 (128)
I0224 00:38:06.466296 37356 net.cpp:165] Memory required for data: 3127808
I0224 00:38:06.466302 37356 layer_factory.hpp:77] Creating layer ft
I0224 00:38:06.466318 37356 net.cpp:100] Creating Layer ft
I0224 00:38:06.466326 37356 net.cpp:434] ft <- pool5
I0224 00:38:06.466334 37356 net.cpp:408] ft -> ft
I0224 00:38:06.466523 37356 net.cpp:150] Setting up ft
I0224 00:38:06.466532 37356 net.cpp:157] Top shape: 1 128 (128)
I0224 00:38:06.466539 37356 net.cpp:165] Memory required for data: 3128320
I0224 00:38:06.466548 37356 layer_factory.hpp:77] Creating layer ft_bn
I0224 00:38:06.466559 37356 net.cpp:100] Creating Layer ft_bn
I0224 00:38:06.466567 37356 net.cpp:434] ft_bn <- ft
I0224 00:38:06.466575 37356 net.cpp:395] ft_bn -> ft (in-place)
I0224 00:38:06.466594 37356 net.cpp:150] Setting up ft_bn
I0224 00:38:06.466601 37356 net.cpp:157] Top shape: 1 128 (128)
I0224 00:38:06.466609 37356 net.cpp:165] Memory required for data: 3128832
I0224 00:38:06.466622 37356 layer_factory.hpp:77] Creating layer ft_bn_scale
I0224 00:38:06.466634 37356 net.cpp:100] Creating Layer ft_bn_scale
I0224 00:38:06.466641 37356 net.cpp:434] ft_bn_scale <- ft
I0224 00:38:06.466650 37356 net.cpp:395] ft_bn_scale -> ft (in-place)
I0224 00:38:06.466662 37356 net.cpp:150] Setting up ft_bn_scale
I0224 00:38:06.466670 37356 net.cpp:157] Top shape: 1 128 (128)
I0224 00:38:06.466677 37356 net.cpp:165] Memory required for data: 3129344
I0224 00:38:06.466684 37356 layer_factory.hpp:77] Creating layer ft_relu
I0224 00:38:06.466693 37356 net.cpp:100] Creating Layer ft_relu
I0224 00:38:06.466701 37356 net.cpp:434] ft_relu <- ft
I0224 00:38:06.466708 37356 net.cpp:395] ft_relu -> ft (in-place)
I0224 00:38:06.466717 37356 net.cpp:150] Setting up ft_relu
I0224 00:38:06.466723 37356 net.cpp:157] Top shape: 1 128 (128)
I0224 00:38:06.466730 37356 net.cpp:165] Memory required for data: 3129856
I0224 00:38:06.466737 37356 layer_factory.hpp:77] Creating layer logits
I0224 00:38:06.466747 37356 net.cpp:100] Creating Layer logits
I0224 00:38:06.466754 37356 net.cpp:434] logits <- ft
I0224 00:38:06.466763 37356 net.cpp:408] logits -> logits
I0224 00:38:06.466784 37356 net.cpp:150] Setting up logits
I0224 00:38:06.466791 37356 net.cpp:157] Top shape: 1 4 (4)
I0224 00:38:06.466799 37356 net.cpp:165] Memory required for data: 3129872
I0224 00:38:06.466807 37356 layer_factory.hpp:77] Creating layer prob
I0224 00:38:06.466820 37356 net.cpp:100] Creating Layer prob
I0224 00:38:06.466827 37356 net.cpp:434] prob <- logits
I0224 00:38:06.466835 37356 net.cpp:408] prob -> prob
I0224 00:38:06.466847 37356 net.cpp:150] Setting up prob
I0224 00:38:06.466854 37356 net.cpp:157] Top shape: 1 4 (4)
I0224 00:38:06.466861 37356 net.cpp:165] Memory required for data: 3129888
I0224 00:38:06.466868 37356 net.cpp:228] prob does not need backward computation.
I0224 00:38:06.466878 37356 net.cpp:228] logits does not need backward computation.
I0224 00:38:06.466886 37356 net.cpp:228] ft_relu does not need backward computation.
I0224 00:38:06.466892 37356 net.cpp:228] ft_bn_scale does not need backward computation.
I0224 00:38:06.466898 37356 net.cpp:228] ft_bn does not need backward computation.
I0224 00:38:06.466905 37356 net.cpp:228] ft does not need backward computation.
I0224 00:38:06.466913 37356 net.cpp:228] pool5 does not need backward computation.
I0224 00:38:06.466920 37356 net.cpp:228] conv5_relu does not need backward computation.
I0224 00:38:06.466926 37356 net.cpp:228] conv5_bn_scale does not need backward computation.
I0224 00:38:06.466933 37356 net.cpp:228] conv5_bn does not need backward computation.
I0224 00:38:06.466940 37356 net.cpp:228] conv5 does not need backward computation.
I0224 00:38:06.466948 37356 net.cpp:228] pool4 does not need backward computation.
I0224 00:38:06.466955 37356 net.cpp:228] conv4_relu does not need backward computation.
I0224 00:38:06.466962 37356 net.cpp:228] conv4_bn_scale does not need backward computation.
I0224 00:38:06.466969 37356 net.cpp:228] conv4_bn does not need backward computation.
I0224 00:38:06.466976 37356 net.cpp:228] conv4 does not need backward computation.
I0224 00:38:06.466984 37356 net.cpp:228] pool3 does not need backward computation.
I0224 00:38:06.466991 37356 net.cpp:228] conv3_relu does not need backward computation.
I0224 00:38:06.466997 37356 net.cpp:228] conv3_bn_scale does not need backward computation.
I0224 00:38:06.467005 37356 net.cpp:228] conv3_bn does not need backward computation.
I0224 00:38:06.467011 37356 net.cpp:228] conv3 does not need backward computation.
I0224 00:38:06.467020 37356 net.cpp:228] pool2 does not need backward computation.
I0224 00:38:06.467026 37356 net.cpp:228] conv2_relu does not need backward computation.
I0224 00:38:06.467033 37356 net.cpp:228] conv2_bn_scale does not need backward computation.
I0224 00:38:06.467041 37356 net.cpp:228] conv2_bn does not need backward computation.
I0224 00:38:06.467047 37356 net.cpp:228] conv2 does not need backward computation.
I0224 00:38:06.467054 37356 net.cpp:228] pool1 does not need backward computation.
I0224 00:38:06.467062 37356 net.cpp:228] conv1_relu does not need backward computation.
I0224 00:38:06.467069 37356 net.cpp:228] conv1_bn_scale does not need backward computation.
I0224 00:38:06.467092 37356 net.cpp:228] conv1_bn does not need backward computation.
I0224 00:38:06.467106 37356 net.cpp:228] conv1 does not need backward computation.
I0224 00:38:06.467114 37356 net.cpp:228] permute does not need backward computation.
I0224 00:38:06.467121 37356 net.cpp:228] input does not need backward computation.
I0224 00:38:06.467128 37356 net.cpp:270] This network produces output prob
I0224 00:38:06.467149 37356 net.cpp:283] Network initialization done.
Traceback (most recent call last):
File "/home/zatch123129/.local/bin/mmconvert", line 8, in
sys.exit(_main())
File "/home/zatch123129/Desktop/adversarial-av/MMdnn/mmdnn/conversion/_script/convert.py", line 102, in _main
ret = convertToIR._convert(ir_args)
File "/home/zatch123129/Desktop/adversarial-av/MMdnn/mmdnn/conversion/_script/convertToIR.py", line 16, in _convert
transformer = CaffeTransformer(args.network, args.weights, "tensorflow", inputshape[0], phase = args.caffePhase)
File "/home/zatch123129/Desktop/adversarial-av/MMdnn/mmdnn/conversion/caffe/transformer.py", line 325, in init
graph = GraphBuilder(def_path, self.input_shape, self.is_train_proto, phase).build()
File "/home/zatch123129/Desktop/adversarial-av/MMdnn/mmdnn/conversion/caffe/graph.py", line 454, in build
graph.compute_output_shapes(self.model)
File "/home/zatch123129/Desktop/adversarial-av/MMdnn/mmdnn/conversion/caffe/graph.py", line 274, in compute_output_shapes
node.output_shape = TensorShape(*NodeKind.compute_output_shape(node))
File "/home/zatch123129/Desktop/adversarial-av/MMdnn/mmdnn/conversion/caffe/graph.py", line 133, in compute_output_shape
return LAYER_DESCRIPTORS[node.kind]
KeyError: None

I traced the error to graph.py: it seems NodeKind is unable to recognize one of the layer types, so node.kind ends up as None and the LAYER_DESCRIPTORS lookup raises KeyError. Is anyone able to help with this?
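One way to narrow this down might be to check which layer types in the prototxt are missing from MMdnn's Caffe descriptor table. Below is a minimal diagnostic sketch, not part of MMdnn: it assumes a local Caffe installation providing caffe_pb2, and that LAYER_DESCRIPTORS in mmdnn/conversion/caffe/graph.py is a dict keyed by layer-type name, as the traceback suggests.

```python
# Hypothetical diagnostic: list prototxt layer types that MMdnn's Caffe
# LAYER_DESCRIPTORS table does not know about. Assumes a local Caffe
# install (for caffe_pb2) and that LAYER_DESCRIPTORS is keyed by
# layer-type name, as the KeyError traceback suggests.
from google.protobuf import text_format
from caffe.proto import caffe_pb2
from mmdnn.conversion.caffe.graph import LAYER_DESCRIPTORS

net = caffe_pb2.NetParameter()
with open('deploy.prototxt') as f:  # path to the model's prototxt
    text_format.Merge(f.read(), net)

for layer in net.layer:
    if layer.type not in LAYER_DESCRIPTORS:
        print('Unsupported layer: %s (type: %s)' % (layer.name, layer.type))
```

If Permute shows up in that list, that would be consistent with the failure here: as far as I can tell, the Permute layer comes from the SSD fork of Caffe rather than mainline BVLC Caffe, so an unrecognized type would map to a None node kind, matching the KeyError: None above.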
