cp-decomposition's Issues
CP decomposition on AlexNet

Hello!
I'm running the code to decompose conv2 of AlexNet, but when I test it with
D:\Softwares\caffe\build\tools\Release\caffe.exe test --model AlexNet_accelerated.prototxt -weights AlexNet_accelerated.caffemodel
the accuracy is 0, with no error or warning. I changed some code in config_processing.py as below (the line I changed is marked with a comment).
def accelerate_model(model, layer_to_decompose, rank):
    k = layer_to_decompose
    r = rank
    new_model = caffe.proto.caffe_pb2.NetParameter()
    for i in range(k):
        new_model.layer.extend([model.layer[i]])
    decomposed_layer = model.layer[k]
    if decomposed_layer.type != 'Convolution':
        raise AttributeError('only convolution layer can be decomposed')
    param = decomposed_layer.convolution_param
    if not hasattr(param, 'pad'):
        param.pad = [0]
    if param.pad == []:
        param.pad.append(0)
    if not hasattr(param, 'stride'):
        param.stride = [1]
    if param.stride == []:
        param.stride.append(1)
    new_model.layer.extend([conv_layer(1, 1, r, 2)])  # here I changed
    new_model.layer.extend([conv_layer(param.kernel_size[0], 1, r, r, pad_h=param.pad[0], stride_h=param.stride[0])])
    new_model.layer.extend([conv_layer(1, param.kernel_size[0], r, r, pad_w=param.pad[0], stride_w=param.stride[0])])
    new_model.layer.extend([conv_layer(1, 1, param.num_output)])
    name = decomposed_layer.name
    for i in range(4):
        new_model.layer[k + i].name = name + '-' + str(i + 1)
        new_model.layer[k + i].bottom.extend([name + '-' + str(i)])
        new_model.layer[k + i].top.extend([name + '-' + str(i + 1)])
    new_model.layer[k].bottom[0] = model.layer[k].bottom[0]
    new_model.layer[k + 3].top[0] = model.layer[k].top[0]
    for i in range(k + 1, len(model.layer)):
        new_model.layer.extend([model.layer[i]])
    return new_model
Below is the prototxt of AlexNet. I also tried decomposing conv3, which has group: 1, and that works fine (conv2 has group: 2), so I suspect my change is wrong. What should I change to decompose conv2 correctly? Thank you!
name: "AlexNet"
layer {
name: "data"
type: "Data"
top: "data"
top: "label"
include {
phase: TRAIN
}
transform_param {
mirror: true
crop_size: 227
mean_file: "D:/study/opensource/ILSVRC2012/ilsvrc12/imagenet_mean.binaryproto"
}
data_param {
source: "examples/imagenet/ilsvrc12_train_lmdb"
batch_size: 256
backend: LMDB
}
}
layer {
name: "data"
type: "Data"
top: "data"
top: "label"
include {
phase: TEST
}
transform_param {
mirror: false
crop_size: 227
mean_file: "D:/study/opensource/ILSVRC2012/ilsvrc12/imagenet_mean.binaryproto"
}
data_param {
source: "D:/study/opensource/ILSVRC2012/ilsvrc12_val_lmdb"
batch_size: 50
backend: LMDB
}
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data"
top: "conv1"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 96
kernel_size: 11
stride: 4
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0
}
}
}
layer {
name: "relu1"
type: "ReLU"
bottom: "conv1"
top: "conv1"
}
layer {
name: "norm1"
type: "LRN"
bottom: "conv1"
top: "norm1"
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "pool1"
type: "Pooling"
bottom: "norm1"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "pool1"
top: "conv2"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 256
pad: 2
kernel_size: 5
group: 2
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu2"
type: "ReLU"
bottom: "conv2"
top: "conv2"
}
layer {
name: "norm2"
type: "LRN"
bottom: "conv2"
top: "norm2"
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "pool2"
type: "Pooling"
bottom: "norm2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "conv3"
type: "Convolution"
bottom: "pool2"
top: "conv3"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 384
pad: 1
kernel_size: 3
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0
}
}
}
layer {
name: "relu3"
type: "ReLU"
bottom: "conv3"
top: "conv3"
}
layer {
name: "conv4"
type: "Convolution"
bottom: "conv3"
top: "conv4"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 384
pad: 1
kernel_size: 3
group: 2
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu4"
type: "ReLU"
bottom: "conv4"
top: "conv4"
}
layer {
name: "conv5"
type: "Convolution"
bottom: "conv4"
top: "conv5"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
group: 2
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu5"
type: "ReLU"
bottom: "conv5"
top: "conv5"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv5"
top: "pool5"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "fc6"
type: "InnerProduct"
bottom: "pool5"
top: "fc6"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
inner_product_param {
num_output: 4096
weight_filler {
type: "gaussian"
std: 0.005
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "fc6"
top: "fc6"
}
layer {
name: "drop6"
type: "Dropout"
bottom: "fc6"
top: "fc6"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc7"
type: "InnerProduct"
bottom: "fc6"
top: "fc7"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
inner_product_param {
num_output: 4096
weight_filler {
type: "gaussian"
std: 0.005
}
bias_filler {
type: "constant"
value: 0.1
}
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "fc7"
top: "fc7"
}
layer {
name: "drop7"
type: "Dropout"
bottom: "fc7"
top: "fc7"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc8"
type: "InnerProduct"
bottom: "fc7"
top: "fc8"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
inner_product_param {
num_output: 1000
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0
}
}
}
layer {
name: "accuracy"
type: "Accuracy"
bottom: "fc8"
bottom: "label"
top: "accuracy"
include {
phase: TEST
}
}
layer {
name: "loss"
type: "SoftmaxWithLoss"
bottom: "fc8"
bottom: "label"
top: "loss"
}
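For reference, the CP (canonical polyadic) scheme replaces one K×K convolution with four smaller ones: a 1×1 conv mapping the input channels to rank R, a K×1 and a 1×K grouped conv operating per rank component, and a final 1×1 conv producing the original outputs. The sketch below is a minimal NumPy illustration of the factor shapes involved; the values are illustrative assumptions (R = 16, and the per-group shape for conv2, which with group: 2 sees 48 of the 96 input channels and produces 128 of the 256 outputs). One plausible cause of the zero accuracy is that the stored conv2 weight blob stacks both groups, so decomposing it as a single 4-way tensor mixes input channels that the grouped convolution never actually connects.

```python
import numpy as np

# Illustrative shapes for ONE group of AlexNet conv2 (group: 2):
# that group's kernel tensor is (out=128, in=48, kh=5, kw=5).
n_out, n_in, kh, kw = 128, 48, 5, 5
R = 16  # decomposition rank (assumed value, for illustration only)

# Rank-R CP factors of the 4-way kernel tensor W[o, i, h, w]:
#   W ~= sum_r c_out[o, r] * c_in[i, r] * c_h[h, r] * c_w[w, r]
# Each factor matrix becomes the weights of one of the four conv layers.
rng = np.random.default_rng(0)
c_out = rng.standard_normal((n_out, R))  # final 1x1 conv:  R -> n_out
c_in  = rng.standard_normal((n_in, R))   # first 1x1 conv:  n_in -> R
c_h   = rng.standard_normal((kh, R))     # Kx1 conv, group=R
c_w   = rng.standard_normal((kw, R))     # 1xK conv, group=R

# Reconstructing the full kernel tensor confirms the shapes line up.
W = np.einsum('or,ir,hr,wr->oihw', c_out, c_in, c_h, c_w)
assert W.shape == (n_out, n_in, kh, kw)
```

With group: 2, each group would need its own CP decomposition (or the decomposed layers would need matching group settings throughout), which adding group=2 to only the first 1×1 layer does not capture.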
IndexError: Index out of range
I ran main.py, but it failed with the error below. Could anyone take a look? Thank you!
Traceback (most recent call last):
File "./lenet/main.py", line 17, in <module>
prepare_models(LAYER, R, NET_PATH, NET_NAME, INPUT_DIM)
File "/home/yizhou/test/cp-decomposition/config_processing.py", line 107, in prepare_models
w = net.layers[l].blobs[0].data
IndexError: Index out of range
The log output before the error is shown below.
WARNING: Logging before InitGoogleLogging() is written to STDERR
W0228 22:27:30.239754 19244 _caffe.cpp:135] DEPRECATION WARNING - deprecated use of Python interface
W0228 22:27:30.239804 19244 _caffe.cpp:136] Use this instead (with the named "weights" parameter):
W0228 22:27:30.239809 19244 _caffe.cpp:138] Net('lenet/lenet_deploy.prototxt', 1, weights='lenet/lenet.caffemodel')
I0228 22:27:30.242651 19244 upgrade_proto.cpp:67] Attempting to upgrade input file specified using deprecated input fields: lenet/lenet_deploy.prototxt
I0228 22:27:30.242672 19244 upgrade_proto.cpp:70] Successfully upgraded file specified using deprecated input fields.
W0228 22:27:30.242678 19244 upgrade_proto.cpp:72] Note that future Caffe releases will only support input layers and not input fields.
I0228 22:27:31.254855 19244 net.cpp:51] Initializing net from parameters:
state {
phase: TEST
level: 0
}
layer {
name: "input"
type: "Input"
top: "data"
input_param {
shape {
dim: 64
dim: 1
dim: 28
dim: 28
}
}
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data"
top: "conv1"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 20
pad: 0
kernel_size: 5
stride: 1
weight_filler {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "pool1"
top: "conv2"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 50
kernel_size: 5
stride: 1
weight_filler {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "ip1"
type: "InnerProduct"
bottom: "pool2"
top: "ip1"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
inner_product_param {
num_output: 500
weight_filler {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "relu1"
type: "ReLU"
bottom: "ip1"
top: "ip1"
}
layer {
name: "ip2"
type: "InnerProduct"
bottom: "ip1"
top: "ip2"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
inner_product_param {
num_output: 10
weight_filler {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
I0228 22:27:31.255280 19244 layer_factory.hpp:77] Creating layer input
I0228 22:27:31.255301 19244 net.cpp:84] Creating Layer input
I0228 22:27:31.255309 19244 net.cpp:380] input -> data
I0228 22:27:31.255336 19244 net.cpp:122] Setting up input
I0228 22:27:31.255348 19244 net.cpp:129] Top shape: 64 1 28 28 (50176)
I0228 22:27:31.255354 19244 net.cpp:137] Memory required for data: 200704
I0228 22:27:31.255360 19244 layer_factory.hpp:77] Creating layer conv1
I0228 22:27:31.291152 19244 net.cpp:84] Creating Layer conv1
I0228 22:27:31.291191 19244 net.cpp:406] conv1 <- data
I0228 22:27:31.291216 19244 net.cpp:380] conv1 -> conv1
I0228 22:27:31.292297 19244 net.cpp:122] Setting up conv1
I0228 22:27:31.292326 19244 net.cpp:129] Top shape: 64 20 24 24 (737280)
I0228 22:27:31.292331 19244 net.cpp:137] Memory required for data: 3149824
I0228 22:27:31.292357 19244 layer_factory.hpp:77] Creating layer pool1
I0228 22:27:31.292383 19244 net.cpp:84] Creating Layer pool1
I0228 22:27:31.292390 19244 net.cpp:406] pool1 <- conv1
I0228 22:27:31.292397 19244 net.cpp:380] pool1 -> pool1
I0228 22:27:31.292413 19244 net.cpp:122] Setting up pool1
I0228 22:27:31.292420 19244 net.cpp:129] Top shape: 64 20 12 12 (184320)
I0228 22:27:31.292429 19244 net.cpp:137] Memory required for data: 3887104
I0228 22:27:31.292433 19244 layer_factory.hpp:77] Creating layer conv2
I0228 22:27:31.292448 19244 net.cpp:84] Creating Layer conv2
I0228 22:27:31.292454 19244 net.cpp:406] conv2 <- pool1
I0228 22:27:31.292461 19244 net.cpp:380] conv2 -> conv2
I0228 22:27:31.292645 19244 net.cpp:122] Setting up conv2
I0228 22:27:31.292655 19244 net.cpp:129] Top shape: 64 50 8 8 (204800)
I0228 22:27:31.292660 19244 net.cpp:137] Memory required for data: 4706304
I0228 22:27:31.292670 19244 layer_factory.hpp:77] Creating layer pool2
I0228 22:27:31.292678 19244 net.cpp:84] Creating Layer pool2
I0228 22:27:31.292685 19244 net.cpp:406] pool2 <- conv2
I0228 22:27:31.292692 19244 net.cpp:380] pool2 -> pool2
I0228 22:27:31.292702 19244 net.cpp:122] Setting up pool2
I0228 22:27:31.292709 19244 net.cpp:129] Top shape: 64 50 4 4 (51200)
I0228 22:27:31.292714 19244 net.cpp:137] Memory required for data: 4911104
I0228 22:27:31.292719 19244 layer_factory.hpp:77] Creating layer ip1
I0228 22:27:31.292727 19244 net.cpp:84] Creating Layer ip1
I0228 22:27:31.292732 19244 net.cpp:406] ip1 <- pool2
I0228 22:27:31.292740 19244 net.cpp:380] ip1 -> ip1
I0228 22:27:31.295491 19244 net.cpp:122] Setting up ip1
I0228 22:27:31.295503 19244 net.cpp:129] Top shape: 64 500 (32000)
I0228 22:27:31.295508 19244 net.cpp:137] Memory required for data: 5039104
I0228 22:27:31.295517 19244 layer_factory.hpp:77] Creating layer relu1
I0228 22:27:31.295526 19244 net.cpp:84] Creating Layer relu1
I0228 22:27:31.295532 19244 net.cpp:406] relu1 <- ip1
I0228 22:27:31.295542 19244 net.cpp:367] relu1 -> ip1 (in-place)
I0228 22:27:31.295552 19244 net.cpp:122] Setting up relu1
I0228 22:27:31.295557 19244 net.cpp:129] Top shape: 64 500 (32000)
I0228 22:27:31.295563 19244 net.cpp:137] Memory required for data: 5167104
I0228 22:27:31.295568 19244 layer_factory.hpp:77] Creating layer ip2
I0228 22:27:31.295574 19244 net.cpp:84] Creating Layer ip2
I0228 22:27:31.295580 19244 net.cpp:406] ip2 <- ip1
I0228 22:27:31.295589 19244 net.cpp:380] ip2 -> ip2
I0228 22:27:31.295637 19244 net.cpp:122] Setting up ip2
I0228 22:27:31.295645 19244 net.cpp:129] Top shape: 64 10 (640)
I0228 22:27:31.295650 19244 net.cpp:137] Memory required for data: 5169664
I0228 22:27:31.295656 19244 net.cpp:200] ip2 does not need backward computation.
I0228 22:27:31.295662 19244 net.cpp:200] relu1 does not need backward computation.
I0228 22:27:31.295667 19244 net.cpp:200] ip1 does not need backward computation.
I0228 22:27:31.295672 19244 net.cpp:200] pool2 does not need backward computation.
I0228 22:27:31.295676 19244 net.cpp:200] conv2 does not need backward computation.
I0228 22:27:31.295681 19244 net.cpp:200] pool1 does not need backward computation.
I0228 22:27:31.295687 19244 net.cpp:200] conv1 does not need backward computation.
I0228 22:27:31.295693 19244 net.cpp:200] input does not need backward computation.
I0228 22:27:31.295698 19244 net.cpp:242] This network produces output ip2
I0228 22:27:31.295706 19244 net.cpp:255] Network initialization done.
I0228 22:27:31.328126 19244 upgrade_proto.cpp:53] Attempting to upgrade input file specified using deprecated V1LayerParameter: lenet/lenet.caffemodel
I0228 22:27:31.329046 19244 upgrade_proto.cpp:61] Successfully upgraded file specified using deprecated V1LayerParameter
I0228 22:27:31.329071 19244 net.cpp:744] Ignoring source layer mnist
I0228 22:27:31.329288 19244 net.cpp:744] Ignoring source layer loss
W0228 22:27:31.331320 19244 _caffe.cpp:135] DEPRECATION WARNING - deprecated use of Python interface
W0228 22:27:31.331334 19244 _caffe.cpp:136] Use this instead (with the named "weights" parameter):
W0228 22:27:31.331339 19244 _caffe.cpp:138] Net('lenet/lenet_accelerated_deploy.prototxt', 1, weights='lenet/lenet.caffemodel')
I0228 22:27:31.332758 19244 upgrade_proto.cpp:67] Attempting to upgrade input file specified using deprecated input fields: lenet/lenet_accelerated_deploy.prototxt
I0228 22:27:31.332777 19244 upgrade_proto.cpp:70] Successfully upgraded file specified using deprecated input fields.
W0228 22:27:31.332782 19244 upgrade_proto.cpp:72] Note that future Caffe releases will only support input layers and not input fields.
I0228 22:27:31.332880 19244 net.cpp:51] Initializing net from parameters:
state {
phase: TEST
level: 0
}
layer {
name: "input"
type: "Input"
top: "data"
input_param {
shape {
dim: 64
dim: 1
dim: 28
dim: 28
}
}
}
layer {
name: "conv1-1"
type: "Convolution"
bottom: "data"
top: "conv1-1"
convolution_param {
num_output: 4
kernel_size: 1
}
}
layer {
name: "conv1-2"
type: "Convolution"
bottom: "conv1-1"
top: "conv1-2"
convolution_param {
num_output: 4
group: 4
kernel_h: 5
kernel_w: 1
}
}
layer {
name: "conv1-3"
type: "Convolution"
bottom: "conv1-2"
top: "conv1-3"
convolution_param {
num_output: 4
group: 4
kernel_h: 1
kernel_w: 5
}
}
layer {
name: "conv1-4"
type: "Convolution"
bottom: "conv1-3"
top: "conv1"
convolution_param {
num_output: 20
kernel_size: 1
}
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "pool1"
top: "conv2"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 50
kernel_size: 5
stride: 1
weight_filler {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "ip1"
type: "InnerProduct"
bottom: "pool2"
top: "ip1"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
inner_product_param {
num_output: 500
weight_filler {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "relu1"
type: "ReLU"
bottom: "ip1"
top: "ip1"
}
layer {
name: "ip2"
type: "InnerProduct"
bottom: "ip1"
top: "ip2"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
inner_product_param {
num_output: 10
weight_filler {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
I0228 22:27:31.333292 19244 layer_factory.hpp:77] Creating layer input
I0228 22:27:31.333300 19244 net.cpp:84] Creating Layer input
I0228 22:27:31.333310 19244 net.cpp:380] input -> data
I0228 22:27:31.333324 19244 net.cpp:122] Setting up input
I0228 22:27:31.333333 19244 net.cpp:129] Top shape: 64 1 28 28 (50176)
I0228 22:27:31.333339 19244 net.cpp:137] Memory required for data: 200704
I0228 22:27:31.333344 19244 layer_factory.hpp:77] Creating layer conv1-1
I0228 22:27:31.333353 19244 net.cpp:84] Creating Layer conv1-1
I0228 22:27:31.333358 19244 net.cpp:406] conv1-1 <- data
I0228 22:27:31.333364 19244 net.cpp:380] conv1-1 -> conv1-1
I0228 22:27:31.333382 19244 net.cpp:122] Setting up conv1-1
I0228 22:27:31.333390 19244 net.cpp:129] Top shape: 64 4 28 28 (200704)
I0228 22:27:31.333395 19244 net.cpp:137] Memory required for data: 1003520
I0228 22:27:31.333405 19244 layer_factory.hpp:77] Creating layer conv1-2
I0228 22:27:31.333412 19244 net.cpp:84] Creating Layer conv1-2
I0228 22:27:31.333418 19244 net.cpp:406] conv1-2 <- conv1-1
I0228 22:27:31.333425 19244 net.cpp:380] conv1-2 -> conv1-2
I0228 22:27:31.333438 19244 net.cpp:122] Setting up conv1-2
I0228 22:27:31.333446 19244 net.cpp:129] Top shape: 64 4 24 28 (172032)
I0228 22:27:31.333451 19244 net.cpp:137] Memory required for data: 1691648
I0228 22:27:31.333458 19244 layer_factory.hpp:77] Creating layer conv1-3
I0228 22:27:31.333465 19244 net.cpp:84] Creating Layer conv1-3
I0228 22:27:31.333472 19244 net.cpp:406] conv1-3 <- conv1-2
I0228 22:27:31.333477 19244 net.cpp:380] conv1-3 -> conv1-3
I0228 22:27:31.333492 19244 net.cpp:122] Setting up conv1-3
I0228 22:27:31.333498 19244 net.cpp:129] Top shape: 64 4 24 24 (147456)
I0228 22:27:31.333503 19244 net.cpp:137] Memory required for data: 2281472
I0228 22:27:31.333511 19244 layer_factory.hpp:77] Creating layer conv1-4
I0228 22:27:31.333518 19244 net.cpp:84] Creating Layer conv1-4
I0228 22:27:31.333524 19244 net.cpp:406] conv1-4 <- conv1-3
I0228 22:27:31.333530 19244 net.cpp:380] conv1-4 -> conv1
I0228 22:27:31.333544 19244 net.cpp:122] Setting up conv1-4
I0228 22:27:31.333550 19244 net.cpp:129] Top shape: 64 20 24 24 (737280)
I0228 22:27:31.333555 19244 net.cpp:137] Memory required for data: 5230592
I0228 22:27:31.333562 19244 layer_factory.hpp:77] Creating layer pool1
I0228 22:27:31.333569 19244 net.cpp:84] Creating Layer pool1
I0228 22:27:31.333575 19244 net.cpp:406] pool1 <- conv1
I0228 22:27:31.333580 19244 net.cpp:380] pool1 -> pool1
I0228 22:27:31.333588 19244 net.cpp:122] Setting up pool1
I0228 22:27:31.333595 19244 net.cpp:129] Top shape: 64 20 12 12 (184320)
I0228 22:27:31.333600 19244 net.cpp:137] Memory required for data: 5967872
I0228 22:27:31.333606 19244 layer_factory.hpp:77] Creating layer conv2
I0228 22:27:31.333613 19244 net.cpp:84] Creating Layer conv2
I0228 22:27:31.333618 19244 net.cpp:406] conv2 <- pool1
I0228 22:27:31.333626 19244 net.cpp:380] conv2 -> conv2
I0228 22:27:31.333780 19244 net.cpp:122] Setting up conv2
I0228 22:27:31.333787 19244 net.cpp:129] Top shape: 64 50 8 8 (204800)
I0228 22:27:31.333793 19244 net.cpp:137] Memory required for data: 6787072
I0228 22:27:31.333801 19244 layer_factory.hpp:77] Creating layer pool2
I0228 22:27:31.333808 19244 net.cpp:84] Creating Layer pool2
I0228 22:27:31.333814 19244 net.cpp:406] pool2 <- conv2
I0228 22:27:31.333819 19244 net.cpp:380] pool2 -> pool2
I0228 22:27:31.333827 19244 net.cpp:122] Setting up pool2
I0228 22:27:31.333834 19244 net.cpp:129] Top shape: 64 50 4 4 (51200)
I0228 22:27:31.333839 19244 net.cpp:137] Memory required for data: 6991872
I0228 22:27:31.333844 19244 layer_factory.hpp:77] Creating layer ip1
I0228 22:27:31.333853 19244 net.cpp:84] Creating Layer ip1
I0228 22:27:31.333858 19244 net.cpp:406] ip1 <- pool2
I0228 22:27:31.333864 19244 net.cpp:380] ip1 -> ip1
I0228 22:27:31.336172 19244 net.cpp:122] Setting up ip1
I0228 22:27:31.336184 19244 net.cpp:129] Top shape: 64 500 (32000)
I0228 22:27:31.336189 19244 net.cpp:137] Memory required for data: 7119872
I0228 22:27:31.336196 19244 layer_factory.hpp:77] Creating layer relu1
I0228 22:27:31.336203 19244 net.cpp:84] Creating Layer relu1
I0228 22:27:31.336210 19244 net.cpp:406] relu1 <- ip1
I0228 22:27:31.336215 19244 net.cpp:367] relu1 -> ip1 (in-place)
I0228 22:27:31.336222 19244 net.cpp:122] Setting up relu1
I0228 22:27:31.336228 19244 net.cpp:129] Top shape: 64 500 (32000)
I0228 22:27:31.336233 19244 net.cpp:137] Memory required for data: 7247872
I0228 22:27:31.336238 19244 layer_factory.hpp:77] Creating layer ip2
I0228 22:27:31.336244 19244 net.cpp:84] Creating Layer ip2
I0228 22:27:31.336249 19244 net.cpp:406] ip2 <- ip1
I0228 22:27:31.336254 19244 net.cpp:380] ip2 -> ip2
I0228 22:27:31.336294 19244 net.cpp:122] Setting up ip2
I0228 22:27:31.336300 19244 net.cpp:129] Top shape: 64 10 (640)
I0228 22:27:31.336305 19244 net.cpp:137] Memory required for data: 7250432
I0228 22:27:31.336311 19244 net.cpp:200] ip2 does not need backward computation.
I0228 22:27:31.336315 19244 net.cpp:200] relu1 does not need backward computation.
I0228 22:27:31.336320 19244 net.cpp:200] ip1 does not need backward computation.
I0228 22:27:31.336325 19244 net.cpp:200] pool2 does not need backward computation.
I0228 22:27:31.336330 19244 net.cpp:200] conv2 does not need backward computation.
I0228 22:27:31.336335 19244 net.cpp:200] pool1 does not need backward computation.
I0228 22:27:31.336340 19244 net.cpp:200] conv1-4 does not need backward computation.
I0228 22:27:31.336345 19244 net.cpp:200] conv1-3 does not need backward computation.
I0228 22:27:31.336350 19244 net.cpp:200] conv1-2 does not need backward computation.
I0228 22:27:31.336367 19244 net.cpp:200] conv1-1 does not need backward computation.
I0228 22:27:31.336372 19244 net.cpp:200] input does not need backward computation.
I0228 22:27:31.336377 19244 net.cpp:242] This network produces output ip2
I0228 22:27:31.336386 19244 net.cpp:255] Network initialization done.
I0228 22:27:31.338426 19244 upgrade_proto.cpp:53] Attempting to upgrade input file specified using deprecated V1LayerParameter: lenet/lenet.caffemodel
I0228 22:27:31.339273 19244 upgrade_proto.cpp:61] Successfully upgraded file specified using deprecated V1LayerParameter
I0228 22:27:31.339293 19244 net.cpp:744] Ignoring source layer mnist
I0228 22:27:31.339298 19244 net.cpp:744] Ignoring source layer conv1
I0228 22:27:31.339495 19244 net.cpp:744] Ignoring source layer loss
Traceback (most recent call last):
File "./lenet/main.py", line 17, in <module>
prepare_models(LAYER, R, NET_PATH, NET_NAME, INPUT_DIM)
File "/home/yizhou/test/cp-decomposition/config_processing.py", line 107, in prepare_models
w = net.layers[l].blobs[0].data
IndexError: Index out of range
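The traceback shows the crash at `net.layers[l].blobs[0].data`, which fails whenever index `l` lands on a layer with no learnable parameters (an Input, Pooling, or ReLU layer has an empty `blobs` list). In the accelerated net, one conv is replaced by four, so a fixed LAYER index can easily point at the wrong runtime layer. A defensive sketch that looks the layer up by name instead of position (`conv_weights` is a hypothetical helper, assuming the standard pycaffe attributes `net._layer_names` and `net.layers`, which run in parallel):

```python
def conv_weights(net, layer_name):
    """Fetch a layer's weight blob by NAME rather than positional index,
    so inserting or removing layers cannot shift the lookup.
    Hypothetical helper; assumes pycaffe's parallel net._layer_names
    and net.layers sequences."""
    names = list(net._layer_names)
    idx = names.index(layer_name)       # raises ValueError if absent
    layer = net.layers[idx]
    if len(layer.blobs) == 0:
        raise ValueError('layer %r has no learnable blobs' % layer_name)
    return layer.blobs[0].data
```

Checking `len(layer.blobs)` before indexing turns the opaque `IndexError` into a message naming the offending layer.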
What is cpd in the MATLAB script?

MATLAB doesn't have a built-in function with this name.
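`cpd` is indeed not a built-in MATLAB function; it most likely comes from the Tensorlab toolbox, which provides `cpd` for canonical polyadic decomposition and must be installed and on the MATLAB path separately. If Tensorlab is unavailable, the same decomposition can be sketched in plain NumPy with alternating least squares. This is a rough stand-in rather than the repository's code, written for a 3-way tensor to stay short (conv kernels would use the 4-way analogue):

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product: column r is kron(U[:, r], V[:, r])."""
    I, R = U.shape
    J, _ = V.shape
    return (U[:, None, :] * V[None, :, :]).reshape(I * J, R)

def cp_als(X, rank, n_iter=300, seed=0):
    """CP/PARAFAC of a 3-way tensor via alternating least squares.
    Returns factor matrices A (I x R), B (J x R), C (K x R) with
    X[i, j, k] ~= sum_r A[i, r] * B[j, r] * C[k, r]."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    # Mode unfoldings (C-order: the last remaining axis varies fastest).
    X1 = X.reshape(I, J * K)
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)
    for _ in range(n_iter):
        A = X1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```

Each ALS step solves a linear least-squares problem for one factor while holding the others fixed, which is the same strategy Tensorlab's `cpd` refines with better initialization and stopping rules.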