leonardo-blanger / detr_tensorflow
A Tensorflow implementation of the DETR object detection architecture.
License: MIT License
Hello,
I am very interested in the DETR model and want to run it in C++, so I plan to convert your model into the TFLite format.
First, I load your pretrained weights and build the model with a custom input shape (1, 320, 320, 3).
But I cannot convert the model to the TFLite format; the error is:
ValueError: Model <detr_tensorflow.models.detr.DETR object at 0x7f0620399f10> cannot be saved because the input shapes have not been set. Usually, input shapes are automatically determined from calling `.fit()` or `.predict()`. To manually set the shapes, call `model.build(input_shape)`
My code is as below:
import tensorflow as tf
from detr_tensorflow.models.default import build_detr_resnet50
detr = build_detr_resnet50(num_classes=91) # 91 classes for the COCO dataset
detr.build([(1, 320, 320, 3), (None, None, None)])
detr.load_weights("detr-r50-e632da11.h5")
converter = tf.lite.TFLiteConverter.from_keras_model(detr)
tflite_model = converter.convert()
# Save the model.
with open('detr-r50-e632da11.tflite', 'wb') as f:
f.write(tflite_model)
How can I resolve this issue? Can anyone help with this?
Thank you very much!
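One workaround that may help here (not from the repository author; just a minimal sketch assuming TF 2.x and that the model is called with an (images, masks) pair, as the build shapes above suggest) is to trace a tf.function with fully defined input shapes and convert from the resulting concrete function instead of from the Keras model:

import tensorflow as tf
from detr_tensorflow.models.default import build_detr_resnet50

detr = build_detr_resnet50(num_classes=91)
detr.build([(1, 320, 320, 3), (1, 320, 320)])
detr.load_weights("detr-r50-e632da11.h5")

# Fix every dimension so the TFLite converter never sees unknown shapes.
# The mask dtype below is an assumption; adjust it to whatever the model expects.
@tf.function(input_signature=[
    tf.TensorSpec([1, 320, 320, 3], tf.float32),
    tf.TensorSpec([1, 320, 320], tf.bool),
])
def serving_fn(images, masks):
    return detr((images, masks), training=False)

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [serving_fn.get_concrete_function()])
tflite_model = converter.convert()
with open("detr-r50-e632da11.tflite", "wb") as f:
    f.write(tflite_model)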
Hello. The How to Use section mentions that the weights can be downloaded from a Google Drive link. However, that Google Drive folder is empty. Could you please let me know how I can get access to the weight files?
I am also an ML enthusiast who came across this paper and wanted to replicate it entirely in TensorFlow. I have almost completed the project, but due to a lack of knowledge I am unable to put in the last piece, which sits at the heart of DETR: the implementation of its loss calculation. I have tried many approaches, but that particular part is the one I cannot implement in TensorFlow, so any help would be greatly appreciated.
You can contact me by email if you want to see my implementation. My email id is: [email protected]
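For reference, the part that is usually hardest to port is the bipartite matching between predictions and ground truth. Below is a minimal sketch of that step in TensorFlow (my own illustration, not this repository's code); it omits the generalized IoU term that the full DETR cost also uses, and relies on scipy for the assignment:

import tensorflow as tf
from scipy.optimize import linear_sum_assignment

def hungarian_match(pred_logits, pred_boxes, gt_labels, gt_boxes,
                    class_weight=1.0, l1_weight=5.0):
    """Match predicted queries to ground-truth boxes for a single image.

    pred_logits: (num_queries, num_classes) raw logits
    pred_boxes:  (num_queries, 4) boxes as (cx, cy, w, h), normalized
    gt_labels:   (num_gt,) integer class ids
    gt_boxes:    (num_gt, 4) boxes as (cx, cy, w, h), normalized
    Returns (pred_indices, gt_indices) giving the optimal pairing.
    """
    probs = tf.nn.softmax(pred_logits, axis=-1)            # (Q, C)
    cost_class = -tf.gather(probs, gt_labels, axis=1)      # (Q, G)
    cost_l1 = tf.norm(pred_boxes[:, None, :] - gt_boxes[None, :, :],
                      ord=1, axis=-1)                      # (Q, G)
    cost = class_weight * cost_class + l1_weight * cost_l1
    pred_idx, gt_idx = linear_sum_assignment(cost.numpy())
    return pred_idx, gt_idx

The matched pairs then feed the classification and box regression losses, with unmatched queries assigned to the "no object" class.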
Trying to save the model using tf.keras.models.save_model leads to a TypeError:
File "detr_tf/networks/transformer.py", line 211, in call
key_mem = memory + pos_encoding
TypeError: unsupported operand type(s) for +: 'NoneType' and 'NoneType'
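This looks like the same root cause as the TFLite issue above: when a subclassed model is saved, it gets re-traced with unknown shapes, so the positional encoding ends up as None. One workaround that may avoid it (a sketch only, assuming the model is called with an (images, masks) pair at a fixed resolution) is to export through tf.saved_model.save with an explicit concrete signature instead of tf.keras.models.save_model:

import tensorflow as tf

@tf.function(input_signature=[
    tf.TensorSpec([1, 320, 320, 3], tf.float32),
    tf.TensorSpec([1, 320, 320], tf.bool),
])
def serve(images, masks):
    return detr((images, masks), training=False)

# An explicit signature keeps the saver from re-tracing with None shapes.
tf.saved_model.save(detr, "detr_saved_model",
                    signatures={"serving_default": serve.get_concrete_function()})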
Put the following code right after `import tensorflow as tf` in demo.py:
config = tf.compat.v1.ConfigProto()
config.gpu_options.allow_growth = True
session = tf.compat.v1.Session(config=config)
Otherwise you will get: E tensorflow/stream_executor/cuda/cuda_blas.cc:226] failed to create cublas handle: CUBLAS_STATUS_NOT_INITIALIZED
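The same effect can be achieved with the native TF 2.x API (a small sketch; it enables on-demand GPU memory growth just like the compat snippet above):

import tensorflow as tf

# Grow GPU memory on demand instead of pre-allocating all of it,
# which is what usually triggers the cuBLAS initialization failure.
for gpu in tf.config.list_physical_devices("GPU"):
    tf.config.experimental.set_memory_growth(gpu, True)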
Thank you for this work.
I have a question rather than an issue. I have trained a DETR model using the PyTorch implementation, and now I would like to convert the model into a TF model. I think your implementation allows this, except that I am not sure how you converted the weights to a .h5 file.
Would you please shed some light on this? Thank you.
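I do not know how the author produced the .h5 file, but one common approach (a sketch, not the repository's actual script) is to dump the PyTorch state_dict to numpy arrays and write them into an HDF5 file, then translate the names and kernel layouts on the TensorFlow side when loading:

import h5py
import torch

# The official DETR checkpoints store the weights under the "model" key;
# adjust the path and key for your own checkpoint.
ckpt = torch.load("detr-r50-e632da11.pth", map_location="cpu")
state_dict = ckpt.get("model", ckpt)

with h5py.File("detr-r50-e632da11.h5", "w") as f:
    for name, tensor in state_dict.items():
        # One dataset per parameter, keyed by its PyTorch name. A loader on
        # the TF side still needs to rename entries and transpose conv /
        # linear kernels to TensorFlow's layout.
        f.create_dataset(name, data=tensor.numpy())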
When I try to load the converted pickle weights I get errors on some layers. I tried to skip those weights, but without them the network cannot return good results.
I am using tensorflow-gpu==2.3.0rc1 and torch==1.6.0+cpu.
I made some slight changes to the original weight loader to print the layers that raise errors (a sketch of that kind of change follows the log below):
Loading backbone/0/body/conv1/kernel
Loading backbone/0/body/layer1/0/conv1/kernel
Loading backbone/0/body/layer1/0/conv2/kernel
Loading backbone/0/body/layer1/0/conv3/kernel
Loading 0/kernel > Error!
Loading backbone/0/body/layer1/1/conv1/kernel
Loading backbone/0/body/layer1/1/conv2/kernel
Loading backbone/0/body/layer1/1/conv3/kernel
Loading backbone/0/body/layer1/2/conv1/kernel
Loading backbone/0/body/layer1/2/conv2/kernel
Loading backbone/0/body/layer1/2/conv3/kernel
Loading backbone/0/body/layer2/0/conv1/kernel
Loading backbone/0/body/layer2/0/conv2/kernel
Loading backbone/0/body/layer2/0/conv3/kernel
Loading 0/kernel > Error!
Loading backbone/0/body/layer2/1/conv1/kernel
Loading backbone/0/body/layer2/1/conv2/kernel
Loading backbone/0/body/layer2/1/conv3/kernel
Loading backbone/0/body/layer2/2/conv1/kernel
Loading backbone/0/body/layer2/2/conv2/kernel
Loading backbone/0/body/layer2/2/conv3/kernel
Loading backbone/0/body/layer2/3/conv1/kernel
Loading backbone/0/body/layer2/3/conv2/kernel
Loading backbone/0/body/layer2/3/conv3/kernel
Loading backbone/0/body/layer3/0/conv1/kernel
Loading backbone/0/body/layer3/0/conv2/kernel
Loading backbone/0/body/layer3/0/conv3/kernel
Loading 0/kernel > Error!
Loading backbone/0/body/layer3/1/conv1/kernel
Loading backbone/0/body/layer3/1/conv2/kernel
Loading backbone/0/body/layer3/1/conv3/kernel
Loading backbone/0/body/layer3/2/conv1/kernel
Loading backbone/0/body/layer3/2/conv2/kernel
Loading backbone/0/body/layer3/2/conv3/kernel
Loading backbone/0/body/layer3/3/conv1/kernel
Loading backbone/0/body/layer3/3/conv2/kernel
Loading backbone/0/body/layer3/3/conv3/kernel
Loading backbone/0/body/layer3/4/conv1/kernel
Loading backbone/0/body/layer3/4/conv2/kernel
Loading backbone/0/body/layer3/4/conv3/kernel
Loading backbone/0/body/layer3/5/conv1/kernel
Loading backbone/0/body/layer3/5/conv2/kernel
Loading backbone/0/body/layer3/5/conv3/kernel
Loading backbone/0/body/layer4/0/conv1/kernel
Loading backbone/0/body/layer4/0/conv2/kernel
Loading backbone/0/body/layer4/0/conv3/kernel
Loading 0/kernel > Error!
Loading backbone/0/body/layer4/1/conv1/kernel
Loading backbone/0/body/layer4/1/conv2/kernel
Loading backbone/0/body/layer4/1/conv3/kernel
Loading backbone/0/body/layer4/2/conv1/kernel
Loading backbone/0/body/layer4/2/conv2/kernel
Loading backbone/0/body/layer4/2/conv3/kernel
Loading transformer/encoder/layers/0/self_attn/in_proj_weight
Loading transformer/encoder/layers/0/self_attn/in_proj_bias
Loading transformer/encoder/layers/0/self_attn/out_proj/kernel
Loading transformer/encoder/layers/0/self_attn/out_proj/bias
Loading transformer/encoder/layers/0/linear1/kernel
Loading transformer/encoder/layers/0/linear1/bias
Loading transformer/encoder/layers/0/linear2/kernel
Loading transformer/encoder/layers/0/linear2/bias
Loading transformer/encoder/layers/0/norm1/gamma
Loading transformer/encoder/layers/0/norm1/beta
Loading transformer/encoder/layers/0/norm2/gamma
Loading transformer/encoder/layers/0/norm2/beta
Loading transformer/encoder/layers/1/self_attn/in_proj_weight
Loading transformer/encoder/layers/1/self_attn/in_proj_bias
Loading transformer/encoder/layers/1/self_attn/out_proj/kernel
Loading transformer/encoder/layers/1/self_attn/out_proj/bias
Loading transformer/encoder/layers/1/linear1/kernel
Loading transformer/encoder/layers/1/linear1/bias
Loading transformer/encoder/layers/1/linear2/kernel
Loading transformer/encoder/layers/1/linear2/bias
Loading transformer/encoder/layers/1/norm1/gamma
Loading transformer/encoder/layers/1/norm1/beta
Loading transformer/encoder/layers/1/norm2/gamma
Loading transformer/encoder/layers/1/norm2/beta
Loading transformer/encoder/layers/2/self_attn/in_proj_weight
Loading transformer/encoder/layers/2/self_attn/in_proj_bias
Loading transformer/encoder/layers/2/self_attn/out_proj/kernel
Loading transformer/encoder/layers/2/self_attn/out_proj/bias
Loading transformer/encoder/layers/2/linear1/kernel
Loading transformer/encoder/layers/2/linear1/bias
Loading transformer/encoder/layers/2/linear2/kernel
Loading transformer/encoder/layers/2/linear2/bias
Loading transformer/encoder/layers/2/norm1/gamma
Loading transformer/encoder/layers/2/norm1/beta
Loading transformer/encoder/layers/2/norm2/gamma
Loading transformer/encoder/layers/2/norm2/beta
Loading transformer/encoder/layers/3/self_attn/in_proj_weight
Loading transformer/encoder/layers/3/self_attn/in_proj_bias
Loading transformer/encoder/layers/3/self_attn/out_proj/kernel
Loading transformer/encoder/layers/3/self_attn/out_proj/bias
Loading transformer/encoder/layers/3/linear1/kernel
Loading transformer/encoder/layers/3/linear1/bias
Loading transformer/encoder/layers/3/linear2/kernel
Loading transformer/encoder/layers/3/linear2/bias
Loading transformer/encoder/layers/3/norm1/gamma
Loading transformer/encoder/layers/3/norm1/beta
Loading transformer/encoder/layers/3/norm2/gamma
Loading transformer/encoder/layers/3/norm2/beta
Loading transformer/encoder/layers/4/self_attn/in_proj_weight
Loading transformer/encoder/layers/4/self_attn/in_proj_bias
Loading transformer/encoder/layers/4/self_attn/out_proj/kernel
Loading transformer/encoder/layers/4/self_attn/out_proj/bias
Loading transformer/encoder/layers/4/linear1/kernel
Loading transformer/encoder/layers/4/linear1/bias
Loading transformer/encoder/layers/4/linear2/kernel
Loading transformer/encoder/layers/4/linear2/bias
Loading transformer/encoder/layers/4/norm1/gamma
Loading transformer/encoder/layers/4/norm1/beta
Loading transformer/encoder/layers/4/norm2/gamma
Loading transformer/encoder/layers/4/norm2/beta
Loading transformer/encoder/layers/5/self_attn/in_proj_weight
Loading transformer/encoder/layers/5/self_attn/in_proj_bias
Loading transformer/encoder/layers/5/self_attn/out_proj/kernel
Loading transformer/encoder/layers/5/self_attn/out_proj/bias
Loading transformer/encoder/layers/5/linear1/kernel
Loading transformer/encoder/layers/5/linear1/bias
Loading transformer/encoder/layers/5/linear2/kernel
Loading transformer/encoder/layers/5/linear2/bias
Loading transformer/encoder/layers/5/norm1/gamma
Loading transformer/encoder/layers/5/norm1/beta
Loading transformer/encoder/layers/5/norm2/gamma
Loading transformer/encoder/layers/5/norm2/beta
Loading transformer/decoder/layers/0/self_attn/in_proj_weight
Loading transformer/decoder/layers/0/self_attn/in_proj_bias
Loading transformer/decoder/layers/0/self_attn/out_proj/kernel
Loading transformer/decoder/layers/0/self_attn/out_proj/bias
Loading transformer/decoder/layers/0/multihead_attn/in_proj_weight
Loading transformer/decoder/layers/0/multihead_attn/in_proj_bias
Loading transformer/decoder/layers/0/multihead_attn/out_proj/kernel
Loading transformer/decoder/layers/0/multihead_attn/out_proj/bias
Loading transformer/decoder/layers/0/linear1/kernel
Loading transformer/decoder/layers/0/linear1/bias
Loading transformer/decoder/layers/0/linear2/kernel
Loading transformer/decoder/layers/0/linear2/bias
Loading transformer/decoder/layers/0/norm1/gamma
Loading transformer/decoder/layers/0/norm1/beta
Loading transformer/decoder/layers/0/norm2/gamma
Loading transformer/decoder/layers/0/norm2/beta
Loading transformer/decoder/layers/0/norm3/gamma
Loading transformer/decoder/layers/0/norm3/beta
Loading transformer/decoder/layers/1/self_attn/in_proj_weight
Loading transformer/decoder/layers/1/self_attn/in_proj_bias
Loading transformer/decoder/layers/1/self_attn/out_proj/kernel
Loading transformer/decoder/layers/1/self_attn/out_proj/bias
Loading transformer/decoder/layers/1/multihead_attn/in_proj_weight
Loading transformer/decoder/layers/1/multihead_attn/in_proj_bias
Loading transformer/decoder/layers/1/multihead_attn/out_proj/kernel
Loading transformer/decoder/layers/1/multihead_attn/out_proj/bias
Loading transformer/decoder/layers/1/linear1/kernel
Loading transformer/decoder/layers/1/linear1/bias
Loading transformer/decoder/layers/1/linear2/kernel
Loading transformer/decoder/layers/1/linear2/bias
Loading transformer/decoder/layers/1/norm1/gamma
Loading transformer/decoder/layers/1/norm1/beta
Loading transformer/decoder/layers/1/norm2/gamma
Loading transformer/decoder/layers/1/norm2/beta
Loading transformer/decoder/layers/1/norm3/gamma
Loading transformer/decoder/layers/1/norm3/beta
Loading transformer/decoder/layers/2/self_attn/in_proj_weight
Loading transformer/decoder/layers/2/self_attn/in_proj_bias
Loading transformer/decoder/layers/2/self_attn/out_proj/kernel
Loading transformer/decoder/layers/2/self_attn/out_proj/bias
Loading transformer/decoder/layers/2/multihead_attn/in_proj_weight
Loading transformer/decoder/layers/2/multihead_attn/in_proj_bias
Loading transformer/decoder/layers/2/multihead_attn/out_proj/kernel
Loading transformer/decoder/layers/2/multihead_attn/out_proj/bias
Loading transformer/decoder/layers/2/linear1/kernel
Loading transformer/decoder/layers/2/linear1/bias
Loading transformer/decoder/layers/2/linear2/kernel
Loading transformer/decoder/layers/2/linear2/bias
Loading transformer/decoder/layers/2/norm1/gamma
Loading transformer/decoder/layers/2/norm1/beta
Loading transformer/decoder/layers/2/norm2/gamma
Loading transformer/decoder/layers/2/norm2/beta
Loading transformer/decoder/layers/2/norm3/gamma
Loading transformer/decoder/layers/2/norm3/beta
Loading transformer/decoder/layers/3/self_attn/in_proj_weight
Loading transformer/decoder/layers/3/self_attn/in_proj_bias
Loading transformer/decoder/layers/3/self_attn/out_proj/kernel
Loading transformer/decoder/layers/3/self_attn/out_proj/bias
Loading transformer/decoder/layers/3/multihead_attn/in_proj_weight
Loading transformer/decoder/layers/3/multihead_attn/in_proj_bias
Loading transformer/decoder/layers/3/multihead_attn/out_proj/kernel
Loading transformer/decoder/layers/3/multihead_attn/out_proj/bias
Loading transformer/decoder/layers/3/linear1/kernel
Loading transformer/decoder/layers/3/linear1/bias
Loading transformer/decoder/layers/3/linear2/kernel
Loading transformer/decoder/layers/3/linear2/bias
Loading transformer/decoder/layers/3/norm1/gamma
Loading transformer/decoder/layers/3/norm1/beta
Loading transformer/decoder/layers/3/norm2/gamma
Loading transformer/decoder/layers/3/norm2/beta
Loading transformer/decoder/layers/3/norm3/gamma
Loading transformer/decoder/layers/3/norm3/beta
Loading transformer/decoder/layers/4/self_attn/in_proj_weight
Loading transformer/decoder/layers/4/self_attn/in_proj_bias
Loading transformer/decoder/layers/4/self_attn/out_proj/kernel
Loading transformer/decoder/layers/4/self_attn/out_proj/bias
Loading transformer/decoder/layers/4/multihead_attn/in_proj_weight
Loading transformer/decoder/layers/4/multihead_attn/in_proj_bias
Loading transformer/decoder/layers/4/multihead_attn/out_proj/kernel
Loading transformer/decoder/layers/4/multihead_attn/out_proj/bias
Loading transformer/decoder/layers/4/linear1/kernel
Loading transformer/decoder/layers/4/linear1/bias
Loading transformer/decoder/layers/4/linear2/kernel
Loading transformer/decoder/layers/4/linear2/bias
Loading transformer/decoder/layers/4/norm1/gamma
Loading transformer/decoder/layers/4/norm1/beta
Loading transformer/decoder/layers/4/norm2/gamma
Loading transformer/decoder/layers/4/norm2/beta
Loading transformer/decoder/layers/4/norm3/gamma
Loading transformer/decoder/layers/4/norm3/beta
Loading transformer/decoder/layers/5/self_attn/in_proj_weight
Loading transformer/decoder/layers/5/self_attn/in_proj_bias
Loading transformer/decoder/layers/5/self_attn/out_proj/kernel
Loading transformer/decoder/layers/5/self_attn/out_proj/bias
Loading transformer/decoder/layers/5/multihead_attn/in_proj_weight
Loading transformer/decoder/layers/5/multihead_attn/in_proj_bias
Loading transformer/decoder/layers/5/multihead_attn/out_proj/kernel
Loading transformer/decoder/layers/5/multihead_attn/out_proj/bias
Loading transformer/decoder/layers/5/linear1/kernel
Loading transformer/decoder/layers/5/linear1/bias
Loading transformer/decoder/layers/5/linear2/kernel
Loading transformer/decoder/layers/5/linear2/bias
Loading transformer/decoder/layers/5/norm1/gamma
Loading transformer/decoder/layers/5/norm1/beta
Loading transformer/decoder/layers/5/norm2/gamma
Loading transformer/decoder/layers/5/norm2/beta
Loading transformer/decoder/layers/5/norm3/gamma
Loading transformer/decoder/layers/5/norm3/beta
Loading transformer/decoder/norm/gamma
Loading transformer/decoder/norm/beta
Loading input_proj/kernel
Loading input_proj/bias
Loading class_embed/kernel
Loading class_embed/bias
Loading layers/0/kernel > Error!
Loading layers/0/bias > Error!
Loading layers/1/kernel > Error!
Loading layers/1/bias > Error!
Loading layers/2/kernel > Error!
Loading layers/2/bias > Error!
Loading query_embed/kernel
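For reference, a minimal illustration of the kind of logging wrapper described above (my own sketch, not the poster's actual change, assuming the loader iterates over TF variables and looks them up by name in the pickled dict):

def load_weights_verbose(name_to_array, variables):
    """Assign pickled numpy arrays to TF variables, logging any mismatch.

    name_to_array: dict mapping weight names to numpy arrays (from the pickle)
    variables:     list of tf.Variable objects whose names should match
    """
    for var in variables:
        name = var.name.split(":")[0]
        print("Loading " + name, end="")
        try:
            var.assign(name_to_array[name])
            print()
        except (KeyError, ValueError):
            # Missing entry or shape mismatch: report it and keep going.
            print(" > Error!")

Judging from where the failures occur (the first block of each ResNet stage and the three layers after class_embed), they may correspond to the downsample convolutions and the bbox_embed MLP, which would point to a naming or ordering mismatch rather than corrupted weights, but that is only a guess.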