Comments (3)
Yeah, I have been puzzled by this in the past as well. The not entirely satisfying answer is that TfLite and TFLM currently implicitly assume that `int` is 32 bits.
In this particular case we have two different functions that expect per_channel_output_shift to be an int* and an int32_t*, respectively:
tflite-micro/tensorflow/lite/micro/kernels/depthwise_conv.cc
Lines 71 to 73 in 68dd0cf
tflite-micro/tensorflow/lite/micro/kernels/depthwise_conv_common.cc
Lines 111 to 117 in f583f92
tflite-micro/tensorflow/lite/kernels/kernel_util.cc
Lines 210 to 215 in 68dd0cf
from tflite-micro.
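The mismatch described above can be sketched as follows. This is a minimal, self-contained illustration, not the actual TFLM code: `FakeOpData`, `PopulateShifts`, and `FillOpData` are hypothetical stand-ins for the kernel op data, `PopulateConvolutionQuantizationParams`, and the call site.

```cpp
#include <cassert>
#include <cstdint>

// Stand-in for the kernel's op data, which declares the shifts as int.
struct FakeOpData {
  int per_channel_output_shift[2];
};

// Stand-in for a helper that writes the shifts through an int32_t*.
inline void PopulateShifts(int32_t* out, int n) {
  for (int i = 0; i < n; ++i) out[i] = -(i + 1);
}

// The only way to pass the int array to the int32_t* parameter is the
// cast below; it is well-behaved only while sizeof(int) == 4, which is
// the implicit assumption mentioned in the thread.
inline void FillOpData(FakeOpData* data) {
  static_assert(sizeof(int) == sizeof(int32_t),
                "TFLM implicitly assumes 32-bit int");
  PopulateShifts(reinterpret_cast<int32_t*>(data->per_channel_output_shift),
                 2);
}
```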
I suppose 32-bit integers are a fairly safe assumption, but it would be nice to make the type names consistent.
Would you be open to a patch to make the PopulateConvolutionQuantizationParams() per_channel_shift parameter an int32_t*, and remove the reinterpret_cast<>s?
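The proposed cleanup would look roughly like this (again with hypothetical stand-in names, not the real TFLM symbols): declare the storage as `int32_t` and take `int32_t*` throughout, so the cast disappears at the call site.

```cpp
#include <cassert>
#include <cstdint>

// Stand-in for the op data after the proposed change: int32_t storage.
struct FakeOpDataFixed {
  int32_t per_channel_output_shift[2];
};

// Stand-in helper keeping the int32_t* signature.
inline void PopulateShiftsFixed(int32_t* out, int n) {
  for (int i = 0; i < n; ++i) out[i] = -(i + 1);
}

// With consistent types, the call site needs no reinterpret_cast.
inline void FillOpDataFixed(FakeOpDataFixed* data) {
  PopulateShiftsFixed(data->per_channel_output_shift, 2);
}
```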
Yeah, that's going to help. The change to PopulateConvolutionQuantizationParams will have to go into the tensorflow/tensorflow repo, and it will then get synced to the current repo automatically.