Comments (10)
Hi @xhlulu
Thank you very much for the question. Sorry, we do not have documentation for this (yet).
Besides the tips from @nunescoelho, for now you can follow example/example_keras_to_qkeras.py to convert a model to its quantized counterpart; it shows how model_quantize is used.
Then call qmodel.load_weights("Your_weight_file")
just as you normally would.
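A minimal sketch of that conversion step, based on the example file mentioned above. The layer names ("conv2d_0") and bit widths in the config below are assumptions; adapt them to your own model:

```python
# Map layer names (or QKeras layer class names) to the quantizers they
# should use. The strings are evaluated by qkeras; "quantized_bits(4,0,1)"
# means 4 bits, 0 integer bits, keep_negative=1 (an assumed choice here).
quantizer_config = {
    "conv2d_0": {
        "kernel_quantizer": "quantized_bits(4,0,1)",
        "bias_quantizer": "quantized_bits(4,0,1)",
    },
    "QActivation": {"relu": "quantized_relu(3,0)"},
}

# Sketch of the actual conversion (requires qkeras to be installed):
# from qkeras.utils import model_quantize
# # activation_bits applies to activations without an explicit entry above;
# # weight transfer from the original model is optional.
# qmodel = model_quantize(model, quantizer_config, 4)
# qmodel.load_weights("Your_weight_file")
```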
If you want to see the quantized weights, you might try
model_save_quantized_weights from here
(line 1503 in commit 92ec6d3), or iterate over the layers yourself:
from tensorflow.keras import backend as K  # K.eval converts tensors to numpy arrays

for layer in qmodel.layers:
    try:
        if layer.get_quantizers():
            q_w_pairs = zip(layer.get_quantizers(), layer.get_weights())
            for quantizer, weight in q_w_pairs:
                qweight = K.eval(quantizer(weight))
                print("quantized weight")
                print(qweight)
    except AttributeError:
        print("warning, the weight is not quantized in the layer %s" % layer.name)
@nunescoelho it would be helpful if we documented this process.
from qkeras.
Remember that QKeras layers only change the behavior of the forward pass (that's the straight-through estimator), so look at these functions:
- model_save_quantized_weights
- model_quantize
In particular, model_quantize can optionally transfer the weights from the original Keras model to the quantized model at the end.
And remember, as a rule of thumb: every arithmetic operation needs to be followed by a quantizer if you want the QKeras model to mimic an actual implementation.
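A rough illustration of what "only the forward pass changes" means: the quantizer snaps a value to a fixed-point grid in the forward direction, while the straight-through estimator lets gradients flow through as if the rounding were the identity. The function below is a simplified sketch of that idea, not qkeras's exact quantized_bits:

```python
def fixed_point_quantize(x, bits=4, integer=0):
    """Round x to a signed fixed-point grid with `bits` total bits and
    `integer` bits before the binary point (a simplification; qkeras's
    quantized_bits has more options such as symmetric and alpha)."""
    step = 2.0 ** -(bits - integer - 1)   # smallest representable step
    max_val = 2.0 ** integer - step       # largest positive code
    q = round(x / step) * step            # snap to the grid
    return max(-2.0 ** integer, min(max_val, q))  # clip to the range

# Forward pass uses the quantized value. During training, the
# straight-through estimator treats the rounding as identity for the
# backward pass, e.g. in TensorFlow:
#   y = x + tf.stop_gradient(fixed_point_quantize(x) - x)
print(fixed_point_quantize(0.3))
```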
I was also applying QKeras to a pretrained model, but unfortunately my validation accuracy was very low after using model_quantize.
Is retraining necessary, i.e. can QKeras only be used for quantization-aware training? It seems to me that loading a pretrained model and quantizing it (post-training quantization) does not work.
And is there any documentation on using QKeras with pretrained models so far?
Best regards and many thanks,
asti205
@asti205 retraining is necessary, or you can train with QKeras directly by converting your Keras model into its QKeras version.
Hello @nunescoelho ,
thank you for the explanation, but that part is already clear to me :)
The reason I was asking is that I could get a validation accuracy that was magnitudes higher using the TFLite converter, so I was explicitly interested in post-training quantization. However, I could also get much higher accuracy with my own quantizer that approximates the TFLite converter's behaviour.
Also, I think fixing the fractional bit widths is not the best option; it is better to quantize adaptively depending on the actual weights. At least that is what I found in my analysis of the TFLite converter.
Best regards,
asti205
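The adaptive scheme described above can be sketched in a few lines. This mirrors the spirit of TFLite's per-tensor int8 post-training quantization, where the scale is derived from the observed weight range rather than a fixed fractional bit width; it is an illustration, not TFLite's exact algorithm:

```python
def adaptive_quantize(weights, bits=8):
    # Derive the scale from the actual weight range (adaptive), instead of
    # fixing the fractional bit width up front.
    qmax = 2 ** (bits - 1) - 1                    # 127 for int8
    scale = max(abs(w) for w in weights) / qmax   # adapt to the data
    ints = [max(-qmax - 1, min(qmax, round(w / scale))) for w in weights]
    return ints, scale

ints, scale = adaptive_quantize([0.5, -1.0, 0.25])
# Dequantized values i * scale approximate the original weights.
```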
Hey, I want to know: on Windows 10, pip install qkeras gives me an error. What should I do to install the qkeras package?
Hi @Sejudyblues thank you for the question!
I assume pip could find the package via pip install qkeras, right?
Could you try cloning the repo and running python setup.py install to install the package? By the way, we have not tested it on Windows 10.
Since it has been quiet for a while, I am closing this issue. Feel free to reopen it.
Hi, I am looking for a tool for post-training quantization with custom options (like bit width). I use TensorFlow and Keras for my project: I trained a model and pruned it with tfmot, and now I want to quantize it to a full-integer model. Since retraining would change the sparsity of the model, I don't want to use quantization-aware training. The other option is TFLite, but it doesn't support quantizing with custom options, which is how I found this project.
Can I use this project to solve my problem? I also learned that full-integer quantization needs a representative dataset to calibrate the activation quantization. Is there any function in this project that can do that? Or could I change the source code to eliminate back-propagation and use "training" only to collect the parameters for activation quantization?
Thanks!
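The calibration step mentioned in the question, using a representative dataset to set activation quantization parameters with forward passes only and no back-propagation, can be sketched like this. It is a generic illustration, not a function qkeras provides:

```python
def calibrate_activation_scale(activation_batches, bits=8):
    # Record the activation range observed over a representative dataset
    # (forward passes only) and derive a symmetric quantization scale.
    lo = min(min(batch) for batch in activation_batches)
    hi = max(max(batch) for batch in activation_batches)
    qmax = 2 ** (bits - 1) - 1
    return max(abs(lo), abs(hi)) / qmax

# e.g. collect a layer's activations batch by batch, then:
scale = calibrate_activation_scale([[-0.5, 0.25], [0.75, -0.1]])
```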
Related Issues (20)
- Very low accuracy following AutoQKeras notebook and CUDA error
- How low precision weights and biases are stored in QKeras?
- When I use QKeras: Failed to load in-memory CUBIN: CUDA_ERROR_NO_BINARY_FOR_GPU: no kernel image is available for execution on the device [Op:Abs]
- Params not quantized after model_save_quantized_weights function
- Only Qconv layer's output tensors are quantized
- Cannot convert 6.0 to EagerTensor of dtype int64
- How do I save an AutoQKeras model that a different script can load?
- `pyparser` vs `pyparsing`
- Can QKeras support Full integer quantization
- Add a custom layer with a bitwise operation
- TFLite compatibility
- How can I get the scale of the QAdaptiveActivation layer
- How do keras and qkeras versions correspond to each other?
- Difference between Qkeras model and Keras model
- Adding QKeras to conda forge
- TypeError: Could not locate class 'QConv2D'.
- Attribute `__name__` missing from QActivation layer
- Unpredictable quantization with quantized_bits and an alpha value of None
- QKeras fails due to missing modules and numpy error message with latest TensorFlow version 2.16.1
- Upgrade to keras v3?