adlik / adlik
Adlik: Toolkit for Accelerating Deep Learning Inference
License: Apache License 2.0
Adlik performs compression and optimization on models, but ONNX Runtime, Baidu's PaddleSlim, and others offer similar functionality. What is the difference between them?
Support benchmark tests:
Add the following CI checks:
Adlik should support optimizing models trained with mainstream deep learning frameworks, such as TensorFlow checkpoints.
The optimization techniques include:
Consider two scenarios:
When the model compiler creates config.pbtxt, if the batch (N) dimension of the input and output shapes is a specific value rather than None, the generated config.pbtxt cannot be used for inference.
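For reference, a sketch of a config.pbtxt whose batch dimension is left variable (-1) instead of being fixed. The model name, tensor names, and shapes here are illustrative, and the field names follow the TensorRT Inference Server-style ModelConfig schema that Adlik's serving config resembles; the exact schema may differ:

```protobuf
name: "resnet50"
platform: "tensorflow"
input {
  name: "input"
  data_type: DT_FLOAT
  dims: [-1, 224, 224, 3]  # -1 keeps the batch (N) dimension variable
}
output {
  name: "probs"
  data_type: DT_FLOAT
  dims: [-1, 1000]
}
```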
The ServingLite runtime framework should support running deep learning models on FPGAs.
1. 'graph conversoin' should be 'graph conversion'.
2. 'OP combinaton' should be 'OP combination'.
For a model with three versions (1, 2, and 3), adlik_serving loads version 2 instead of version 3 at startup.
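The expected behavior, presumably, is to serve the highest numeric version directory. A minimal sketch of that selection policy (the function name is hypothetical, not Adlik's actual code):

```python
def latest_version(version_dirs):
    """Given the subdirectory names of a model repository
    (e.g. os.listdir(model_dir)), return the highest numeric
    version, or None if no numeric version directory exists.
    Non-numeric entries such as config files are ignored."""
    versions = [int(d) for d in version_dirs if d.isdigit()]
    return max(versions) if versions else None
```

With directories 1, 2, and 3 present, this policy picks 3.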
Different runtimes have their own definitions of the model config.
"" Build serving with OpenVINO runtime
Install intel-openvino-ie-rt-core package from OpenVINO.""
Is a installation process or a building process?
The ML runtime should support HTTP requests.
The introduction section of Adlik should add more description of application scenarios and of its advantages over other platforms.
When I compile an ONNX model to an OpenVINO or TensorRT model, the error is as follows:
TypeError: expected str, bytes or os.PathLike object, not NoneType {'status': 'failure', 'error_msg': 'expected str, bytes or os.PathLike object, not NoneType'}
But there is an ONNX model in the input_model directory.
When I use TensorRT 7 to compile the ONNX model, a parser error is raised.
Users usually read the model metadata described in "model.pbtxt" to construct inputs before prediction. But prediction will fail if the information in "model.pbtxt" is not consistent with the model representation.
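One way to fail fast on such inconsistencies is to validate request tensors against the declared metadata before running prediction. A minimal sketch under assumed data structures (plain dicts mapping tensor name to shape tuple; none of these names come from Adlik's API):

```python
def validate_inputs(meta_inputs, request_inputs):
    """Check request tensors against metadata declarations.

    Both arguments map tensor name -> shape tuple. A None in a
    metadata dimension means 'any size' (e.g. a variable batch).
    Returns a list of human-readable errors; empty means valid.
    """
    errors = []
    for name, meta_shape in meta_inputs.items():
        if name not in request_inputs:
            errors.append(f"missing input: {name}")
            continue
        shape = request_inputs[name]
        if len(shape) != len(meta_shape) or any(
            m is not None and m != s for m, s in zip(meta_shape, shape)
        ):
            errors.append(f"shape mismatch for {name}: got {shape}, expected {meta_shape}")
    return errors
```

Rejecting the request with a clear message is friendlier than a failure deep inside the runtime.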
When I compile a model to the OpenVINO format, the error is as follows:
AttributeError: module 'tensorflow' has no attribute 'NodeDef'
Add markdownlint to CI for checking Markdown files.
The function “exit” does not belong to the list of async-signal-safe functions.
I guess that a different program design will be needed for your function “handler”.
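The usual fix is for the signal handler to do nothing but set a flag (a `volatile sig_atomic_t` in C/C++) and let the main loop perform the actual shutdown. A Python sketch of the same pattern for illustration, since Adlik's server code is C++; the names are hypothetical:

```python
import signal

shutdown_requested = False

def handler(signum, frame):
    # Do the minimum possible here: just record the request.
    # In C/C++, calling exit(), malloc(), or stdio functions from a
    # signal handler is not async-signal-safe.
    global shutdown_requested
    shutdown_requested = True

signal.signal(signal.SIGTERM, handler)

signal.raise_signal(signal.SIGTERM)  # simulate receiving SIGTERM
# The main loop then checks the flag and shuts down cleanly:
if shutdown_requested:
    pass  # release resources, then return from main()
```

Deferring the teardown to the main loop avoids re-entering non-reentrant code from the handler.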
Monitor the server's status, including power-on status and running status.
Support redirecting logs to a file.
Add benchmark test result with tf lite runtime.
The model is ResNet-50.
Support querying model information, including version, path, whether it is activated, and so on.
Running a TensorRT model failed because some CUDA-related code was not compiled into the binary.
Support updating a model (without activating it) via the gRPC/HTTP interface.
The TensorFlow runtime doesn't enable any batch scheduler, so it can't batch multiple requests into one.
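Roughly, a batch scheduler holds the first request briefly and merges requests that arrive within a short window, up to a maximum batch size, so one forward pass serves several clients. A minimal sketch of that policy (names, window, and batch size are illustrative, not Adlik's or TensorFlow's actual scheduler):

```python
import queue
import time

def collect_batch(requests, max_batch=4, window_s=0.005):
    """Block for one request from a queue.Queue, then gather more
    for up to window_s seconds or until max_batch is reached."""
    batch = [requests.get()]                 # wait for the first request
    deadline = time.monotonic() + window_s
    while len(batch) < max_batch:
        remaining = deadline - time.monotonic()
        try:
            batch.append(requests.get(timeout=max(remaining, 0)))
        except queue.Empty:
            break                            # window elapsed; run with what we have
    return batch
```

A worker thread would call this in a loop, stack the batched inputs, run one inference, then scatter the outputs back to the waiting clients.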
You can build Adlik inside the Docker image.
The image is not a runtime image, so you can't compile Adlik inside it. There are two ways to use the image:
Related: #152.
There is a bug at line 326 of model_loader.py in model_compiler:
if len(self.input_formats) < len(self.input_names):
    self.input_formats.extend([None for _ in range(len(self.input_formats), len(self.input_formats))])
The range is empty, so nothing is ever padded; the second bound should be len(self.input_names).
The second line should start with “Model Compiler”.
Currently, there are some bad designs in the model compiler. We should come up with a new design.
After all of Adlik's packages are installed and the build is complete, there should be a sample test to verify that Adlik was installed successfully, similar to the "Hello, world!" test of other software.
When I use the Docker image built by the ci/docker/build.sh file to compile an OpenVINO model, the error message is as follows:
{'status': 'failure', 'error_msg': 'mo.py does not exist, path: /opt/intel/openvino_2019.3.344/deployment_tools/model_optimizer/mo.py'}
The "ModelOperateImpl::activateModel" function is too long and into chaos
What is the meaning of “Adlik”? How is it pronounced? Add this information to the README.md file.
Support activating a specific model (or version) via the gRPC/HTTP interface.
Complete the model compiler scenario.
The model compiler should support compiling deep learning models to run on FPGAs.