
DNNLibrary


Run ONNX models on your Android phone using the new NNAPI!

Android 8.1 introduces Neural Networks API (NNAPI). It's very exciting to run a model in the "native" way supported by Android System. :)

DNNLibrary is a wrapper of NNAPI ("DNNLibrary" stands for "daquexian's NNAPI library"). It lets you easily make use of the new NNAPI introduced in Android 8.1. You can convert your ONNX model into daq format and run the model directly.

For the Android app example, please check out dnnlibrary-example.

Telegram Group: link, QQ Group (Chinese): 948989771, answer: 哈哈哈哈

Screenshot

This screenshot shows ResNet-18:

Screenshot image resnet

This screenshot shows LeNet:

Screenshot camera mnist

Preparation

Clone this repo and submodules:

git clone --recursive https://github.com/daquexian/DNNLibrary

Please make sure the Android system on your phone is 8.1 or higher, or use an 8.1+ emulator.

Introduction

Android 8.1 introduces NNAPI. However, NNAPI is not designed to be used by normal Android developers directly, so I wrapped it into a library.

With DNNLibrary it's extremely easy to deploy your ONNX model on an Android 8.1+ phone. For example, the following is the Java code to deploy MobileNet v2 in your app (please check out daq-example for details):

ModelBuilder modelBuilder = new ModelBuilder();
Model model = modelBuilder.readFile(getAssets(), "mobilenetv2.daq")
                        .setOutput("mobilenetv20_output_pred_fwd") // The output name is from the onnx model
                        .compile(ModelBuilder.PREFERENCE_FAST_SINGLE_ANSWER);

float[] result = model.predict(inputData);

Only five lines! The daq model file is generated from a pretrained ONNX model using onnx2daq.
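The `inputData` array passed to `predict` must match the model's expected input layout. As a hedged sketch (the 224×224 NCHW float layout and the 0–1 normalization below are assumptions taken from common MobileNet v2 exports, not from DNNLibrary itself; check your own model), packing ARGB pixels into the input array and reading the top class from `result` could look like this:

```java
// Sketch: pack ARGB pixel ints (e.g. from Bitmap.getPixels) into an NCHW
// float array, and find the index of the highest score in the output.
// The 224x224 shape and /255 normalization are assumptions for MobileNet v2.
public class PrePost {
    static final int W = 224, H = 224;

    // pixels: W*H ARGB ints, row-major
    static float[] toNchw(int[] pixels) {
        float[] out = new float[3 * H * W];
        for (int i = 0; i < H * W; i++) {
            int p = pixels[i];
            out[i]             = ((p >> 16) & 0xFF) / 255f; // R plane
            out[H * W + i]     = ((p >> 8) & 0xFF) / 255f;  // G plane
            out[2 * H * W + i] = (p & 0xFF) / 255f;         // B plane
        }
        return out;
    }

    // Index of the highest score in the model's output vector.
    static int argmax(float[] scores) {
        int best = 0;
        for (int i = 1; i < scores.length; i++)
            if (scores[i] > scores[best]) best = i;
        return best;
    }
}
```

Usage would then be `float[] inputData = PrePost.toNchw(pixels);` before `predict`, and `int label = PrePost.argmax(result);` after it.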

Convert the model

After the cloning step listed in the Preparation section, run

mkdir build
cd build
cmake ..
cmake --build .

Now onnx2daq is in the tools/onnx2daq directory. Use the following command to convert an ONNX model to a daq model:

./tools/onnx2daq/onnx2daq <onnx model> <output filename>

For example, if you have a model named "mobilenetv2.onnx" in your current directory:

./tools/onnx2daq/onnx2daq mobilenetv2.onnx mobilenetv2.daq

Usage

If you are an Android app developer and want it to work out of the box

Welcome! It has been published on jcenter.

Add

implementation 'me.daquexian:dnnlibrary:0.5.3'

(for Gradle 3.0+),

or

compile 'me.daquexian:dnnlibrary:0.5.3'

(for Gradle lower than 3.0)

in your app's build.gradle's dependencies section.

If you don't care about Android app

We use CMake as the build system, so you can build it like most C++ projects. The only difference is that you need the Android NDK (r17b or higher):

mkdir build && cd build
cmake -DCMAKE_SYSTEM_NAME=Android -DCMAKE_ANDROID_NDK=<path of android ndk> -DCMAKE_ANDROID_ARCH_ABI=arm64-v8a -DCMAKE_ANDROID_NDK_TOOLCHAIN_VERSION=clang -DCMAKE_ANDROID_STL_TYPE=c++_static -DCMAKE_SYSTEM_VERSION=<Android API level, 27 or 28> ..
cmake --build .

Then you will get the binary files.

But TensorFlow Lite also supports NNAPI...

Yes, but its support for NNAPI is far from perfect. For example, dilated convolution (which is widely used in segmentation) is not supported (https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/lite/nnapi_delegate.cc#L458).

What's more, only models trained in TensorFlow can easily be converted to the TF Lite format. Since NNAPI is independent of any framework, we support ONNX, a framework-independent model format.

|                        | TF Lite                                                  | DNNLibrary                  |
|------------------------|----------------------------------------------------------|-----------------------------|
| Supported model format | TensorFlow                                               | ONNX                        |
| Dilated convolution    |                                                          | ✔️                          |
| Ease of use            | (Bazel build system, not friendly to Android developers) | ✔️                          |
| Quantization           | ✔️                                                       | (WIP, plan to base on this) |

However, DNNLibrary is also far from mature compared to TF Lite. At least it is another choice if you want to enjoy the power of NNAPI :)

Benchmark

Google Pixel, Android 9.0

| model           | time      |
|-----------------|-----------|
| MobileNet v2    | 132.95 ms |
| SqueezeNet v1.1 | 80.80 ms  |

More benchmarks are welcome!
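Timings like these can be reproduced with a simple wall-clock harness around `model.predict`. A minimal sketch (the function under test is a stand-in for any `float[] -> float[]` call; the warmup/run counts are arbitrary choices, not DNNLibrary requirements):

```java
import java.util.function.UnaryOperator;

// Minimal benchmarking harness: average wall-clock time of fn over `runs`
// timed iterations, after `warmup` untimed calls (to let drivers warm up).
public class Bench {
    static double avgMillis(UnaryOperator<float[]> fn, float[] input,
                            int warmup, int runs) {
        for (int i = 0; i < warmup; i++) fn.apply(input); // untimed warmup
        long start = System.nanoTime();
        for (int i = 0; i < runs; i++) fn.apply(input);
        return (System.nanoTime() - start) / 1e6 / runs;  // ms per run
    }
}
```

In an app you would call it as `double ms = Bench.avgMillis(in -> model.predict(in), inputData, 3, 20);`.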

About Caffe model support

The old DNNLibrary supported Caffe models via dnntools; however, they are no longer supported directly, and the models generated by dnntools are no longer usable either. Please use a conversion tool like MMdnn to convert your Caffe model to ONNX, then convert it to daq using onnx2daq.

dnnlibrary's People

Contributors

daquexian

