The secret to no overtime: how to quickly integrate NCNN with AoE

As the designated hair-keeper among us three programmers,

my hair has always fought a multi-threaded battle on the front line of overtime.

I keep asking myself the soul-searching question:

young man, how can you not work overtime?

I may not have a girlfriend,
but at least I have my code.

What I cannot figure out is the guy at the next desk: he gets in later than I do, leaves earlier than I do every single day, has an almost-girlfriend, does his work well, and his hair is doing fine too. Apart from looking noticeably older than me, doesn't he have some holy grail? So, taking advantage of a lunch break and at the cost of a week's worth of coffee, I wheedled his holy grail out of him. With this secret in hand, maybe I can harvest both career and love.

 

The shortcomings of integrating NCNN directly

Integrating NCNN directly wore this old boy out. I remember integrating it with tears in my eyes, while the guy next door was dabbing SK-II onto his girlfriend's face (no, you get neither, not the SK-II and not the girlfriend). Take SqueezeNet as an example: to access NCNN, the model files, the NCNN headers and libraries, the JNI calls, and the pre-processing and post-processing business logic all had to be placed inside the SqueezeNet Sample project. The problems with this straightforward integration are obvious: it is tightly coupled to the business and has no reusability, and both the pre/post-processing code and SqueezeNcnn exist only for this Sample, so other business components cannot easily reuse them. Think about it for a moment: if we treat each AI capability as a separate AI component offered as a service to other developers, this is what happens:

Every component has to bundle the NCNN dependency and libraries, and the developer of every component has to get familiar with the NCNN interface, write the C calling code, and write the JNI glue. So we naturally thought of extracting NCNN into a component of its own. After the extraction, things look a lot more pleasing to the eye, roughly like this:

 

The NCNN component in the AoE SDK

With the AoE SDK, my moves are as fierce as a tiger! The open-source AoE SDK provides an NCNN component. Let's talk about it from four angles:

      ● The design of the NCNN component

      ● Reworking the SqueezeNet Sample

      ● How an application accesses the NCNN component

      ● Some thoughts on the NCNN component

 

★ Design of the NCNN component

If you don't understand the component's design, then no matter how tiger-fierce your moves are, you may only score two out of five. So what is this component, exactly? The NCNN component is designed as a component that contains no concrete business logic; it only wraps the calls to the NCNN interface. The concrete business logic is implemented by the business side, outside the component. For the interface definition and design we referred to the TF Lite source code and its interface design. The externally callable interface currently looks like this:

// load the model and params
void loadModelAndParam(...)
// whether initialization succeeded
boolean isLoadModelSuccess()
// input rgba data
void inputRgba(...)
// run inference
void run(...)
// run inference with multiple inputs and outputs
void runForMultipleInputsOutputs(...)
// get the inference result
Tensor getOutputTensor(...)
// close and clean up memory
void close()
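To make the call order concrete, here is a minimal, hypothetical Java stub that only mirrors the method names above; it is not AoE's real implementation, and the signatures are simplified for illustration.

```java
import java.nio.ByteBuffer;

// Hypothetical, simplified stand-in for the AoE ncnn Interpreter,
// used only to illustrate the expected call sequence.
class FakeInterpreter {
    private boolean loaded;

    void loadModelAndParam(String dir, String modelName) {
        // the real code would load the .param.bin / .bin files through JNI
        loaded = true;
    }

    boolean isLoadModelSuccess() {
        return loaded;
    }

    void inputRgba(ByteBuffer rgba, int width, int height) {
        // the real code would hand the pixel buffer to ncnn through JNI
    }

    void run() {
        // the real code would execute the ncnn network
    }

    void close() {
        // the real code would free the native memory
        loaded = false;
    }
}
```

A caller would load the model, check `isLoadModelSuccess()`, feed input, run, and finally close to release native memory.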

Allow witty me to show off: the component's directory structure looks like this:

├── AndroidManifest.xml
├── cpp
│   └── ncnn
│       ├── c_api_internal.h
│       ├── include
│       ├── interpreter.cpp
│       ├── Interpreter.h
│       ├── jni_util.cpp
│       ├── jni_utils.h
│       ├── nativeinterpreterwrapper_jni.cpp
│       ├── nativeinterpreterwrapper_jni.h
│       ├── tensor_jni.cpp
│       └── tensor_jni.h
├── java
│   └── com
│       └── didi
│           └── aoe
│               └── runtime
│                   └── ncnn
│                       ├── Interpreter.java
│                       ├── NativeInterpreterWrapper.java
│                       └── Tensor.java
└── jniLibs
    ├── arm64-v8a
    │   └── libncnn.a
    └── armeabi-v7a
        └── libncnn.a

 

    ● Interpreter: exposed to external callers; provides model loading, inference, and related methods.

    ● NativeInterpreterWrapper: the concrete implementation class, which makes the calls into native code.

    ● Tensor: mainly handles the data interaction with the native layer.

 

Use AoE NCNN well, finish your tasks early; the secret lies here:

     ● Support for multiple inputs and multiple outputs.

     ● Use of ByteBuffer to improve efficiency.

     ● Use of Object as input and output (currently ByteBuffer and multidimensional arrays are supported).

 

Enough lip service; let me walk you through how AoE NCNN actually implements each of these.

 

  How multiple inputs and multiple outputs are supported

To support multiple inputs and multiple outputs, we create a list of Tensor objects in the native layer; each Tensor object stores its own input or output data. The native-layer Tensor objects are exposed to the Java layer through tensor_jni, and the Java layer keeps the "pointer" addresses of the native-layer tensors. With multiple inputs and outputs, as long as you get hold of the corresponding Tensor from the list, you can manipulate its data.
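The "pointer handle" idea can be sketched in plain Java. In the sketch below the native memory is mocked with a map; in the real runtime the handles come from tensor_jni and point to C++ objects, so all names here are hypothetical.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical simulation of the handle pattern: the Java layer keeps only
// opaque "pointer" handles, while the actual storage lives on the native
// side (mocked here with a HashMap).
public class TensorRegistry {
    private final Map<Long, float[]> nativeStore = new HashMap<>(); // stands in for C++ memory
    private long nextHandle = 1;
    private final List<Long> handles = new ArrayList<>();

    // "Allocate" a native tensor and hand its handle back to the Java layer.
    public long createTensor(float[] data) {
        long handle = nextHandle++;
        nativeStore.put(handle, data.clone());
        handles.add(handle);
        return handle;
    }

    // Read the data behind a handle, as the Java layer would via JNI.
    public float[] read(long handle) {
        return nativeStore.get(handle);
    }

    public int tensorCount() {
        return handles.size();
    }
}
```

Because each input and each output gets its own handle, adding more inputs or outputs is just adding more entries to the list.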

 

  Use of ByteBuffer

ByteBuffer, a byte buffer, is more efficient to work with than a conventional byte array.
DirectByteBuffer uses off-heap memory, eliminating one copy of the data between the Java heap and native memory, and is more efficient still than a plain ByteBuffer.

 

Of course, how to use ByteBuffer is not our focus here; let's talk about the benefits ByteBuffer brings us:
1. Its interface makes byte operations convenient: putInt, getInt, putFloat, getFloat, flip, and a series of other methods make it easy to manipulate the data.
2. When interacting with the native layer, using a DirectByteBuffer improves efficiency. You can simply think of it as a piece of "shared" memory that the Java layer and the native layer operate on directly, removing the intermediate byte-copy step.
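A small, self-contained example of the ByteBuffer operations mentioned above (pure JDK, nothing AoE-specific): pack floats into an off-heap buffer, flip, and read them back, which is the same pattern a caller would use to feed tensor data toward native code.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class DirectBufferDemo {
    // Pack the given floats into a direct (off-heap) buffer,
    // then read them back and return their sum.
    static float sumFloats(float[] values) {
        ByteBuffer buf = ByteBuffer.allocateDirect(values.length * 4)
                                   .order(ByteOrder.nativeOrder()); // match the native byte order
        for (float v : values) {
            buf.putFloat(v);
        }
        buf.flip(); // switch the buffer from writing mode to reading mode
        float sum = 0f;
        while (buf.hasRemaining()) {
            sum += buf.getFloat();
        }
        return sum;
    }
}
```

Setting the byte order to `ByteOrder.nativeOrder()` matters when native code will read the same memory; the JVM's default is big-endian, which may differ from the platform.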

 

  How Object is used as input and output
Currently only ByteBuffer and MultiDimensionalArray are supported. At run time, if the input is a ByteBuffer, we check whether it is a direct buffer and apply the corresponding read and write operations. If it is a MultiDimensionalArray, we read and write the data according to its data type (e.g. int, float) and its dimensions.
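The type dispatch described above might look roughly like this. This is a hypothetical helper, not the component's actual code; it only classifies the input the way the runtime must before choosing a read/write path.

```java
import java.nio.ByteBuffer;

public class InputDispatch {
    // Classify an input Object before deciding how to read/write its data.
    static String describe(Object input) {
        if (input instanceof ByteBuffer) {
            return ((ByteBuffer) input).isDirect() ? "direct ByteBuffer" : "heap ByteBuffer";
        }
        Class<?> c = input.getClass();
        int dims = 0;
        while (c.isArray()) {          // walk down e.g. float[][] -> float[] -> float
            dims++;
            c = c.getComponentType();
        }
        if (dims > 0) {
            return dims + "-d " + c.getSimpleName() + " array";
        }
        throw new IllegalArgumentException("unsupported input type: " + c.getName());
    }
}
```

The element type and dimension count recovered here are what let the runtime pick, say, a float read loop for a `float[][]` versus an int one for an `int[]`.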

 

  Reworking the SqueezeNet Sample

After integrating the AoE NCNN component, SqueezeNet simply depends on the NCNN module. The SqueezeNet Sample now contains only the model files and the pre-processing and post-processing business logic; the pre/post-processing can be implemented in Java or in C, as each concrete business decides. The new code structure is very simple; the directory is as follows:

├── AndroidManifest.xml
├── assets
│   └── squeeze
│       ├── model.config
│       ├── squeezenet_v1.1.bin
│       ├── squeezenet_v1.1.id.h
│       ├── squeezenet_v1.1.param.bin
│       └── synset_words.txt
└── java
    └── com
        └── didi
            └── aoe
                └── features
                    └── squeeze
                        └── SqueezeInterpreter.java

↑ This Sample can also serve as a reference for other business AI components that call NCNN.
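As a rough sketch of what "SqueezeNet depends on the NCNN module" means in build terms, the Sample's Gradle file might declare something like the following. The module path is hypothetical; check the AoE repository for the actual module name and coordinates.

```groovy
dependencies {
    // Hypothetical module path; the real coordinates depend on how the
    // AoE repo names its ncnn runtime module.
    implementation project(':aoe-runtime-ncnn')
}
```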

(I digress; back to business.)

  How an application accesses the NCNN component

There are two ways to access the NCNN component:

     ● Direct access

     ● Access through the AoE SDK

▲ A comparison of the two access modes:

No battle needed; I unilaterally declare the AoE SDK the winner!

 

  Summary and thoughts on the NCNN component

With the NCNN component wrapper, integrating NCNN into a business is now faster and more convenient. Previously, integrating NCNN into a new business might take half a day to a full day; with the AoE NCNN component it may take only one to two hours. Of course, the NCNN component still has many imperfections, and we need to learn about and understand NCNN more deeply. We will keep going, continuing to rework and optimize the NCNN component as we learn.

 - - - - - - - - - - - - - - - - - - - - - - - - - - - A o E - - - - - - -  - - - - - - - - - - - - - - - - - - - -

 

Original writing isn't easy; tips are welcome.

                  https://github.com/didi/AoE ← rumor has it that programmers who clicked here all started leaving work on time

 

 


Origin: www.cnblogs.com/didichuxing/p/11910831.html