A Preliminary Study of Alibaba's Inference Framework MNN

MNN is a powerful inference framework for deep learning models. Its purpose is to accelerate inference so that larger models can be deployed and run on devices such as mobile phones.

The environment needs to be prepared before use.

Here I am using it on Linux.

1. Environment preparation

git clone https://github.com/alibaba/MNN.git
cd MNN
cd schema && ./generate.sh
mkdir build && cd build
cmake -DMNN_BUILD_DEMO=ON ..
make -j8
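
If the build succeeds, the demo programs used later (such as segment.out and multiPose.out) should appear in the build directory. A quick way to check:

ls *.out   # run inside the build directory; should list segment.out, multiPose.out, etc.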

If you need to use the commands globally, you can set the environment variable yourself; I did this for my own setup, as shown below.
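
A minimal sketch, assuming the repository was cloned to /opt/MNN (adjust the path to wherever you actually cloned it):

export PATH=/opt/MNN/build:$PATH                          # make the demo binaries available in this shell
echo 'export PATH=/opt/MNN/build:$PATH' >> ~/.bashrc      # persist it for future shells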

Note that this only works with CMake 3.0 and above; if you need to install or upgrade it, refer to the linked guide (link), which I have tested myself.
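
You can check the installed version first:

cmake --version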

Once the environment is ready, we can test it.

Before using it, note that MNN cannot directly load models trained and saved with frameworks such as TensorFlow, Caffe, or TFLite; they must be converted to the MNN format first.

MNN provides a Python-based converter.

It needs to be installed in advance:

pip install MNN
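
To quickly confirm the installation worked (a minimal check; the pip package provides both the Python module and the mnnconvert command-line tool used below):

python -c "import MNN"   # should exit without error
which mnnconvert         # the converter CLI installed with the package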

2. Model conversion and use

1. Converting a TFLite model file

This is an image segmentation model

After downloading the model, you can convert it. I placed the model under the path /opt/modeldata/

mnnconvert -f TFLITE --modelFile /opt/modeldata/deeplabv3_257_mv_gpu.tflite --MNNModel deeplabv3_257_mv_gpu.mnn

Test it:

segment.out deeplabv3_257_mv_gpu.mnn /opt/images/000000000139.jpg output.jpg

Result: the segmentation output is written to output.jpg.

2. Converting a TensorFlow model file

This is a pose estimation model.

mnnconvert -f TF --modelFile /opt/modeldata/model-mobilenet_v1_075.pb --MNNModel model-mobilenet_v1_075.mnn

Test it:

multiPose.out model-mobilenet_v1_075.mnn /opt/images/123.jpg output1.jpg

There is not much change in file size before and after model conversion; you can compare them as shown below.
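
A simple way to compare, assuming the converted .mnn files were written to the current working directory (as in the conversion commands above):

ls -lh /opt/modeldata/deeplabv3_257_mv_gpu.tflite deeplabv3_257_mv_gpu.mnn
ls -lh /opt/modeldata/model-mobilenet_v1_075.pb model-mobilenet_v1_075.mnn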

So far, this shows that inference can be carried out successfully with the converted models.

Main reference: https://www.yuque.com/mnn/cn/demo_project


Origin: blog.csdn.net/zhou_438/article/details/108869477