Install MNN deep learning framework on Jetson Nano

1. Introduction

This page guides you through installing Alibaba's MNN framework on a Jetson Nano. The latest version of the MNN framework also supports CUDA, which makes it well suited to the Jetson Nano as a lightweight framework. The C++ code examples were written in the Code::Blocks IDE on the Nano. We only cover the basics, so that you can eventually build your own application. For more information on the MNN library, see the MNN documentation. Note that we install the C++ version here, not the Python bindings.

2. Dependencies

The MNN framework has a few dependencies. It requires protobuf. OpenCV was used to build the C++ examples, but MNN itself does not require OpenCV. The steps below install MNN on a Jetson Nano running the Linux for Tegra (L4T) operating system.

# check for updates
$ sudo apt-get update
$ sudo apt-get upgrade
# install dependencies
$ sudo apt-get install cmake wget
$ sudo apt-get install libprotobuf-dev protobuf-compiler
$ sudo apt-get install libglew-dev

Before compiling the MNN software, there is one thing left to do. The MNN Vulkan interface uses the OpenGL ES 3.0 library, a low-level graphics rendering interface for Android. Fortunately, it is backward compatible with version 2.0 of the library found in JetPack 4.4 on the Jetson Nano, and to the best of our knowledge the MNN framework does not use any calls unique to version 3.0. This makes it possible to redirect libGLESv3 to libGLESv2 with a symlink. The strategy works well and frees you from the tedious process of installing version 3.0.

# make symlink
$ sudo ln -s /usr/lib/aarch64-linux-gnu/libGLESv2.so /usr/lib/aarch64-linux-gnu/libGLESv3.so
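If you want to convince yourself that such a redirect behaves as expected before touching system folders, the mechanism can be tried in a scratch directory first (the paths below are throwaway examples; on the Nano the real pair is the libGLESv* path above):

```shell
# try the symlink redirect in a scratch directory first
$ tmp=$(mktemp -d)
$ touch "$tmp/libGLESv2.so"
$ ln -s "$tmp/libGLESv2.so" "$tmp/libGLESv3.so"
# readlink -f resolves the new name to the real library
$ readlink -f "$tmp/libGLESv3.so"
$ rm -rf "$tmp"
```

The same `readlink -f` check on `/usr/lib/aarch64-linux-gnu/libGLESv3.so` confirms the redirect after you create it.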

3. Install

Once the dependencies are installed, it's time to build the library.

# download MNN
$ git clone https://github.com/alibaba/MNN.git
# common preparation (installing the flatbuffers)
$ cd MNN
$ ./schema/generate.sh
# install MNN
$ mkdir build
$ cd build
# generate build script
$ cmake -D CMAKE_BUILD_TYPE=Release \
        -D MNN_BUILD_QUANTOOLS=ON \
        -D MNN_BUILD_CONVERTER=ON \
        -D MNN_OPENGL=ON \
        -D MNN_VULKAN=ON \
        -D MNN_CUDA=ON \
        -D MNN_TENSORRT=OFF \
        -D MNN_BUILD_DEMO=ON \
        -D MNN_BUILD_BENCHMARK=ON ..


Time to build the library and install it into the appropriate folder.

# build MNN (± 25 min)
$ make -j4
$ sudo make install
$ sudo cp ./source/backend/cuda/*.so /usr/local/lib/
# don't copy until MNN has solved the issues with the TensorRT backend
# $ sudo cp ./source/backend/tensorrt/*.so /usr/local/lib/
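After copying the backend libraries into /usr/local/lib by hand, it can help to refresh the dynamic linker cache so they are found at runtime. This is a standard Linux step, not specific to MNN:

```shell
# refresh the shared library cache
$ sudo ldconfig
# confirm the linker now sees the MNN libraries
$ ldconfig -p | grep -i mnn
```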


If all goes well, you now have the MNN headers in /usr/local/include/MNN and the libraries in /usr/local/lib on your Jetson Nano.

Also note the build folder, which now contains the compiled demo and benchmark executables.

If you want to download some example deep learning models, you can use the command below.

# download some models
$ cd ~/MNN
$ ./tools/script/get_model.sh
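As a starting point for your own app, here is a minimal sketch of MNN's C++ inference API. The model filename, the dummy input, and the choice of the CUDA forward type are assumptions for illustration; adapt them to one of the models downloaded above.

```cpp
// Minimal MNN inference sketch.
// Assumption: a .mnn model downloaded by get_model.sh, renamed/copied next to the binary.
#include <MNN/Interpreter.hpp>
#include <MNN/Tensor.hpp>
#include <cstdio>
#include <cstring>
#include <memory>
#include <vector>

int main() {
    // load the model and build an execution session on the CUDA backend
    std::shared_ptr<MNN::Interpreter> net(
        MNN::Interpreter::createFromFile("mobilenet_v2.mnn"));  // path is an assumption
    if (!net) { printf("could not load model\n"); return 1; }
    MNN::ScheduleConfig config;
    config.type = MNN_FORWARD_CUDA;   // use MNN_FORWARD_CPU if CUDA is unavailable
    auto session = net->createSession(config);

    // copy input data (e.g. a preprocessed image) into the input tensor
    MNN::Tensor* input = net->getSessionInput(session, nullptr);
    std::vector<float> image(input->elementSize(), 0.0f);  // dummy data for this sketch
    MNN::Tensor hostInput(input, input->getDimensionType());
    std::memcpy(hostInput.host<float>(), image.data(), image.size() * sizeof(float));
    input->copyFromHostTensor(&hostInput);

    // run the network and read back the output scores
    net->runSession(session);
    MNN::Tensor* output = net->getSessionOutput(session, nullptr);
    MNN::Tensor hostOutput(output, output->getDimensionType());
    output->copyToHostTensor(&hostOutput);
    printf("output elements: %d\n", hostOutput.elementSize());
    return 0;
}
```

In Code::Blocks, link against the installed library (for instance `g++ main.cpp -o demo -lMNN`); the `copyFromHostTensor`/`copyToHostTensor` pair is what moves data between host memory and the backend, which matters once the session runs on CUDA.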

4. Benchmarking

With the new CUDA backend, it is interesting to see how well MNN performs. In our benchmarks, MNN ran on average about 40% faster when using CUDA. Note that the Jetson Nano CPU was overclocked to 2014.5 MHz and the GPU to 998.4 MHz during testing.
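If you want to reproduce such a comparison, the benchmark tool built above can be run from the build folder. The argument order varies between MNN versions (run ./benchmark.out without arguments to see the usage for yours), and the forward type follows the MNNForwardType enum, where 0 is CPU and, in the versions we checked, 2 is CUDA; treat these numbers as assumptions to verify:

```shell
# run the bundled benchmark models on the CPU, 10 loops
$ cd ~/MNN/build
$ ./benchmark.out ../benchmark/models 10 0
# same models on the CUDA backend (forward type 2; check MNNForwardType in your version)
$ ./benchmark.out ../benchmark/models 10 2
```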

References

https://qengineering.eu/install-mnn-on-jetson-nano.html


Origin blog.csdn.net/weixin_43229348/article/details/127662869