Deploying a YOLOv8 Model with OpenVINOSharp on the Intel® Developer Kit

Author: Yan Guojin, Intel Edge Computing Innovation Ambassador

Technical guidance: Wu Zhuo, Li Yiwei

The OpenVINO™ toolkit accelerates the development of deep learning vision applications and helps users deploy AI models to production systems more conveniently and quickly on a range of Intel platforms from edge to cloud.

C# is a safe, stable, simple, and elegant object-oriented programming language derived from C and C++. It combines the simple visual tooling of VB with the runtime efficiency of C++, making it a first choice for .NET development. However, OpenVINO™ does not provide a C# interface, which makes using OpenVINO™ from C# inconvenient. In previous work, we created OpenVINOSharp, an open-source toolkit that is free for commercial use, to promote the application of OpenVINO™ in the C# ecosystem; it has already been implemented and used successfully on the Windows platform.

In this article, we will show how to use OpenVINOSharp on a Linux system running on the Intel® Developer Kit AIxBoard development board.

The sample code used in this article has been open-sourced in the OpenVINOSharp repository; the GitHub URL is:

https://github.com/guojin-yan/OpenVINOSharp/blob/openvinosharp3.0/tutorial_examples/AlxBoard_deploy_yolov8/Program.cs

1.1  Introduction to  Intel® Developer Kit AIxBoard

Figure 1 Introduction to AIxBoard

1.1.1  Product positioning

The Intel® Developer Kit AIxBoard is a member of Intel's official developer kit series, designed for entry-level artificial intelligence applications and edge smart devices. It fits a range of scenarios such as AI learning, development, training, and deployment. The kit comes preloaded with the Intel OpenVINO™ toolkit, a model repository, and demos.

The kit's main interfaces are compatible with the Jetson Nano carrier board, and its GPIO is compatible with the Raspberry Pi, which maximizes reuse of mature ecosystem resources. This allows the kit to serve as an edge computing engine, providing strong support for validating and developing AI products; at the same time, it can act as a domain controller core, providing technical support for robot product development.

With the AIxBoard development kit, you can build an excellent artificial intelligence application in a short time. Whether for research, education, or business, the board provides good support. With the OpenVINO™ toolkit, both the CPU and the iGPU have strong AI inference capabilities and support running multiple neural networks in parallel for applications such as image classification, object detection, segmentation, and speech processing.

1.1.2  Product parameters

Main controller: Intel Celeron N5105, 2.0–2.9 GHz (formerly Jasper Lake)

Memory: onboard LPDDR4x 2933 MHz, 4 GB / 6 GB / 8 GB

Storage: onboard 64 GB eMMC

Storage expansion: one M.2 Key-M 2242 slot, supporting the SATA and NVMe protocols

BIOS: AMI UEFI BIOS

OS support: Ubuntu 20.04 LTS, Windows 10/11

1.1.3  AI inference unit

With the OpenVINO™ toolkit, heterogeneous CPU+iGPU inference can be realized; the iGPU (integrated graphics) delivers roughly 0.6 TOPS of compute.

CPU: INT8 / FP16 / FP32

iGPU: INT8 / FP16, ~0.6 TOPS

GNA: Gaussian & Neural Accelerator

1.2  Configure the .NET environment

.NET is a free, cross-platform, open source developer platform for building a wide variety of applications. The following demonstrates how to install the .NET environment on Ubuntu 20.04, which supports the .NET Core 2.0–3.1 series and the .NET 5–8 series. If you are using another Linux distribution, refer to Install .NET on Linux distributions - .NET | Microsoft Learn.

1.2.1 Add Microsoft package repository

Installation using APT can be accomplished with a few commands. Before installing .NET, run the following commands to add the Microsoft package signing key to the list of trusted keys and add the package repository.

Open a terminal and run the following command:

wget https://packages.microsoft.com/config/ubuntu/20.04/packages-microsoft-prod.deb -O packages-microsoft-prod.deb
sudo dpkg -i packages-microsoft-prod.deb
rm packages-microsoft-prod.deb

The following figure shows the output of the console after entering the above command:

Figure 2 Console output after adding the Microsoft package repository

1.2.2  Install the SDK

The .NET SDK enables you to develop applications with .NET. If you install the SDK, you do not need to install the corresponding runtime separately. To install the .NET SDK, run the following commands:

sudo apt-get update
sudo apt-get install -y dotnet-sdk-3.1

The following figure shows the console output after installation:

Figure 3 Install SDK output

1.2.3  Test installation

You can check the SDK version and runtime version through the command line.

dotnet --list-sdks
dotnet --list-runtimes

The following figure shows the output of the console after entering the test command:

Figure 4 SDK version and Runtime version

1.2.4  Test console project

In a Linux environment, we can create and compile projects with the dotnet command. The project creation command is:

dotnet new <project_type> -o <project_name>

Here we create a simple test console project:

dotnet new console -o test_net6.0
cd test_net6.0
dotnet run

The figure below shows the console output and the project folder contents after running the test commands. The C# project automatically creates a Program.cs file containing the Main entry point of the program, as well as a *.csproj project file that specifies the configuration used when compiling the project.

Figure 5 console project

The above are the configuration steps for the .NET environment. If your environment does not match this article, you can find more installation steps in .NET Documentation | Microsoft Learn.

1.3 Install OpenVINO™ Runtime

OpenVINO™ has two installation options:

  • OpenVINO™ Runtime, which contains core libraries for running model deployment inference on processor devices
  • OpenVINO™ Development Tools is a set of tools for working with OpenVINO™ and OpenVINO™ models, including model optimizer, OpenVINO™ Runtime, model downloader, etc.

Here we only need to install OpenVINO™ Runtime.

1.3.1  Downloading OpenVINO™ Runtime

Visit the Download the Intel Distribution of OpenVINO Toolkit page and choose the corresponding installation options. Since our device runs Ubuntu 20.04, download the archive compiled for that version.

Figure 6 OpenVINO Runtime download

1.3.2  Unzip the installation package

The OpenVINO™ Runtime we downloaded is essentially a package of C++ libraries, so we place it in a system directory where the dependencies can be located through the configured environment variables at compile time. First create a folder under the system directory:

sudo mkdir -p /opt/intel

Then unzip the installation file we downloaded and move it to the specified folder:

tar -xvzf l_openvino_toolkit_ubuntu20_2023.0.1.11005.fa1c41994f3_x86_64.tgz
sudo mv l_openvino_toolkit_ubuntu20_2023.0.1.11005.fa1c41994f3_x86_64 /opt/intel/openvino_2022.3.0

1.3.3  Install dependencies

Next, we need to install all of OpenVINO™ Runtime's dependencies; enter the following commands in the terminal:

cd /opt/intel/openvino_2022.3.0/
sudo -E ./install_dependencies/install_openvino_dependencies.sh

Figure 7 Installing OpenVINO™ Runtime dependencies

1.3.4 Configure environment variables

After the installation is complete, we need to configure the environment variables to ensure that the system can obtain the corresponding files when calling. Enter the following command through the command line:

source /opt/intel/openvino_2022.3.0/setupvars.sh
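The source command only affects the current shell session. As an optional convenience (not something the OpenVINO™ installer requires), you can append it to ~/.bashrc so that every new terminal picks up the variables automatically:

```shell
# Persist the OpenVINO environment setup for future shells
# (path from section 1.3.2; adjust if the toolkit was unpacked elsewhere)
echo 'source /opt/intel/openvino_2022.3.0/setupvars.sh' >> ~/.bashrc
grep 'setupvars.sh' ~/.bashrc    # confirm the line was appended
```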

The above are the configuration steps for the OpenVINO™ Runtime environment. If your environment does not match this article, you can find more installation steps in Install OpenVINO™ Runtime — OpenVINO™ documentation — Version (2023.0).

1.4  Configure the AlxBoard_deploy_yolov8 project

The code used in this project is available in the GitHub repository AlxBoard_deploy_yolov8; you can download and use it as needed. Below, I will build the AlxBoard_deploy_yolov8 project step by step from scratch.

1.4.1 Create AlxBoard_deploy_yolov8 project

In this project we use OpenCvSharp, which on the Ubuntu platform currently supports up to .NET Core 3.1, so we create a .NET Core 3.1 project here. Enter the following commands in the terminal to create and open the project:

dotnet new console --framework "netcoreapp3.1" -o AlxBoard_deploy_yolov8
cd AlxBoard_deploy_yolov8

Figure 8 Creating the AlxBoard_deploy_yolov8 project

After the project is created, replace the contents of the generated Program.cs file with the AlxBoard_deploy_yolov8 code.

1.4.2 Add OpenVINOSharp dependency

Since OpenVINOSharp is still under development and a Linux NuGet package has not yet been published, it must be referenced as a project by downloading the source code.

  • Download the source code

Download the project source code via Git: open a new terminal and enter the following commands to clone the remote repository into the same directory as AlxBoard_deploy_yolov8.

git clone https://github.com/guojin-yan/OpenVINOSharp.git
cd OpenVINOSharp

The project directory of this article is:

Program

--|-AlxBoard_deploy_yolov8
--|-OpenVINOSharp
  • Modify OpenVINO™ dependencies

Because the OpenVINO™ dependency in the project source differs from this article's setup, the OpenVINO™ library path must be modified, mainly in the OpenVINOSharp/src/OpenVINOSharp/native_methods/ov_base.cs file. The change is as follows:

private const string dll_extern = "./openvino2023.0/openvino_c.dll";
// change to:
private const string dll_extern = "libopenvino_c.so";
  • Add project dependencies

Enter the following command in Terminal to add OpenVINOSharp to the AlxBoard_deploy_yolov8 project reference.

dotnet add reference ./../OpenVINOSharp/src/OpenVINOSharp/OpenVINOSharp.csproj
  • Add environment variables

This project calls the OpenVINO™ dynamic link libraries, so their path needs to be added to the current environment:
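As a sketch, assuming the archive layout from section 1.3.2 (where the shared libraries, including libopenvino_c.so, live under runtime/lib/intel64; verify this subdirectory on your own system), the export looks like:

```shell
# Make the OpenVINO C runtime libraries visible to the dynamic loader;
# the library subdirectory is an assumption based on the archive layout
export LD_LIBRARY_PATH=/opt/intel/openvino_2022.3.0/runtime/lib/intel64:$LD_LIBRARY_PATH
echo "$LD_LIBRARY_PATH"
```

Running the setupvars.sh script from section 1.3.4 in the same shell achieves a similar effect.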

1.4.3 Add OpenCvSharp

  • Install NuGet Packages

OpenCvSharp can be installed via NuGet packages; just enter the following commands in the terminal:

dotnet add package OpenCvSharp4_.runtime.ubuntu.20.04-x64
dotnet add package OpenCvSharp4
  • Add environment variables

Add the following paths to environment variables:

export LD_LIBRARY_PATH=/home/ygj/Program/OpenVINOSharp/tutorial_examples/AlxBoard_deploy_yolov8/bin/Debug/netcoreapp3.1/runtimes/ubuntu.20.04-x64/native

bin/Debug/netcoreapp3.1/runtimes/ubuntu.20.04-x64/native is the path generated when AlxBoard_deploy_yolov8 is compiled; it contains libOpenCvSharpExtern.so, which wraps the various OpenCV interfaces. You can also copy this file into the project's run path.
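A minimal sketch of that copy step, using the paths from this article's project layout (guarded so it is a no-op if the project has not been built yet):

```shell
# Copy the native OpenCvSharp wrapper next to the managed binaries so the
# runtime loader can find it without relying on LD_LIBRARY_PATH
NATIVE=bin/Debug/netcoreapp3.1/runtimes/ubuntu.20.04-x64/native
if [ -f "$NATIVE/libOpenCvSharpExtern.so" ]; then
    cp "$NATIVE/libOpenCvSharpExtern.so" bin/Debug/netcoreapp3.1/
fi
```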

  • Detect libOpenCvSharpExtern dependency

Since libOpenCvSharpExtern.so is a dynamic link library compiled in another environment, the local machine may lack some of its dependencies; you can check with the ldd command.

ldd libOpenCvSharpExtern.so

Figure 9 Detecting libOpenCvSharpExtern dependencies

If the output contains no "not found" entries, no dependencies are missing. If any appear, install the missing dependencies before the library can be used normally.
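A small sketch of that check, filtering the ldd output for unresolved entries (which packages you then install depends entirely on what ldd reports, so no package names are assumed here):

```shell
# Any ldd line containing "not found" names a shared library that is
# missing on this machine and must be installed before the app will run
missing=$(ldd libOpenCvSharpExtern.so 2>/dev/null | grep 'not found')
if [ -z "$missing" ]; then
    echo "no missing dependencies"
else
    echo "$missing"
fi
```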

After adding project dependencies and NuGet Package, the content of the project configuration file is:

<Project Sdk="Microsoft.NET.Sdk">

  <ItemGroup>
    <ProjectReference Include="..\OpenVINOSharp\src\OpenVINOSharp\OpenVINOSharp.csproj" />
  </ItemGroup>

  <ItemGroup>
    <PackageReference Include="OpenCvSharp4" Version="4.8.0.20230708" />
    <PackageReference Include="OpenCvSharp4_.runtime.ubuntu.20.04-x64" Version="4.8.0.20230708" />
  </ItemGroup>

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp3.1</TargetFramework>
  </PropertyGroup>

</Project>

1.5 Run the AlxBoard_deploy_yolov8 project

The models and files used for testing are available in the OpenVINOSharp repository, so we test with the models and files from that repository.

To run through dotnet, just run the following command.

dotnet run <args>

The <args> parameters specify the prediction type, the model path, and the image file path. The prediction type is one of 'det', 'seg', 'pose', or 'cls'. The default inference device is 'AUTO'. For 'det' and 'seg' predictions you can also pass a <path_to_lable> parameter; if it is set, the results are drawn on the image, otherwise they are printed to the console.

1.5.1 Compile and run the Yolov8-det model

The compile-and-run command is:

dotnet run det /home/ygj/Program/OpenVINOSharp/model/yolov8/yolov8s.xml /home/ygj/Program/OpenVINOSharp/dataset/image/demo_2.jpg GPU.0 /home/ygj/Program/OpenVINOSharp/dataset/lable/COCO_lable.txt

The output of model inference is:

---- OpenVINO INFO----
Description : OpenVINO Runtime
Build number: 2023.0.1-11005-fa1c41994f3-releases/2023/0
Set inference device GPU.0.
[INFO] Loading model files: /home/ygj/Program/OpenVINOSharp/model/yolov8/yolov8s.xml
[INFO] model name: torch_jit
[INFO]   inputs:
[INFO]     input name: images
[INFO]     input type: f32
[INFO]     input shape: Shape : [1, 3, 640, 640]
[INFO]   outputs:
[INFO]     output name: output0
[INFO]     output type: f32
[INFO]     output shape: Shape : [1, 84, 8400]
[INFO] Read image files: /home/ygj/Program/OpenVINOSharp/dataset/image/demo_2.jpg

Detection result:

1: 0 0.89   (x:744 y:43 width:388 height:667)
2: 0 0.88   (x:149 y:202 width:954 height:507)
3: 27 0.72   (x:435 y:433 width:98 height:284)

Figure 10 Yolov8-det model prediction output

1.5.2 Compile and run the Yolov8-cls model

The compile and run command is:

dotnet run cls /home/ygj/Program/OpenVINOSharp/model/yolov8/yolov8s-cls.xml /home/ygj/Program/OpenVINOSharp/dataset/image/demo_7.jpg GPU.0

The output of model inference is:

---- OpenVINO INFO----
Description : OpenVINO Runtime
Build number: 2023.0.1-11005-fa1c41994f3-releases/2023/0
Set inference device GPU.0.
[INFO] Loading model files: /home/ygj/Program/OpenVINOSharp/model/yolov8/yolov8s-cls.xml
[INFO] model name: torch_jit
[INFO]   inputs:
[INFO]     input name: images
[INFO]     input type: f32
[INFO]     input shape: Shape : [1, 3, 224, 224]
[INFO]   outputs:
[INFO]     output name: output0
[INFO]     output type: f32
[INFO]     output shape: Shape : [1, 1000]
[INFO] Read image files: /home/ygj/Program/OpenVINOSharp/dataset/image/demo_7.jpg

Classification Top 10 result:

classid probability
------- -----------
294     0.992173
269     0.002861
296     0.002111
295     0.000714
270     0.000546
276     0.000432
106     0.000159
362     0.000147
260     0.000078
272     0.000070

1.5.3 Compile and run the Yolov8-pose model

The compile-and-run command is:

dotnet run pose /home/ygj/Program/OpenVINOSharp/model/yolov8/yolov8s-pose.xml /home/ygj/Program/OpenVINOSharp/dataset/image/demo_9.jpg GPU.0

The output of model inference is:

---- OpenVINO INFO----
Description : OpenVINO Runtime
Build number: 2023.0.1-11005-fa1c41994f3-releases/2023/0
Set inference device GPU.0.
[INFO] Loading model files: /home/ygj/Program/OpenVINOSharp/model/yolov8/yolov8s-pose.xml
[INFO] model name: torch_jit
[INFO]   inputs:
[INFO]     input name: images
[INFO]     input type: f32
[INFO]     input shape: Shape : [1, 3, 640, 640]
[INFO]   outputs:
[INFO]     output name: output0
[INFO]     output type: f32
[INFO]     output shape: Shape : [1, 56, 8400]
[INFO] Read image files: /home/ygj/Program/OpenVINOSharp/dataset/image/demo_9.jpg

Classification result:

1: 1   0.94   (x:104 y:22 width:152 height:365) Nose: (188 ,60 ,0.93) Left Eye: (192 ,53 ,0.83) Right Eye: (180 ,54 ,0.90) Left Ear: (196 ,53 ,0.50) Right Ear: (167 ,56 ,0.76) Left Shoulder: (212 ,92 ,0.93) Right Shoulder: (151 ,93 ,0.94) Left Elbow: (230 ,146 ,0.90) Right Elbow: (138 ,142 ,0.93) Left Wrist: (244 ,199 ,0.89) Right Wrist: (118 ,187 ,0.92) Left Hip: (202 ,192 ,0.97) Right Hip: (168 ,193 ,0.97) Left Knee: (184 ,272 ,0.96) Right Knee: (184 ,276 ,0.97) Left Ankle: (174 ,357 ,0.87) Right Ankle: (197 ,354 ,0.88)

Figure 11 Yolov8-pose model prediction output

1.5.4 Compile and run the Yolov8-seg model

The compile-and-run command is:

dotnet run seg /home/ygj/Program/OpenVINOSharp/model/yolov8/yolov8s-seg.xml /home/ygj/Program/OpenVINOSharp/dataset/image/demo_2.jpg GPU.0 /home/ygj/Program/OpenVINOSharp/dataset/lable/COCO_lable.txt

The output of model inference is:

---- OpenVINO INFO----
Description : OpenVINO Runtime
Build number: 2023.0.1-11005-fa1c41994f3-releases/2023/0
Set inference device GPU.0.
[INFO] Loading model files: /home/ygj/Program/OpenVINOSharp/model/yolov8/yolov8s-seg.xml
[INFO] model name: torch_jit
[INFO]   inputs:
[INFO]     input name: images
[INFO]     input type: f32
[INFO]     input shape: Shape : [1, 3, 640, 640]
[INFO]   outputs:
[INFO]     output name: output0
[INFO]     output type: f32
[INFO]     output shape: Shape : [1, 116, 8400]
[INFO] Read image files: /home/ygj/Program/OpenVINOSharp/dataset/image/demo_2.jpg


Segmentation result:

1: 0 0.90   (x:745 y:42 width:403 height:671)
2: 0 0.86   (x:121 y:196 width:1009 height:516)
3: 27 0.69   (x:434 y:436 width:90 height:280)

Figure 12 Yolov8-seg model prediction output

1.6  Running time

The AIxBoard development board is equipped with an N5105 CPU and UHD integrated graphics. Here we run a simple benchmark of CPU and iGPU inference performance, measuring mainly the model inference time; the same test was run on an Intel Phantom Canyon NUC (i7-1165G7) for comparison. The results are shown in the table below.

Device        CPU: N5105   iGPU: UHD Graphics   CPU: i7-1165G7   iGPU: Iris Xe
Yolov8-det    586.3 ms     83.1 ms              127.1 ms         19.5 ms
Yolov8-seg    795.6 ms     112.5 ms             140.1 ms         25.0 ms
Yolov8-pose   609.8 ms     95.1 ms              117.2 ms         23.3 ms
Yolov8-cls    33.1 ms      9.2 ms               6.1 ms           2.7 ms

As the table shows, the AIxBoard delivers usable inference performance for the Yolov8 models: on the integrated GPU, the Yolov8-det model runs at roughly 10 FPS (about 83 ms per frame).

1.7 Summary

In this project, based on Ubuntu 20.04, we successfully deployed a deep learning model by calling OpenVINO™ from a C# environment and verified the feasibility of the OpenVINOSharp project on Linux, which is significant for the project's further development on that platform.

You are welcome to visit the OpenVINOSharp code repository: https://github.com/guojin-yan/OpenVinoSharp


Origin blog.csdn.net/gc5r8w07u/article/details/132208247