[DeepStream learning] A detailed walkthrough of C++ sample application 2 on the TX1 module

        Hello everyone, I am Brother Hu, and I have been using the NVIDIA Jetson TX1 for a long time. Since this module has essentially been discontinued, I was worried that many of the official demos could no longer be adapted and run, so it took me some time of further study to exploit its GPU performance and use its various hardware coprocessors for acceleration. This weekend I continue testing the C++ demos that ship with DeepStream and learning how to use it. I will walk through the 5 typical bundled examples in detail to complete this introductory study, share it with everyone, and keep it as a summary note for myself.

Let me first describe my test environment: the hardware is a TX1 plus the EHub_tx1_tx2_E100 carrier board. For details on the EHub_tx1_tx2_E100 carrier board, see: EdgeBox_EHub_tx1_tx2_E100 Development Board Evaluation, Robot Huge's Blog, CSDN Blog. The system runs Ubuntu 18.04 with the full NVIDIA CUDA suite of libraries installed.

Table of contents

0. C/C++ Sample Apps Source Details

References

1. Sample test application 2

1.1 Enter the directory and find the code:

1.2 Compile the file according to the instructions

1.3 Running the tests

1.4 Try to solve the error

1.5 Try to input different scene videos

0. C/C++ Sample Apps Source Details

Official website entry: https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_C_Sample_Apps.html

The DeepStream SDK contains archives of plugins, libraries, applications, and source code. For Debian installations (on Jetson or dGPU) and SDK Manager installations, the sources directory is located at /opt/nvidia/deepstream/deepstream-6.2/sources. For tarballs, the source files are in the extracted deepstream package. DeepStream Python bindings and sample applications are provided as separate packages. For more information, see GitHub - NVIDIA-AI-IOT/deepstream_python_apps: DeepStream SDK Python bindings and sample applications.

DeepStream graphs created using Graph Composer are listed in the Reference Graphs section. See Introduction to Graph Composer for details.

Reference test applications (path inside the sources directory):

  • Sample test application 1 (apps/sample_apps/deepstream-test1): Sample of how to use DeepStream elements for a single H.264 stream: filesrc → decode → nvstreammux → nvinfer or nvinferserver (primary detector) → nvdsosd → renderer. This app uses resnet10.caffemodel for detection.

  • Sample test application 2 (apps/sample_apps/deepstream-test2): Sample of how to use DeepStream elements for a single H.264 stream: filesrc → decode → nvstreammux → nvinfer or nvinferserver (primary detector) → nvtracker → nvinfer or nvinferserver (secondary classifier) → nvdsosd → renderer. This app uses resnet10.caffemodel for detection and 3 classifier models (i.e., Car Color, Make and Model).

  • Sample test application 3 (apps/sample_apps/deepstream-test3): Builds on deepstream-test1 (simple test application 1) to demonstrate how to: use multiple sources in the pipeline; use a uridecodebin to accept any type of input (e.g. RTSP/file), any GStreamer-supported container format, and any codec; configure Gst-nvstreammux to generate a batch of frames and infer on it for better resource utilization; extract the stream metadata, which contains useful information about the frames in the batched buffer. This app uses resnet10.caffemodel for detection.

  • Sample test application 4 (apps/sample_apps/deepstream-test4): Builds on deepstream-test1 for a single H.264 stream (filesrc, decode, nvstreammux, nvinfer or nvinferserver, nvdsosd, renderer) to demonstrate how to: use the Gst-nvmsgconv and Gst-nvmsgbroker plugins in the pipeline; create NVDS_META_EVENT_MSG type metadata and attach it to the buffer; use NVDS_META_EVENT_MSG for different types of objects, e.g. vehicle and person; implement "copy" and "free" functions for use if metadata is extended through the extMsg field. This app uses resnet10.caffemodel for detection.

  • Sample test application 5 (apps/sample_apps/deepstream-test5): Builds on top of deepstream-app. Demonstrates: use of the Gst-nvmsgconv and Gst-nvmsgbroker plugins in the pipeline for multistream; how to configure the Gst-nvmsgbroker plugin from the config file as a sink plugin (for Kafka, Azure, etc.); how to handle RTCP sender reports from RTSP servers or cameras and translate the Gst buffer PTS to a UTC timestamp. For more details, refer to the RTCP sender report callback function test5_rtcp_sender_report_callback() registration and usage in deepstream_test5_app_main.c; GStreamer callback registration with the rtpmanager element's "handle-sync" signal is documented in apps-common/src/deepstream_source_bin.c. This app uses resnet10.caffemodel for detection.

Over the coming posts, we will work through these 5 sample programs systematically.

  • test1: the "Hello World" of DeepStream. It introduces how to build a GStreamer pipeline from the various DeepStream plugins. The input in this example is a video file, which is decoded, batched, and run through detection, and the detection results are drawn on the screen.

  • test2: builds on test1 by cascading a secondary network after the primary network: after object detection, an image classification stage is added.

  • test3: builds on test1 to show how to handle multiple data sources, e.g. connecting 4 video streams at once and running inference on all 4 simultaneously.

  • test4: builds on test1 to show how to use the message broker plugin to create IoT services.

  • test5: builds on the deepstream-app reference application to show multistream use of the message broker plugins, configured as a sink from the config file (for Kafka, Azure, etc.).

References

DeepStream SDK Development Guide: NVIDIA DeepStream SDK Developer Guide — DeepStream 6.2 Release documentation

DeepStream Overview: https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_Overview.html

DeepStream data structure: https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_metadata.html

GStreamer study notes: https://www.cnblogs.com/phinecos/archive/2009/06/07/1498166.html

Bilibili DeepStream video collection: How to download DeepStream on a Jetson Nano (Bilibili)

1. Sample test application 2

 Now let's look at the deepstream-test2 example, which builds on test1 by adding a cascaded (multi-stage) detector. Because this function works hand in hand with DeepStream's tracking capability, the tracker is enabled along with it.

  • Programming language: C/C++

  • Code size: 506 lines (including comments)

  • Input source: a single H.264/H.265 video file

  • Inference: a single 4-class (car, person, bicycle, roadsign) primary detector, plus 3 secondary classifiers applied to the "Car" class (color, make, and model); the "tracker" function must also be enabled for this

  • Display output: monitor

  • Plugin flow: filesrc -> h264parse -> nvv4l2decoder -> nvstreammux -> nvinfer (primary detector) -> nvtracker -> nvinfer (secondary classifier) -> nvvideoconvert -> nvdsosd -> nvegltransform -> nveglglessink
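The same plugin flow can also be sketched as a gst-launch-1.0 command for quick experiments before reading the C code. This is only a sketch: the nvstreammux request-pad wiring, the tracker library path, and the config file names are my assumptions based on the DeepStream 6.0 layout on Jetson, not taken from the sample's README.

```shell
# Hypothetical gst-launch-1.0 equivalent of the deepstream-test2 pipeline.
# Element names come from the plugin flow above; the tracker library path
# and the config file names are assumptions for DeepStream 6.0 on Jetson.
CMD='gst-launch-1.0 filesrc location=sample_720p.h264 ! h264parse ! nvv4l2decoder \
 ! m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 \
 ! nvinfer config-file-path=dstest2_pgie_config.txt \
 ! nvtracker ll-lib-file=/opt/nvidia/deepstream/deepstream-6.0/lib/libnvds_nvmultiobjecttracker.so \
 ! nvinfer config-file-path=dstest2_sgie1_config.txt \
 ! nvvideoconvert ! nvdsosd ! nvegltransform ! nveglglessink'
# Print the command here; on the TX1 itself you would run it with: eval "$CMD"
echo "$CMD"
```

Note that the C application additionally chains sgie2 and sgie3 and attaches pad probes to read metadata, which gst-launch cannot express; the sketch only mirrors the element ordering.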

1.1 Enter the directory and find the code:

cd /opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps
cd deepstream-test2

1.2 Compile the file according to the instructions

You can see the compilation steps from the README file:

Compilation Steps:

  $ Set CUDA_VER in the MakeFile as per platform.
      For Jetson, CUDA_VER=10.2
      For x86, CUDA_VER=11.4
  $ sudo make

Check the CUDA version on my system: it is CUDA_VER=10.2.

 Open the Makefile and modify it accordingly.
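If you prefer not to edit by hand, the variable can also be set with sed. A small sketch, assuming the stock Makefile contains an empty `CUDA_VER?=` assignment; here it is demonstrated on a scratch copy, and on the TX1 you would run the sed command against the Makefile in the deepstream-test2 directory (with sudo).

```shell
# Demonstrate the edit on a scratch copy of the Makefile's CUDA_VER line.
# Assumption: the stock Makefile ships with an empty "CUDA_VER?=" line.
printf 'CUDA_VER?=\n' > /tmp/Makefile.demo
# Set the Jetson value (10.2); on x86 you would use 11.4 instead.
sed -i 's/^CUDA_VER?=.*/CUDA_VER?=10.2/' /tmp/Makefile.demo
# Verify the change.
grep '^CUDA_VER' /tmp/Makefile.demo
```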

 Compile after saving and exiting:

sudo make
nvidia@ubuntu:/opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-test2$ sudo make
cc -c -o deepstream_test2_app.o -DPLATFORM_TEGRA -I../../../includes -I /usr/local/cuda-10.2/include -pthread -I/usr/include/gstreamer-1.0 -I/usr/include/glib-2.0 -I/usr/lib/aarch64-linux-gnu/glib-2.0/include deepstream_test2_app.c
cc -o deepstream-test2-app deepstream_test2_app.o -lgstreamer-1.0 -lgobject-2.0 -lglib-2.0 -L/usr/local/cuda-10.2/lib64/ -lcudart -L/opt/nvidia/deepstream/deepstream-6.0/lib/ -lnvdsgst_meta -lnvds_meta -Wl,-rpath,/opt/nvidia/deepstream/deepstream-6.0/lib/

Compilation is complete.

1.3 Running the tests

The README file has instructions on how to run:

To run:

  $ ./deepstream-test2-app <h264_elementary_stream>

NOTE: To compile the sources, run make with "sudo" or root permission.

The official samples include H.264 videos, located in:

cd /opt/nvidia/deepstream/deepstream-6.0/samples/streams

I also converted some H.264 files in the previous article; for now, let's start with the official sample video.

Go back to the test directory:

cd /opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-test2

Execute the test command in the NoMachine terminal:

./deepstream-test2-app /opt/nvidia/deepstream/deepstream-6.0/samples/streams/sample_720p.h264

 Then make a cup of tea and wait. It is a long, long wait until the following output appears:

 For comparison, recall the inference results from test1:

 Clearly, test2 not only tags each detected object with an ID number, which is the "tracking" function at work, but also annotates each "Car" object with extra information such as its "color", "make", and "model".

1.4 Try to solve the error

Error message:

ERROR: Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-test2/../../../../samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine open error

ERROR: Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-test2/../../../../samples/models/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine open error

ERROR: Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-test2/../../../../samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine open error

ERROR: Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-test2/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine open error
So the error is a failure to open these files. Let's locate them:

/opt/nvidia/deepstream/deepstream-6.0/samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine
/opt/nvidia/deepstream/deepstream-6.0/samples/models/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine
/opt/nvidia/deepstream/deepstream-6.0/samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine
/opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
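A quick way to check all four paths at once is a small shell loop. This is a sketch; the paths are the DeepStream 6.0 sample model locations from the error log, so adjust them if your install differs.

```shell
# Check which of the TensorRT engine files from the error log actually exist.
MODELS=/opt/nvidia/deepstream/deepstream-6.0/samples/models
for f in \
  "$MODELS/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine" \
  "$MODELS/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine" \
  "$MODELS/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine" \
  "$MODELS/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine"
do
  if [ -f "$f" ]; then echo "FOUND   $f"; else echo "MISSING $f"; fi
done
```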

Go into one of the folders and take a look:

# enter the directory
cd /opt/nvidia/deepstream/deepstream-6.0/samples/models/Secondary_VehicleTypes/
# list the files
ll

It turns out the file we need, "resnet18.caffemodel_b16_gpu0_int8.engine", does not exist; in its place there is "resnet18.caffemodel_b16_gpu0_fp16.engine". The file name in the config does not match what is on disk, so I simply switched to the actual file name:

# enter the test directory
cd /opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-test2
# open the config files one by one
sudo vim dstest2_sgie2_config.txt
sudo vim dstest2_sgie1_config.txt
sudo vim dstest2_sgie3_config.txt
sudo vim dstest2_pgie_config.txt
# check that every engine file path and name matches what actually exists on disk
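Instead of editing every file in vim, the mismatched suffix can be patched in one go with sed. This is a sketch that assumes the only mismatch is int8 vs fp16 in the model-engine-file lines, which is what I observed above; it is demonstrated on a scratch copy, and on the TX1 you would run the sed command against the dstest2_*_config.txt files (with sudo).

```shell
# Rewrite ..._int8.engine to ..._fp16.engine in a config file.
# Demonstrated on a scratch copy with one representative line.
cat > /tmp/dstest2_sgie1_config.demo.txt <<'EOF'
model-engine-file=../../../../samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine
EOF
sed -i 's/_int8\.engine/_fp16.engine/' /tmp/dstest2_sgie1_config.demo.txt
# Show the corrected line.
cat /tmp/dstest2_sgie1_config.demo.txt
```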

dstest2_pgie_config.txt after modification:

 dstest2_sgie1_config.txt after modification:

 dstest2_sgie2_config.txt after modification:

 dstest2_sgie3_config.txt after modification:
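For reference, the relevant lines in dstest2_sgie1_config.txt end up looking something like this. This is a sketch, not the full file: model-file, model-engine-file, and network-mode are real Gst-nvinfer config keys, but the exact surrounding contents of the sample config are omitted here.

```
[property]
model-file=../../../../samples/models/Secondary_CarColor/resnet18.caffemodel
model-engine-file=../../../../samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_fp16.engine
# network-mode: 0=FP32, 1=INT8, 2=FP16
network-mode=2
```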

 After the changes, let's test again. Execute the test command in the NoMachine terminal:

./deepstream-test2-app /opt/nvidia/deepstream/deepstream-6.0/samples/streams/sample_720p.h264

 No error is reported, and loading is noticeably faster. Our earlier tests must have hit the same problem: when nvinfer fails to open the serialized engine file, it rebuilds the TensorRT engine from the original model at startup, and that rebuild is what made loading so slow before.

 I finally found the cause of this problem and solved it. Now I can sleep peacefully.

1.5 Try to input different scene videos

All of the tests above used the official video. I also transcoded some H.264 elementary-stream files myself and ran a few extra tests:

 Test video 1:

# enter the test directory
cd /opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-test2
# test video 1: /home/nvidia/lsrobot_worksapce/Deepstream/mysamples_streams/H264-Streat-1.h264
./deepstream-test2-app /home/nvidia/lsrobot_worksapce/Deepstream/mysamples_streams/H264-Streat-1.h264

 Test video 2:

# enter the test directory
cd /opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-test2
# test video 2: /home/nvidia/lsrobot_worksapce/Deepstream/mysamples_streams/H264-Streat-2.h264
./deepstream-test2-app /home/nvidia/lsrobot_worksapce/Deepstream/mysamples_streams/H264-Streat-2.h264

That is all I want to share today. For corrections, questions, or discussion: [email protected]


Origin blog.csdn.net/cau_weiyuhu/article/details/129785752