Using the CSK6 Vision AI Development Kit: AI Gesture Recognition + Head-Shoulder Recognition + Compiling Hello World

This article comes from the CSK6 Vision AI Development Kit trial activity organized by the Jishu Community and Lingsi Technology. For more development board trial activities, please follow the Jishu Community website. Author: Yang Kefan

Foreword

First of all, I am very glad that the Jishu Community and Lingsi Technology gave me this rare trial opportunity. It allowed me to experience the CSK6, an SoC that integrates an MCU, DSP, and NPU, and to feel its powerful performance.

Kit overview

After receiving the kit, I gave it a preliminary try. The kit consists of the development board itself, a camera vision module, a pin expansion board, and a network communication module. It also comes with an acrylic board (I didn't know what it was for at first, hahaha); according to the official documentation, it is used to support the camera module, which is very considerate.

Build the development environment

The first step is to install dependencies
First, install the necessary dependencies on the computer
Enter the command git --version in the computer's CMD to check that Git has been installed successfully.
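For reference, a successful check simply prints the installed Git version, roughly like the following (the exact version string will differ depending on your installation):

git --version
git version 2.39.1.windows.1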

The second step is to build the development environment
Then download the CSK one-click installation package, run it, and install it according to the installation guide.
The installation process is quite convenient: after downloading, double-click to run it, follow the guide to select the installation directory, click Next, and wait for the installation to complete.

Experience the official visual AI project
Next, let's compile a sample program.

Compile the Hello World example
1. First, select a directory to store the project we are about to create and execute the command lisa zep create in that directory. It lists the CSK6-adapted project samples currently available; we can choose any one as our project template, and it will be created in the current directory. Then enter the newly created hello_world project directory on the command line and execute the compile command (a command sketch is given at the end of this section).

2. Burn the sample program
Use a Type-C data cable to connect to the development board's DAPLINK USB port, then **execute the following command:**

lisa zep flash

Once the burn succeeds, you will see the corresponding success output.

Then open the XCOM serial port tool, and you will see the sample's log output, confirming that it is running successfully.
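To recap, the whole Hello World flow boils down to the commands below (a rough sketch; it assumes the sample template is named hello_world and uses the csk6011a_nano board target that appears later in this article):

lisa zep create
cd hello_world
lisa zep build -b csk6011a_nano
lisa zep flash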

Experience head and shoulder recognition and gesture recognition
Step 1: Pull the project + initialize
Pull the sample code from the terminal and initialize it:

lisa zep create --from-git https://cloud.listenai.com/zephyr/applications/app_algo_hsd_sample_for_csk6.git

Step 2: Modify the configuration
To enable the PC-side image preview function, open the prj.conf file in the root directory of the project and change CONFIG_WEBUSB=n to CONFIG_WEBUSB=y (see the snippet below).
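For clarity, the relevant line in prj.conf ends up looking like this (a minimal sketch showing only the option mentioned above; the rest of the file stays unchanged):

# prj.conf: enable WebUSB so image frames can be previewed on the PC side
CONFIG_WEBUSB=y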

Step 3: Compile the firmware

lisa zep build -b csk6011a_nano

Step 5: Burn the application program (using the serial port; in the commands below, replace COMx with the board's actual COM port)

lisa zep flash
lisa zep exec cskburn -s \\.\COMx -C 6 0x400000 .\resource\cp.bin -b 748800
lisa zep exec cskburn -s \\.\COMx -C 6 0x500000 .\resource\res.bin -b 748800
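In the two cskburn lines, cp.bin and res.bin are written over the serial port to flash addresses 0x400000 and 0x500000 respectively, with -b 748800 setting the baud rate. As an illustration only, if the board enumerated as COM3 (a hypothetical port number; check Device Manager for the real one), the first command would read:

lisa zep exec cskburn -s \\.\COM3 -C 6 0x400000 .\resource\cp.bin -b 748800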

Step 6: Check the log output through the serial port assistant tool

Use the PC preview tool to view the image
1. Pull the PC tool project to your local machine

git clone https://cloud.listenai.com/zephyr/applications/csk_view_finder_spd.git

2. Open the tool
Use a browser to open the index.html file in the project's csk_view_finder_spd/src directory (see the sketch after this list);
3. Install the driver
4. Start using the tool to view
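Putting the viewer steps together on Windows (a rough sketch; start simply opens the file with the default browser):

git clone https://cloud.listenai.com/zephyr/applications/csk_view_finder_spd.git
start csk_view_finder_spd\src\index.html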

Finally, head and shoulder detection and gesture recognition run successfully!

Recognition: YES (recognition is correct)

With head and shoulder recognition + gesture recognition, the recognized head and shoulder position is framed in real time on the screen. If you make a corresponding gesture, the gesture recognition result and its score are displayed below the frame. The overall effect is quite good, but the previewed image is still a bit blurry at the moment; I hope this can be optimized in the future.

Summary

Thanks again to the Jishu Community and Lingsi Technology for the trial opportunity. Through this trial, I have gained a better understanding of the embedded systems and edge computing I have been studying. At the same time, I also saw how convenient, quick, and easy to use the CSK6 vision development kit is in the fields of artificial intelligence and deep learning. I will continue to explore the functions of this board and hope to gain more from it!

Source: blog.csdn.net/weixin_47569031/article/details/129138336