K210 Study Notes - Target Tracking with Trigonometric Functions

Hello, friends reading this article. I am not a professional, so if anything here is wrong, I look forward to your corrections.

Target tracking here means: recognize the target object, then turn the servos so the gimbal points toward it and follows it.

Experimental equipment: a two-degree-of-freedom servo gimbal with two SG90 servos, plus a K210 board.

The experiment has two steps: first, recognize the target object (a color blob); second, take the position coordinates returned for the blob and use them to rotate the servos so the gimbal points at the target.

One: Identify the target color blob - the image.find_blobs function:

To find the color threshold of the target blob, open the threshold manager under the machine-vision tools in the programming environment. Load a picture of the target found on the Internet, or grab a frame from the frame buffer, and keep adjusting the sliders until the target appears white in the binary preview and everything else is black. The string of numbers shown below the preview is the LAB color threshold of the target object.
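As a quick check (a sketch of my own, assuming the MaixPy port provides image.binary() like OpenMV does), the threshold can be applied to live frames on the device; the target should come out white and everything else black, matching the editor preview:

import sensor, lcd

lcd.init()
sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.run(1)

red_threshold = (43, 66, 45, 107, 40, 1)   # the LAB threshold copied from the editor

while True:
    img = sensor.snapshot()
    img.binary([red_threshold])   # pixels inside the threshold -> white, others -> black
    lcd.display(img)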

image.find_blobs(thresholds, roi=Auto, x_stride=2, y_stride=1, invert=False, area_threshold=10, pixels_threshold=10, merge=False, margin=0, threshold_cb=None, merge_cb=None)

The image.find_blobs() function is used to find the target color blob.

thresholds is the list of color thresholds describing the target blob.

roi is the region of interest; I only have one color blob here, so I ignore it and the whole image is searched.

x_stride and y_stride are the scanning step sizes, in pixels, in the x and y directions; a blob narrower than the stride can be skipped entirely. Combined with the actual distance from the camera to the target, setting their size can filter out small spurious blobs.

This function returns a list of blobs; for each blob b in the list:

b[0] = X coordinate of the blob's bounding box, b[1] = Y coordinate of the bounding box, b[2] = width of the bounding box, b[3] = height of the bounding box, b[4] = number of pixels in the blob, b[5] = X coordinate (abscissa) of the blob's center point, b[6] = Y coordinate (ordinate) of the blob's center point.

Using these values, we can draw a box around the target blob on the display, write text next to it, draw a cross at its center, and so on.

The most important values are the center-point coordinates (b[5], b[6]); they tell us where the blob is.
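A minimal sketch of reading one blob's fields (my own illustration; the indices follow the list above, and in the OpenMV-style API the named methods b.cx() and b.cy() return the same center values):

blobs = img.find_blobs([red_threshold])
if blobs:
    b = blobs[0]                            # take the first blob found
    x, y, w, h = b[0], b[1], b[2], b[3]     # bounding box
    cx, cy = b[5], b[6]                     # center point, same as b.cx(), b.cy()
    print("center:", cx, cy, "pixels:", b[4])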

Two: Rotating the servo - the Servo function:

The servo is controlled by PWM: an internal timer generates a pulse train with a 20 ms period (50 Hz), which is what the SG90 expects, and the angle is set by the high-level duty cycle within that period. The mapping is wrapped in a function so it is ready to use.

def Servo(servo, angle):
    # map an angle in [-90, 90] degrees to a duty cycle in [2.5, 12.5] percent
    servo.duty((angle + 90) / 180 * 10 + 2.5)

For example, angle = 0 gives servo.duty(7.5): 0 degrees corresponds to a 7.5% duty cycle within the 20 ms pulse period, i.e. a 1.5 ms high pulse.
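A few spot checks of this mapping (my own arithmetic, matching the 0.5 ms to 2.5 ms pulse range implied by the formula):

angle = -90 → duty = 2.5 % (0.5 ms pulse)
angle = 0 → duty = 7.5 % (1.5 ms pulse)
angle = +90 → duty = 12.5 % (2.5 ms pulse)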

Full program:

import sensor,image,lcd,time
import math
from machine import Timer,PWM
tim1 = Timer(Timer.TIMER0, Timer.CHANNEL0, mode=Timer.MODE_PWM)
tim2 = Timer(Timer.TIMER0, Timer.CHANNEL1, mode=Timer.MODE_PWM)
S1 = PWM(tim1, freq=50, duty=0, pin=17)   # pan servo, freely mapped to pin 17
S2 = PWM(tim2, freq=50, duty=0, pin=15)   # tilt servo, freely mapped to pin 15 (own timer channel so each duty cycle is independent)

def Servo(servo, angle):
    # map an angle in [-90, 90] degrees to a duty cycle in [2.5, 12.5] percent
    servo.duty((angle + 90) / 180 * 10 + 2.5)

lcd.init()
sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time = 2000)
sensor.set_auto_exposure(1)
sensor.set_auto_gain(1)
sensor.set_vflip(1)
sensor.set_hmirror(1)
sensor.run(1)

red_threshold = (43, 66, 45, 107, 40, 1)   # LAB threshold of the target, from the threshold editor

bfive = []; bsix = []   # histories of the blob center x / y over recent frames
while True:
    
    img=sensor.snapshot()
    blobs = img.find_blobs([red_threshold], x_stride=100, y_stride=100)   # large strides: only big blobs are detected
    if blobs:
        for b in blobs:
            tmp = img.draw_rectangle(b[0:4], color=(225, 225, 225))         # draw a rectangle around the blob
            tmp = img.draw_string(b[0], (b[1] - 10), "BOX", color=(0, 0, 255))   # label it
            tmp = img.draw_cross(b[5], b[6])                                 # draw a cross at the blob center
            bfive.append(b[5]);bsix.append(b[6])
            if len(bfive) == 50 and len(bsix) == 50:
                # after 50 frames, take the most frequent center value as the
                # blob position (a simple mode filter to suppress jitter)
                hengzuobiao = max(bfive, key=bfive.count)   # most frequent center x
                zongzuobiao = max(bsix, key=bsix.count)     # most frequent center y
                print("center x:", hengzuobiao)
                print("center y:", zongzuobiao)
                x = hengzuobiao
                y = 240 - zongzuobiao    # relative to the servo gimbal (image y points down)
                dist = 300               # distance from the gimbal to the image plane, assumed to be 300 (rig not built yet)
                a = math.sqrt(dist*dist + (160-x)*(160-x))   # horizontal-plane distance from the gimbal to the blob's column
                if x <= 160:
                    jiaoA = math.degrees(math.atan((160-x)/dist))   # pan angle; dist is the fixed, known gimbal-to-image distance
                    jiaoB = math.degrees(math.atan(y/a))            # tilt angle
                else:
                    jiaoA = -math.degrees(math.atan((x-160)/dist))
                    jiaoB = math.degrees(math.atan(y/a))
                bfive = []; bsix = []    # clear the histories for the next 50 frames
                print("pan servo should rotate:", jiaoA, "degrees")
                print("tilt servo should rotate:", jiaoB, "degrees")
                
                Servo(S1, jiaoA)    # pan toward the target
                time.sleep(1)
                Servo(S2, jiaoB)    # tilt toward the target

                time.sleep(1)
                Servo(S1, 0)        # return both servos to center (reset)
                Servo(S2, 0)


    lcd.display(img)

math.atan() returns the arctangent (in radians), and math.degrees() converts the radian value to degrees.
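As a worked example with made-up numbers (the rig is not built, so this is illustration only): take dist = 300 and a filtered blob center of (100, 120) in the 320 x 240 frame. The horizontal offset is 160 - 100 = 60, so jiaoA = atan(60/300) ≈ 11.3 degrees; a = sqrt(300² + 60²) ≈ 305.9 and y = 240 - 120 = 120, so jiaoB = atan(120/305.9) ≈ 21.4 degrees.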

Notes:

Because the test rig has not been built, some of the data in this experiment is simulated. If we assume the area captured by the camera corresponds to 320 dm x 240 dm in the real plane, the code above can be used as-is: the pixel coordinates can then be treated as lengths in the calculation.

The tilt (vertical) servo is not actually in the X-Y plane, and neither is the pan (horizontal) servo; both sit at a certain height above it.

The servo gimbal is placed directly facing the target plane (X-Y), centered on it.

The coordinates returned by the K210 take the top-left corner of the image as the origin, with the X axis pointing right and the Y axis pointing down.
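For example, the center of a 320 x 240 (QVGA) frame is reported as (160, 120): x grows to the right of the top-left origin and y grows downward, which is why the code above flips it with y = 240 - zongzuobiao before doing the geometry.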

Experimental result:


Origin blog.csdn.net/qq_62262788/article/details/128909733