BumblebeeXB3 hand-eye calibration and pose conversion: collected notes and code

I have long wanted to write up the projects I have worked on in recent years, and now I finally have the time to do it. YapethsDY


Let me briefly introduce the project setup: a Siasun self-developed multi-joint collaborative robot + Bumblebee XB3 camera (resolution 1024*768) + VS2010 development environment + Halcon 18.05. Revised 2020/7/13: added the camera coordinate system.


The first step: clarify the BBX2 and BBX3 coordinate systems

Method: hold the camera and face its 1394A connector. The camera on the right is the right-eye camera and the one on the left is the left-eye camera. The camera's forward direction is the positive Z direction, and the direction from the left-eye camera toward the center of the right eye is the positive X direction. The Y direction then follows from the right-hand rule.

  • Coordinate systems involved
  1. Workpiece coordinate system
  2. Tool coordinate system
  3. Camera coordinate system

Within the camera coordinate system there are additionally an image coordinate system and a pixel coordinate system.

The BumblebeeXB3 camera used in this project has its coordinate origin at the center of the right eye.

The second step: hand-eye calibration configurations

Depending on whether the camera is fixed relative to the robot base coordinate system, there are two configurations:

  • Camera fixed (eye-to-hand)
  • Camera mounted on the robot (eye-in-hand)

The final result we need is the spatial relationship between the camera and the tool, expressed as a 4*4 NOAP matrix. It can be computed by fixing the calibration board and changing the robot pose so that the camera photographs the board from several different positions and angles. (The figures are excerpted from elsewhere and not reproduced here.)

  • The relative pose of the camera and the tool remains unchanged; denote this transformation matrix ToolInCamPose (the tool pose in the camera frame, matching the variable names in the code below).
  • The transformation from the tool coordinate system to the base coordinate system is also known at each shot from the robot controller; denote it ToolInBasePose.
  • Once the calibration board is placed, the workpiece (board) coordinate system is likewise constant relative to the base coordinate system; denote this transformation CalObjInBasePose.
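Chaining these transforms for the moving-camera (eye-in-hand) case gives the calibration object's pose in the base frame. Since the board is fixed, that pose must come out identical for every robot pose i, which is exactly the constraint the hand-eye calibration solves (a sketch using the Halcon-style names above):

$$\mathrm{CalObjInBasePose} = \mathrm{ToolInBasePose}_i \cdot \mathrm{ToolInCamPose}^{-1} \cdot \mathrm{CalObjInCamPose}_i$$

where CalObjInCamPose_i is the board pose observed by the camera in image i.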

 

Once the transformation between the camera and the robot coordinate system has been calculated, the subsequent operations can be performed.


The third step: performing the hand-eye calibration

  • Obtain the camera's internal parameters

There are 8 parameters related to camera calibration: Focus, Kappa, Sy, Sx, Cx, Cy, ImageWidth, ImageHeight.
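Since Kappa is set to 0 during calibration (see the code below), the area_scan_division model reduces to a plain pinhole camera. As a sketch, a point (X, Y, Z) in camera coordinates then projects to pixel coordinates:

$$col = \frac{f \cdot X/Z}{S_x} + C_x, \qquad row = \frac{f \cdot Y/Z}{S_y} + C_y$$

where f is Focus and Sx, Sy are the pixel pitches on the sensor.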

/// <summary>
/// Camera internal parameters: they determine the projection
/// from 3D space to the 2D image.
/// fFocal       - lens focal length in meters
///                (BBX3-13S2C-60 means a focal length of 0.006 m,
///                 BBX3-13S2G-38 means a focal length of 0.0038 m;
///                 the former is the color camera, the latter the grayscale camera)
/// fCenterRow   - row coordinate of the image center
/// fCenterCol   - column coordinate of the image center
/// fKappa       - radial distortion: <0 barrel, >0 pincushion; set to 0 for calibration
/// fSx, fSy     - scale factors: the horizontal/vertical distance between adjacent
///                pixels on the image sensor (pixel size in world units)
/// nImageHeight - image height in pixels
/// nImageWidth  - image width in pixels
/// </summary>
/// <param name="strFilePath">directory the camera-parameter file is written to</param>
/// <returns>execution result</returns>
bool CFastRecognizeAndLocattedVisionDoc::GetCameraParam(CString strFilePath)
{
	if (m_bCameraConnectted)
	{
		HTuple hv_CamParam;
		float fFocalLength	= 0.0f;
		//lens focal length in meters
		float fFocal		= 0.0038f;
		float fCenterRow	= 0.0f;
		float fCenterCol	= 0.0f; 
		float fKappa		= 0.0f;
		float fSx			= 0.0f;
		float fSy			= 0.0f;
		int nImageWidth		= 0;
		int nImageHeight	= 0;
		CString strCamParamFileName;

		triclopsGetImageCenter(m_triclopsContext,&fCenterRow,&fCenterCol);
		//normalized focal length (Triclops reports it in pixels)
		triclopsGetFocalLength(m_triclopsContext,&fFocalLength);
		triclopsGetResolution(m_triclopsContext,&nImageHeight,&nImageWidth);

		//dividing the metric focal length by the focal length in pixels
		//gives the pixel size on the sensor (m/pixel)
		fSx = fFocal / fFocalLength;
		fSy = fFocal / fFocalLength;
		strCamParamFileName = strFilePath + CAM_PARAM_FILE_NAME;
		WriteCameraParamIntoFile("area_scan_division",fFocal,fKappa,fSx,fSy,fCenterCol,fCenterRow,nImageWidth,nImageHeight,strCamParamFileName);
		CString strText;
		strText.Format(_T("Camera intrinsics file written successfully, path: %s"),strCamParamFileName);
		m_LogUtl.AddMsgInLogFile(strText);
		return true;
	}
	else
	{
		m_LogUtl.AddMsgInLogFile(_T("Camera not connected; cannot obtain camera intrinsics"));
		return false;
	}
}
  • After obtaining the camera's internal parameters, start moving the robot end effector (the camera moves with it) and grab right-eye rectified images of the calibration board fixed on the ground. During shooting, pay attention to the surrounding environment and check the imaging quality. In particular, make sure each robot move is large enough (more than 5 degrees of change on multiple axes), and capture 10-15 groups of images. The tool coordinate system can be selected when storing the robot end pose, and the end pose is saved following the F2 format. The calibration code is as follows:

bool CFastRecognizeAndLocattedVisionDoc::CalculateCalibrationTransform(CString strFileName, int nDataNum, TRANSFORMPOSE& transRobotCalibration)
{
	m_LogUtl.AddMsgInLogFile(_T("System starting calibration-matrix computation"));
	// Local iconic variables
	HObject  ho_Image, ho_Caltab;

	// Local control variables
	HTuple  hv_ImageNameStart, hv_DataNameStart, hv_NumImages;
	HTuple  hv_Width, hv_Height, hv_WindowHandle, hv_ParamName;
	HTuple  hv_ParamValue, hv_Labels, hv_Instructions, hv_ArrowThickness;
	HTuple  hv_ArrowLength, hv_OM3DToolOrigin, hv_OM3DBase;
	HTuple  hv_CalTabFile, hv_StartCamParam, hv_CalibDataID;
	HTuple  hv_WindowHandleR, hv_I, hv_RCoord, hv_CCoord, hv_Index;
	HTuple  hv_PoseForCalibrationPlate, hv_ToolInBasePose, hv_PoseIn;
	HTuple  hv_PoseOut, hv_OM3DTool, hv_Warnings, hv_Errors;
	HTuple  hv_CamCalibError, hv_CamParam, hv_ToolInCamPose;
	HTuple  hv_CalObjInBasePose, hv_PlaneInBasePose, hv_Exception;
	HTuple  hv_CalibObjIdx, hv_PoseIds, hv_CameraSize, hv_CameraConeLength;
	HTuple  hv_PX, hv_PY, hv_PZ, hv_OM3DObjectOrig, hv_OM3DObject;
	HTuple  hv_CalObjInCamPose, hv_Message, hv_OM3DCamera, hv_ObjInCamPose;
	HTuple  hv_CamInToolPose, hv_CamInBasePose, hv_ObjInBasePose;

	hv_ImageNameStart = (LPCTSTR)(strFileName + m_strCalibrationImage);
	hv_DataNameStart = (LPCTSTR)(strFileName + m_strCalibrationRobotPose);
	hv_NumImages = nDataNum;

	hv_CalTabFile = (LPCTSTR)(strFileName + CALTAB_FILE_NAME);
	//read the camera intrinsics
	ReadCamPar((LPCTSTR)(strFileName + CAM_PARAM_FILE_NAME), &hv_StartCamParam);
	//create the Halcon calibration data model and return its handle
	CreateCalibData("hand_eye_moving_cam", 1, 1, &hv_CalibDataID);
	//set the camera type and initial internal parameters in the calibration model
	SetCalibDataCamParam(hv_CalibDataID, 0, HTuple(), hv_StartCamParam);
	//register the calibration object: handle index and path of the board description file
	SetCalibDataCalibObject(hv_CalibDataID, 0, hv_CalTabFile);
	//set the optimization method of the calibration data model
	SetCalibData(hv_CalibDataID, "model", "general", "optimization_method", "nonlinear");
	{
		HTuple end_val72 = hv_NumImages-1;
		HTuple step_val72 = 1;
		//process the calibration image for each calibration pose
		for (hv_I=0; hv_I.Continue(end_val72, step_val72); hv_I += step_val72)
		{
			ReadImage(&ho_Image, hv_ImageNameStart+(hv_I.TupleString("02d")));
			//locate the calibration region and extract the board contours and mark points
			//get the board contour information
			//get the board mark-point observations
			//if the image is too bright this can fail; it can be verified in HALCON
			FindCalibObject(ho_Image, hv_CalibDataID, 0, 0, hv_I, HTuple(), HTuple());
			GetCalibDataObservContours(&ho_Caltab, hv_CalibDataID, "caltab", 0, 0, hv_I);
			GetCalibDataObservPoints(hv_CalibDataID, 0, 0, hv_I, &hv_RCoord, &hv_CCoord, &hv_Index, &hv_PoseForCalibrationPlate);
			//hv_ToolInBasePose is the robot end (tool) pose in the base frame
			ReadPose((hv_DataNameStart+(hv_I.TupleString("02d")))+".dat", &hv_ToolInBasePose);
			if (0 != (hv_I==0))
			{
				hv_PoseIn.Clear();
				hv_PoseIn[0] = -0.006;
				hv_PoseIn[1] = -0.296;
				hv_PoseIn[2] = 12;
				hv_PoseIn[3] = 178;
				hv_PoseIn[4] = 2;
				hv_PoseIn[5] = 270;
				hv_PoseIn[6] = 0;
			}
			else
			{
				hv_PoseIn = hv_PoseOut;
			}
			//transform the tool model OM3DToolOrigin into the base frame via ToolInBasePose (the tool pose in robot base coordinates), giving OM3DTool
			RigidTransObjectModel3d(hv_OM3DToolOrigin, hv_ToolInBasePose, &hv_OM3DTool);
			SetCalibData(hv_CalibDataID, "tool", hv_I, "tool_in_base_pose", hv_ToolInBasePose);
		}
	}
	//run the hand-eye calibration
	//then fetch the required values from the calibration results
	CalibrateHandEye(hv_CalibDataID, &hv_Errors);
	GetCalibData(hv_CalibDataID, "model", "general", "camera_calib_error", &hv_CamCalibError);
	GetCalibData(hv_CalibDataID, "camera", 0, "params", &hv_CamParam);
	GetCalibData(hv_CalibDataID, "camera", 0, "tool_in_cam_pose", &hv_ToolInCamPose);
	GetCalibData(hv_CalibDataID, "calib_obj", 0, "obj_in_base_pose", &hv_CalObjInBasePose);
	//account for the thickness of the calibration board: translate 0.005 m along z
	SetOriginPose(hv_CalObjInBasePose, 0, 0, 0.005, &hv_PlaneInBasePose);
	ClearCalibData(hv_CalibDataID);
	//invert ToolInCamPose to obtain the camera pose in the tool coordinate system
	PoseInvert(hv_ToolInCamPose, &hv_CamInToolPose);
	
	transRobotCalibration.PointX = float(hv_CamInToolPose[0].D() * 1000);
	transRobotCalibration.PointY = float(hv_CamInToolPose[1].D() * 1000);
	transRobotCalibration.PointZ = float(hv_CamInToolPose[2].D() * 1000);
	transRobotCalibration.RollAngle = float(hv_CamInToolPose[3].D());
	transRobotCalibration.PitchingAngle = float(hv_CamInToolPose[4].D());
	transRobotCalibration.YawAngle = float(hv_CamInToolPose[5].D());

	CString str;
	str.Format(_T("Robot-vision calibration finished. Result: roll %5.2f, pitch %5.2f, yaw %5.2f, X %5.2f, Y %5.2f, Z %5.2f")
		,transRobotCalibration.RollAngle,transRobotCalibration.PitchingAngle,transRobotCalibration.YawAngle
		,transRobotCalibration.PointX,transRobotCalibration.PointY,transRobotCalibration.PointZ);
	m_LogUtl.AddMsgInLogFile(str);

	return true;
}
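For completeness, here is a hypothetical call site tying the two steps together. The directory, the image count of 12, and the comments are placeholders of mine rather than the project's actual driver code; both methods are members of CFastRecognizeAndLocattedVisionDoc, so this would run inside that class:

// Hypothetical driver: write the intrinsics file, then run the hand-eye
// calibration over 12 previously captured image/pose pairs stored in strDir.
CString strDir = _T("D:\\HandEyeCalib\\");
TRANSFORMPOSE transCamInTool;
if (GetCameraParam(strDir) && CalculateCalibrationTransform(strDir, 12, transCamInTool))
{
	// transCamInTool now holds the camera pose in the tool frame:
	// X/Y/Z in mm, roll/pitch/yaw as Halcon pose angles (degrees).
}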

During calibration, the following are involved:

  • the forward and inverse kinematics equations of the attitude
  • the roll, pitch, and yaw angles (Roll, Pitch, Yaw, RPY)
  • Euler angles

Rotating by the roll, pitch, and yaw angles around the a, o, and n axes in sequence adjusts the robot to the desired attitude.

/// Rotation around the a axis (the z axis of the motion coordinate system) is called roll;
/// rotation around the o axis (the y axis of the motion coordinate system) is called pitch;
/// rotation around the n axis (the x axis of the motion coordinate system) is called yaw.
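Written out under one common noap convention (a sketch; the actual composition order should be verified against the robot controller's documentation):

$$R = Rot(z,\ roll) \cdot Rot(y,\ pitch) \cdot Rot(x,\ yaw)$$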

This also involves the forward and inverse conversion between the end pose (position plus RPY attitude) and the NOAP matrix. I will upload the full engineering code for this later.
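Until that code is uploaded, here is a minimal self-contained sketch of the forward conversion (my own illustration, not the project code), assuming the Rot(z)·Rot(y)·Rot(x) order above with angles in degrees:

#include <cmath>

// Minimal sketch: build the 4x4 homogeneous NOAP matrix T from a position
// (x, y, z) and RPY angles in degrees, assuming the composition
// R = Rot(z, roll) * Rot(y, pitch) * Rot(x, yaw) described above.
void PoseToNoap(double x, double y, double z,
                double rollDeg, double pitchDeg, double yawDeg,
                double T[4][4])
{
	const double d2r = 3.14159265358979323846 / 180.0;
	const double a = rollDeg  * d2r;  // rotation about the a (z) axis
	const double o = pitchDeg * d2r;  // rotation about the o (y) axis
	const double n = yawDeg   * d2r;  // rotation about the n (x) axis
	const double ca = std::cos(a), sa = std::sin(a);
	const double co = std::cos(o), so = std::sin(o);
	const double cn = std::cos(n), sn = std::sin(n);

	// the columns of the rotation part are the n, o, a axes; the last column is p
	T[0][0] = ca*co;  T[0][1] = ca*so*sn - sa*cn;  T[0][2] = ca*so*cn + sa*sn;  T[0][3] = x;
	T[1][0] = sa*co;  T[1][1] = sa*so*sn + ca*cn;  T[1][2] = sa*so*cn - ca*sn;  T[1][3] = y;
	T[2][0] = -so;    T[2][1] = co*sn;             T[2][2] = co*cn;             T[2][3] = z;
	T[3][0] = 0.0;    T[3][1] = 0.0;               T[3][2] = 0.0;               T[3][3] = 1.0;
}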

Happy ending! Corrections are welcome! Dingyi 2020/08/24

Source: blog.csdn.net/Ding86341631/article/details/108192459