livox_camera_lidar_calibration learning -- camera intrinsic parameter calibration

The open-source code is located at GitHub: Shelfcol/livox_camera_lidar_calibration_modified (improvements to livox_camera_lidar_calibration)

The relevant code is cameraCalib.cpp

Run: roslaunch camera_lidar_calibration cameraCalib.launch

1. Capture and read the checkerboard photos (23 in this example)

        Photo capture: prepare more than 20 photos, covering a range of angles and positions. Do not shoot too close to the board (keep a distance of about 3 meters), as shown in the picture:

        Mat imageInput = imread(filename);
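A minimal sketch of the image-reading loop is given below. The photo directory, the file-naming scheme and the image count are assumptions for illustration only; in the actual node the photo paths come from the launch-file parameters.

// Minimal sketch of reading the calibration photos.
// The directory "data/photos/" and the "<index>.bmp" naming scheme are assumptions for illustration.
#include <opencv2/opencv.hpp>
#include <iostream>
#include <string>
#include <vector>

int main() {
    std::vector<cv::Mat> images;
    const std::string photo_dir = "data/photos/";  // assumed photo directory
    const int image_count = 23;                    // number of checkerboard photos

    for (int i = 1; i <= image_count; ++i) {
        std::string filename = photo_dir + std::to_string(i) + ".bmp";  // assumed naming scheme
        cv::Mat imageInput = cv::imread(filename);
        if (imageInput.empty()) {  // imread returns an empty Mat when the file cannot be read
            std::cerr << "Failed to read " << filename << std::endl;
            continue;
        }
        images.push_back(imageInput);
    }
    std::cout << "Loaded " << images.size() << " images" << std::endl;
    return 0;
}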

2. Extract the corner points of the checkerboard

        Size board_size = Size(row_number, col_number); /* Number of corner points in each row and column on the calibration board*/

        vector<Point2f> image_points_buf; /* Cache the corner points detected on each image*/

        0 == findChessboardCorners(imageInput, board_size, image_points_buf) // returns 0 (false) when the pattern cannot be found; such images are skipped

3. Refine the corner points to sub-pixel accuracy in the images where corner extraction succeeded.

        Mat view_gray; // Save the corresponding grayscale image

        cvtColor (imageInput, view_gray, cv::COLOR_RGB2GRAY); // Convert to grayscale image

        /* Sub-pixel refinement */

        // image_points_buf holds the initial corner coordinates and also receives the refined sub-pixel coordinates as output

        // Size(5, 5): half of the search window side length (an 11x11 window here)

        // Size(-1, -1): no dead zone

        // TermCriteria: termination condition for the iterative refinement, a combination of the maximum iteration count and the corner position accuracy

        cornerSubPix(view_gray, image_points_buf, Size(5, 5), Size(-1, -1), TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::MAX_ITER, 30, 0.1));

        (The specific principles will be added later)

        drawChessboardCorners(view_gray, board_size, image_points_buf, false); // Mark the detected corners on the image; the last argument indicates whether the complete pattern was found
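The refined corners of every image have to be accumulated into image_points_seq, which calibrateCamera uses later. Below is a minimal sketch of that per-image loop, assuming the images have already been loaded into a vector; the function name detectCorners and the images/point_counts containers are illustrative names, while the OpenCV calls are the ones shown above.

// Sketch of the per-image corner-detection loop (steps 2 and 3 combined).
// detectCorners and the `images` container are illustrative names; the OpenCV calls match the fragments above.
#include <opencv2/opencv.hpp>
#include <vector>
using namespace cv;
using namespace std;

void detectCorners(const vector<Mat>& images, Size board_size,
                   vector<vector<Point2f>>& image_points_seq, vector<int>& point_counts) {
    for (const Mat& imageInput : images) {
        vector<Point2f> image_points_buf;
        if (0 == findChessboardCorners(imageInput, board_size, image_points_buf)) {
            continue;  // pattern not found, skip this image
        }
        Mat view_gray;
        cvtColor(imageInput, view_gray, cv::COLOR_RGB2GRAY);  // grayscale image for the refinement step
        cornerSubPix(view_gray, image_points_buf, Size(5, 5), Size(-1, -1),
                     TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::MAX_ITER, 30, 0.1));
        image_points_seq.push_back(image_points_buf);                   // refined corners of this image
        point_counts.push_back(board_size.width * board_size.height);  // number of corners per image
    }
}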

4. Set the assumed world coordinates of the corner points according to the width and height of the checkerboard pattern.

        The height of board_size is the number of corner points in the vertical direction of the checkerboard, and width is the number of corner points in the horizontal direction. Note that only the inner corners are counted; the outermost ring of squares does not contribute corners. The example image corresponds to width = 8, height = 6. The size of each square is Size square_size(w, h), where w and h are in mm.

        It is assumed that the upper-left corner point has coordinates (0, 0), and the coordinates of the other corner points are computed from the square size. The calibration board is also assumed to lie on the plane z = 0 of the world coordinate system. The loop below builds these object points for every image.

/* Initialize the 3D coordinates of the corner points on the calibration board, starting at (0, 0); the square dimensions are determined by measurement */
int i, j, t;
for (t = 0; t < image_count; t++) { // iterate over all images
    vector<Point3f> tempPointSet;
    for (i = 0; i < board_size.height; i++) {
        for (j = 0; j < board_size.width; j++) {
            Point3f realPoint;

            /* Assume the calibration board lies on the plane z = 0 of the world coordinate system */
            realPoint.x = i * square_size.width;
            realPoint.y = j * square_size.height; // x to the right, y downward
            realPoint.z = 0;
            tempPointSet.push_back(realPoint);
        }
    }
    object_points.push_back(tempPointSet);
}

5. Compute the camera intrinsic parameters from the extracted image corner points and the assumed world-coordinate corner points

/* Start calibration */
// object_points    3D coordinates of the corner points in the world coordinate system
// image_points_seq image coordinates of each inner corner point
// image_size       pixel size of the images
// cameraMatrix     output, intrinsic matrix
// distCoeffs       output, distortion coefficients
// rvecsMat         output, rotation vectors
// tvecsMat         output, translation vectors
// 0                flags selecting the calibration algorithm
calibrateCamera(object_points, image_points_seq, image_size, cameraMatrix, distCoeffs, rvecsMat, tvecsMat, 0);
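Note that calibrateCamera also returns the overall RMS re-projection error directly. A minimal sketch of capturing it, reusing the variables from the call above:

// calibrateCamera returns the overall RMS re-projection error in pixels,
// which gives a quick sanity check before the manual error computation in step 6.
double rms = calibrateCamera(object_points, image_points_seq, image_size,
                             cameraMatrix, distCoeffs, rvecsMat, tvecsMat, 0);
cout << "RMS re-projection error reported by calibrateCamera: " << rms << " pixel" << endl;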

6. Re-project the 3D points with the computed parameters to evaluate the calibration error

/* Use the obtained camera intrinsic and extrinsic parameters to re-project the assumed 3D points; the new projected points are saved in image_points2 */

Inputs: the assumed world points, the rotation and translation vectors corresponding to that image, the camera intrinsic matrix, and the distortion coefficients. Output: the point coordinates projected onto the image plane.

projectPoints(tempPointSet, rvecsMat[i], tvecsMat[i], cameraMatrix, distCoeffs, image_points2);

A bit of background knowledge here:

         CV_<bit_depth>(S|U|F)C<number_of_channels>

         CV_32FC2 represents a 32-bit floating-point, two-channel type
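A small self-contained illustration of this type macro, mirroring how the error-computation code below packs 2D points into a two-channel Mat (the values are arbitrary):

// CV_32FC2: a two-channel, 32-bit float matrix whose elements are accessed as Vec2f,
// i.e. one (x, y) pair per element.
#include <opencv2/opencv.hpp>
#include <iostream>
using namespace cv;

int main() {
    Mat points(1, 3, CV_32FC2);                  // 1 row, 3 elements, 2 floats per element
    points.at<Vec2f>(0, 0) = Vec2f(1.0f, 2.0f);  // store an (x, y) pair
    points.at<Vec2f>(0, 1) = Vec2f(3.0f, 4.0f);
    points.at<Vec2f>(0, 2) = Vec2f(5.0f, 6.0f);
    std::cout << "channels = " << points.channels() << std::endl;  // prints 2
    return 0;
}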

double total_err = 0.0;         /* sum of the average errors of all images */
double err = 0.0;               /* average error of a single image */
vector<Point2f> image_points2;  /* stores the re-computed (re-projected) points */
fout << "Average error: \n";

for (i = 0; i < image_count; i++) {
    vector<Point3f> tempPointSet = object_points[i];

    /* Re-project the assumed 3D points with the obtained camera intrinsic and extrinsic parameters; the new projected points are stored in image_points2 */
    projectPoints(tempPointSet, rvecsMat[i], tvecsMat[i], cameraMatrix, distCoeffs, image_points2);

    /* Compute the error between the new projected points and the originally detected points */
    vector<Point2f> tempImagePoint = image_points_seq[i]; // detected corner coordinates of the i-th image
    Mat tempImagePointMat = Mat(1, tempImagePoint.size(), CV_32FC2); // two-channel data, each element stores a Vec2f
    Mat image_points2Mat  = Mat(1, image_points2.size(),  CV_32FC2);

    for (unsigned int j = 0; j < tempImagePoint.size(); j++) {
        image_points2Mat.at<Vec2f>(0, j) = Vec2f(image_points2[j].x, image_points2[j].y);
        tempImagePointMat.at<Vec2f>(0, j) = Vec2f(tempImagePoint[j].x, tempImagePoint[j].y);
    }
    err = norm(image_points2Mat, tempImagePointMat, NORM_L2); // L2 distance between the two point sets
    total_err += err /= point_counts[i];                      // divide by the corner count to get this image's average error
    fout << "The error of picture " << i + 1 << " is " << err << " pixel" << endl;
}
fout << "Overall average error is: " << total_err / image_count << " pixel" << endl << endl;

7. Calibration results

        One 3x3 intrinsic matrix and 5 distortion coefficients (k1, k2, p1, p2, k3)
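A minimal sketch of reading the individual parameters out of the results, assuming cameraMatrix and distCoeffs are the CV_64F matrices produced by calibrateCamera above (with the default flags distCoeffs is a 1x5 matrix):

// cameraMatrix (3x3, CV_64F):  [ fx  0  cx ]
//                              [  0 fy  cy ]
//                              [  0  0   1 ]
// distCoeffs (1x5, CV_64F) holds (k1, k2, p1, p2, k3) with the default flags.
double fx = cameraMatrix.at<double>(0, 0);
double fy = cameraMatrix.at<double>(1, 1);
double cx = cameraMatrix.at<double>(0, 2);
double cy = cameraMatrix.at<double>(1, 2);
cout << "fx = " << fx << ", fy = " << fy << ", cx = " << cx << ", cy = " << cy << endl;
cout << "distortion coefficients: " << distCoeffs << endl;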

8. Calibrating the camera intrinsic parameters with Matlab

        The cameraCalibrator tool in Matlab can also calibrate the intrinsic parameters; we only need items 1, 2 and 11 of its results.


Original post: blog.csdn.net/qq_38650944/article/details/124120612