Camera Calibration (Recovering 3D Coordinates from Pixel Coordinates)
Deriving the formula
Obtaining the camera intrinsics and extrinsics
Implementing the pixel-to-3D computation in code
**Camera calibration (the coordinate-system transformations in the geometric sense):** link.
The camera-calibration formula is shown in the figure below, where s is the unknown scale factor we need to solve for. The intrinsics and extrinsics are obtained with Matlab; remember to transpose the intrinsic matrix that Matlab returns.
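Since the figure is not reproduced here, this is the projection equation it refers to (the standard pinhole model, writing $K$ for the intrinsic matrix and $R$, $T$ for the extrinsic rotation and translation):

$$
s\begin{bmatrix}u\\v\\1\end{bmatrix}
= K\left(R\begin{bmatrix}X_w\\Y_w\\Z_w\end{bmatrix} + T\right)
$$

Here $(u, v)$ is the pixel coordinate and $(X_w, Y_w, Z_w)$ is the world coordinate of the same point.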
For a 3×3 matrix whose determinant is nonzero, the transformation does not compress the space (reduce its dimension), so multiplying the transformation matrix by its inverse restores the space: the product is the identity, a transformation that does nothing.
First, left-multiply both sides of the equation by the inverse of the intrinsic matrix and then by the inverse of the rotation matrix.
Writing $A = R^{-1}K^{-1}$ and $b = R^{-1}T$, the third row of the resulting equation reads $s\,(a_{31}u + a_{32}v + a_{33}) = Z_w + b_3$. We usually take $Z_w = 0$, because the plane of the calibration board defines the world origin; s can then be solved for.
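Written out in matrix form (same symbols as above), the rearrangement and the resulting closed form for the scale factor are:

$$
s\,R^{-1}K^{-1}\begin{bmatrix}u\\v\\1\end{bmatrix}
= \begin{bmatrix}X_w\\Y_w\\Z_w\end{bmatrix} + R^{-1}T
\quad\Longrightarrow\quad
s = \frac{Z_w + \left(R^{-1}T\right)_3}{\left(R^{-1}K^{-1}[u,\,v,\,1]^{\mathsf T}\right)_3}
$$

With $s$ known, the world point follows directly as $[X_w, Y_w, Z_w]^{\mathsf T} = R^{-1}\left(s\,K^{-1}[u,\,v,\,1]^{\mathsf T} - T\right)$.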
The code that solves for the scale factor s and the 3D coordinates is as follows:
#include <opencv2/opencv.hpp>
#include <iostream>

/** @brief Recover the 3D coordinates of a pixel lying on the calibration-board plane
 * @param u pixel coordinate u
 * @param v pixel coordinate v
 * @param p the resulting 3D world coordinates
 */
int camera2CalibrationPlate(double &u, double &v, cv::Point3d &p)
{
	double Zw = 0; // the calibration-board plane defines Zw = 0
	double s;
	cv::Mat_<double> R = (cv::Mat_<double>(3, 3) << // extrinsic rotation matrix
		0.999688578967552, 0.02231098209332011, -0.01117878168084797,
		-0.01754908117277992, 0.9470136158049239, 0.3207136436606262,
		0.01774189482066936, -0.3204175893394672, 0.9471102330827478);
	cv::Mat_<double> T = (cv::Mat_<double>(3, 1) << // extrinsic translation vector
		-46.3872746240076,
		-12.0702827953298,
		238.561423323864);
	cv::Mat imagePoint = cv::Mat::ones(3, 1, CV_64F); // homogeneous pixel vector [u, v, 1]^T
	imagePoint.at<double>(0, 0) = u;
	imagePoint.at<double>(1, 0) = v;
	std::cout << "imagePoint = " << imagePoint << std::endl;
	cv::Mat_<double> I = (cv::Mat_<double>(3, 3) << // camera intrinsic matrix
		2201.26949422718, 0, 613.845238974861,
		0, 2202.10672781966, 485.708808088470,
		0, 0, 1);
	// Left side: R^-1 * K^-1 * [u, v, 1]^T;  right side: R^-1 * T
	cv::Mat leftSideMat = R.inv() * I.inv() * imagePoint;
	cv::Mat rightSideMat = R.inv() * T;
	// Third row of  s * R^-1 * K^-1 * p = [Xw, Yw, Zw]^T + R^-1 * T,  with Zw = 0
	s = (Zw + rightSideMat.at<double>(2, 0)) / leftSideMat.at<double>(2, 0);
	std::cout << "s = " << s << std::endl;
	// Compute the world coordinates
	cv::Mat wcPoint = R.inv() * (s * I.inv() * imagePoint - T);
	std::cout << "wcPoint = " << wcPoint << std::endl;
	cv::Point3d worldPoint(wcPoint.at<double>(0, 0), wcPoint.at<double>(1, 0), wcPoint.at<double>(2, 0));
	p = worldPoint;
	std::cout << "worldPoint = " << worldPoint << std::endl;
	return 0;
}
The content comes from the web combined with my own understanding; if there is any infringement, contact me and it will be removed.