Errors when converting a depth image to a point cloud: mind the data type when reading cv::Mat pixel values


I recently wrote a ROS service that computes the 3D coordinate of a point and the average surface normal from a depth image and the camera parameters, but the computed surface normals were frequently nan. The cause turned out to be using the wrong data type when extracting values from the depth image.

// Suppose I have a cv::Mat B encoded as CV_32FC1.
// The F in CV_32F means the elements are float, so:
value = B.at<float>(i,j);
value = B.ptr<float>(i)[j];

// Suppose I have a cv::Mat C encoded as CV_8UC1.
// The U in CV_8U means the elements are unsigned char, so:
value = C.at<uchar>(i,j);
value = C.ptr<uchar>(i)[j];
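
One way to avoid this mix-up is to check the matrix's element type at runtime before reading pixels. Below is a minimal sketch using cv::Mat::type(); the helper name readDepth and the CV_Assert check are my own additions, not part of the original code:

#include <opencv2/core.hpp>

// Read a depth value only after confirming the element type.
float readDepth(const cv::Mat &depth, int i, int j)
{
  // A 32-bit float, single-channel depth image must be CV_32FC1;
  // reading it with the wrong template argument silently returns garbage.
  CV_Assert(depth.type() == CV_32FC1);
  return depth.at<float>(i, j);
}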

This is exactly what I mixed up: the depth values I read out were huge, so the surface normal computation produced lots of nan values. The code that reads the depth values and converts them to a point cloud is shown below:

void GraspPoseConverter::getSurfNorm3D(){
  /*
    First convert all points in the image to 3D coordinates,
    and then compute their surface normals.
  */
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud (new pcl::PointCloud<pcl::PointXYZ>);
  // loop over the whole depth map
  for (int m = 0; m < minDepthPointRegion.rows; m++){
    for (int n = 0; n < minDepthPointRegion.cols; n++){
      // get the value at (m,n) in the depth image
      double d = (double) minDepthPointRegion.at<float>(m, n);
      std::cout<<"d is: "<<d<<std::endl;
      // if d has no value, skip
      if (d == 0){
        continue;
      }
      // if d has value, add a point
      pcl::PointXYZ p;

      // compute the 3D coordinate of this point
      p.z = d / camera_factor;
      p.x = (n - camera_cx) * p.z / camera_fx;
      p.y = (m - camera_cy) * p.z / camera_fy;
      std::cout<<"pz is: "<<p.z<<std::endl;
      std::cout<<"px is: "<<p.x<<std::endl;
      std::cout<<"py is: "<<p.y<<std::endl;
      // add p into cloud
      cloud->points.push_back(p);
    }
  }
  // set some params
  cloud->height = 1;
  cloud->width = cloud->points.size();
  cloud->is_dense = false;
  std::cout<<"cloud contains num of points: "<<cloud->points.size()<<std::endl;
