PointCloudLib-1.12.1 study notes - Features - Estimating surface normals in a point cloud

Surface normals are an important property of geometric surfaces and are heavily used in many fields, e.g. in computer graphics applications, to apply the correct light sources that generate shading and other visual effects.

Given a geometric surface, it is usually trivial to infer the direction of the normal at a certain point on the surface as the vector perpendicular to the surface at that point. However, since the point cloud datasets that we acquire represent a set of point samples on a real surface, there are two possibilities:

  • Obtain the underlying surface from the acquired point cloud dataset using surface meshing techniques, and then compute the surface normals from the mesh;
  • Infer surface normals directly from point cloud datasets using approximations.

This tutorial addresses the latter: given a point cloud dataset, directly compute the surface normal for each point in the cloud.

Introduction to Theory

Although many different normal estimation methods exist, the one we will concentrate on in this tutorial is one of the simplest and is formulated as follows: the problem of determining the normal to a point on the surface is approximated by the problem of estimating the normal of a plane tangent to the surface, which in turn becomes a least-squares plane fitting estimation problem.

Notice

For more information, including the mathematical equations of the least-squares problem, see [RusuDissertation].

The solution for estimating the surface normal is therefore reduced to an analysis of the eigenvectors and eigenvalues (or PCA - Principal Component Analysis) of a covariance matrix created from the nearest neighbors of the query point. More specifically, for each point \boldsymbol{p}_i, we assemble the covariance matrix \mathcal{C} as follows:

 

\mathcal{C} = \frac{1}{k}\sum_{i=1}^{k}{(\boldsymbol{p}_i-\overline{\boldsymbol{p}})\cdot(\boldsymbol{p}_i-\overline{\boldsymbol{p}})^{T}}, ~\mathcal{C} \cdot \vec{{\mathsf v}_j} = \lambda_j \cdot \vec{{\mathsf v}_j},~ j \in \{0, 1, 2\}

where k is the number of point neighbors considered in the neighborhood of \boldsymbol{p}_i, \overline{\boldsymbol{p}} represents the 3D centroid of the nearest neighbors, \lambda_j is the j-th eigenvalue of the covariance matrix, and \vec{{\mathsf v}_j} the j-th eigenvector.

To estimate the covariance matrix from a set of points in PCL, you can use:

  // Placeholder for the 3x3 covariance matrix at each surface patch
  Eigen::Matrix3f covariance_matrix;
  // 16-bytes aligned placeholder for the XYZ centroid of a surface patch
  Eigen::Vector4f xyz_centroid;

  // Estimate the XYZ centroid
  compute3DCentroid (cloud, xyz_centroid);

  // Compute the 3x3 covariance matrix
  computeCovarianceMatrix (cloud, xyz_centroid, covariance_matrix);
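For illustration, the same centroid and covariance computation can be sketched in plain C++ without PCL or Eigen, mirroring the formula above term by term. The `Vec3`/`Mat3` aliases and function names below are ours, not part of the PCL API:

```cpp
#include <array>
#include <vector>

using Vec3 = std::array<double, 3>;
using Mat3 = std::array<std::array<double, 3>, 3>;

// Centroid of a point set: p_bar = (1/k) * sum(p_i)
Vec3 centroid(const std::vector<Vec3>& pts) {
  Vec3 c{0.0, 0.0, 0.0};
  for (const auto& p : pts)
    for (int d = 0; d < 3; ++d) c[d] += p[d];
  for (int d = 0; d < 3; ++d) c[d] /= static_cast<double>(pts.size());
  return c;
}

// Covariance matrix: C = (1/k) * sum((p_i - p_bar) * (p_i - p_bar)^T)
Mat3 covariance(const std::vector<Vec3>& pts) {
  const Vec3 c = centroid(pts);
  Mat3 C{};  // zero-initialized
  for (const auto& p : pts)
    for (int r = 0; r < 3; ++r)
      for (int col = 0; col < 3; ++col)
        C[r][col] += (p[r] - c[r]) * (p[col] - c[col]);
  for (int r = 0; r < 3; ++r)
    for (int col = 0; col < 3; ++col)
      C[r][col] /= static_cast<double>(pts.size());
  return C;
}
```

For points spread symmetrically in the XY plane, the resulting matrix is diagonal with a zero third eigenvalue, which is exactly why the smallest-eigenvalue eigenvector serves as the normal.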

In general, because there is no mathematical way to solve for the sign of the normal, its orientation computed via PCA as shown above is ambiguous, and not consistently oriented over an entire point cloud dataset. The figure below presents this effect on two sections of a larger dataset representing a kitchen environment. The right part of the figure shows the Extended Gaussian Image (EGI), also known as the normal sphere, which describes the orientation of all normals from the point cloud. Since the dataset is 2.5D and was thus acquired from a single viewpoint, the normals should be present only on one half of the EGI sphere. However, due to their inconsistent orientation, they are spread across the entire sphere.

_images/unflipped_scene1.jpg _images/unflipped_scene2.jpg _images/unflipped_sphere.jpg

 

If the viewpoint {\mathsf v}_p is in fact known, then the solution to this problem is trivial. To orient all normals \vec{\boldsymbol{n}}_i consistently towards the viewpoint, they need to satisfy the equation:

\vec{\boldsymbol{n}}_i \cdot ({\mathsf v}_p - \boldsymbol{p}_i) > 0

The images below present the results after all normals in the datasets above have been consistently oriented towards the viewpoint.

_images/flipped_scene1.jpg _images/flipped_scene2.jpg _images/flipped_sphere.jpg

 

To manually reorient a given point normal in PCL, you can use:

flipNormalTowardsViewpoint (const PointT &point, float vp_x, float vp_y, float vp_z, Eigen::Vector4f &normal);
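As a minimal stand-alone sketch (not PCL's implementation), the flip test amounts to checking the sign of the dot product from the inequality above; the `Vec3` alias and function name below are illustrative:

```cpp
#include <array>

using Vec3 = std::array<double, 3>;

// Flip n in place if it fails the test n . (vp - p) > 0,
// i.e. if it points away from the viewpoint vp.
void flip_towards_viewpoint(Vec3& n, const Vec3& p, const Vec3& vp) {
  double dot = 0.0;
  for (int d = 0; d < 3; ++d) dot += n[d] * (vp[d] - p[d]);
  if (dot < 0.0)
    for (int d = 0; d < 3; ++d) n[d] = -n[d];
}
```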

Notice

If the dataset has multiple acquisition viewpoints, then the above method for normal reorientation does not hold, and more complex algorithms need to be implemented. Please see [RusuDissertation] for more information.

Choose the right scale

As previously mentioned, the surface normal at a point needs to be estimated from its surrounding point neighborhood support (also called the k-neighborhood).

The specifics of the nearest-neighbor estimation problem raise the question of the right scale factor: given a sampled point cloud dataset, what is the correct k (given via pcl::Feature::setKSearch) or r (given via pcl::Feature::setRadiusSearch) value that should be used to determine a point's set of nearest neighbors?

This issue is of extreme importance and constitutes a limiting factor in the automatic estimation of point features (i.e., without a user-given threshold). To better illustrate it, the figure below presents the effects of selecting a smaller scale (i.e., small r or k) versus a larger scale (i.e., large r or k). The left part of the figure depicts a reasonably well-chosen scale factor, with estimated surface normals approximately perpendicular to the two planar surfaces, and small edges visible over the entire table. If the scale factor is too big however (right part), so that the set of neighbors covers points from adjacent surfaces, the estimated point feature representation gets distorted, with rotated surface normals at the edges of the two planar surfaces, smeared edges, and suppressed fine details.

_images/normals_different_radii.jpg _images/curvature_different_radii.jpg

 

Without getting into too many details, it suffices to say that, for now, the scale which determines a point's neighborhood has to be selected based on the level of detail required by the application. Simply put, if the curvature at the edge between the handle of a mug and its cylindrical part is important, the scale factor needs to be small enough to capture those details, and large otherwise.
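The two neighborhood definitions discussed here (k nearest neighbors vs. all neighbors within radius r) can be sketched brute-force in plain C++; PCL's kd-tree does the same job far more efficiently, and the names below are ours, not PCL's:

```cpp
#include <algorithm>
#include <array>
#include <vector>

using Vec3 = std::array<double, 3>;

// Squared Euclidean distance between two points.
double dist2(const Vec3& a, const Vec3& b) {
  double s = 0.0;
  for (int d = 0; d < 3; ++d) s += (a[d] - b[d]) * (a[d] - b[d]);
  return s;
}

// Radius search (r-neighborhood): indices of all points within r of the query.
std::vector<int> radius_search(const std::vector<Vec3>& cloud,
                               const Vec3& q, double r) {
  std::vector<int> idx;
  for (int i = 0; i < static_cast<int>(cloud.size()); ++i)
    if (dist2(cloud[i], q) <= r * r) idx.push_back(i);
  return idx;
}

// k-nearest search (k-neighborhood): indices of the k closest points.
std::vector<int> knn_search(const std::vector<Vec3>& cloud,
                            const Vec3& q, int k) {
  std::vector<int> idx(cloud.size());
  for (int i = 0; i < static_cast<int>(idx.size()); ++i) idx[i] = i;
  std::sort(idx.begin(), idx.end(), [&](int a, int b) {
    return dist2(cloud[a], q) < dist2(cloud[b], q);
  });
  idx.resize(std::min<std::size_t>(k, idx.size()));
  return idx;
}
```

Note the practical difference: a radius search may return any number of neighbors (including too few on sparse data), while a k-search always returns k points, possibly from far away.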

Estimating the normals

Although a normal estimation example has already been given in the Features tutorial, we will revisit it here to better explain what goes on under the hood.

The following code snippet will estimate a set of surface normals for all the points in the input dataset.

#include <pcl/point_types.h>
#include <pcl/features/normal_3d.h>

{
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud (new pcl::PointCloud<pcl::PointXYZ>);

  ... read, pass in or create a point cloud ...

  // Create the normal estimation class, and pass the input dataset to it
  pcl::NormalEstimation<pcl::PointXYZ, pcl::Normal> ne;
  ne.setInputCloud (cloud);

  // Create an empty kdtree representation, and pass it to the normal estimation object.
  // Its content will be filled inside the object, based on the given input dataset (as no other search surface is given).
  pcl::search::KdTree<pcl::PointXYZ>::Ptr tree (new pcl::search::KdTree<pcl::PointXYZ> ());
  ne.setSearchMethod (tree);

  // Output datasets
  pcl::PointCloud<pcl::Normal>::Ptr cloud_normals (new pcl::PointCloud<pcl::Normal>);

  // Use all neighbors in a sphere of radius 3cm
  ne.setRadiusSearch (0.03);

  // Compute the features
  ne.compute (*cloud_normals);

  // cloud_normals->size () should have the same size as the input cloud->size ()
}

What the NormalEstimation class does internally for the actual compute call is nothing but:

for each point p in cloud P

  1. get the nearest neighbors of p

  2. compute the surface normal n of p

  3. check if n is consistently oriented towards the viewpoint and flip otherwise

The viewpoint defaults to (0, 0, 0) and can be changed by:

setViewPoint (float vpx, float vpy, float vpz);

To compute a single point normal, use:

computePointNormal (const pcl::PointCloud<PointInT> &cloud, const std::vector<int> &indices, Eigen::Vector4f &plane_parameters, float &curvature);

where cloud is the input point cloud that contains the points, indices represents the set of k-nearest neighbors from cloud, and plane_parameters and curvature represent the output of the normal estimation: plane_parameters holds the normal (nx, ny, nz) on its first 3 coordinates, and the fourth coordinate is D = nc . p_plane (centroid here) + p. The output surface curvature is estimated as a relationship between the eigenvalues of the covariance matrix (as presented above), as:

 

\sigma = \frac{\lambda_0}{\lambda_0 + \lambda_1 + \lambda_2}
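This curvature estimate is a one-line computation once the eigenvalues are known; a sketch in plain C++ (the function name is ours, and lambda_0 is taken as the smallest eigenvalue, as in the formula above):

```cpp
#include <algorithm>
#include <array>

// Surface curvature estimate:
//   sigma = lambda_0 / (lambda_0 + lambda_1 + lambda_2)
// where lambda_0 is the smallest eigenvalue of the covariance matrix.
double curvature(std::array<double, 3> lambdas) {
  std::sort(lambdas.begin(), lambdas.end());  // lambdas[0] = smallest
  return lambdas[0] / (lambdas[0] + lambdas[1] + lambdas[2]);
}
```

A perfectly planar neighborhood has lambda_0 = 0 and therefore sigma = 0; the maximum possible value is 1/3, when all three eigenvalues are equal.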

Speeding up normal estimation with OpenMP

For speed-savvy users, PCL provides an additional implementation of surface normal estimation which uses a multi-core/multi-threaded paradigm via OpenMP to speed up the computation. The name of the class is pcl::NormalEstimationOMP, and its API is identical to the single-threaded pcl::NormalEstimation, which makes it suitable as a drop-in replacement. On a system with 8 cores, you should get anything between 6-8 times faster computation times.

Notice

If your dataset is organized (e.g. acquired with a TOF camera, stereo camera, etc. - that is, it has a width and a height), for faster results, see Normal Estimation Using Integral Images .


Origin blog.csdn.net/linux_huangyu/article/details/131975755