CloudCompare - M3C2 calculates the robust distance between two point clouds


This article was originally written by the CSDN author Dianyunxia; the link to the original post is given at the end of this article. Scraper sites, please respect the author.

M3C2 (plugin)

1. Overview

  The 'Plugins -> M3C2 Distance' function in CloudCompare is a dedicated method for computing robust distances between two point clouds. Starting with CloudCompare version 2.9, the M3C2 plugin also includes the "precision maps" variant (M3C2-PM) of James et al. (2017), for which a precision estimate is already available for each point and does not need to be derived from a roughness estimate. M3C2-PM is particularly well suited to point clouds generated by photogrammetric processing.

2. Calculation principle

  The M3C2 point-cloud comparison algorithm proceeds in several steps: selecting the core points, computing the local surface normals, computing the point-cloud distance, and determining the confidence interval of the spatial change, as shown in Figure 1. The algorithm can detect changes in complex terrain directly on the point cloud, without any gridding, and the computed change is only weakly affected by spatial point density, surface roughness and differences in sampling position.

Figure 1 M3C2 algorithm principle

  Since selecting an appropriate set of core points can significantly improve computational efficiency, the first stage of the computation downsamples the original data at a chosen spacing to obtain a low-density, evenly distributed set of core points, which serves as the basis for change detection (Figure 1(a)). This step markedly reduces the time required for the computation.
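
  As an illustration of this downsampling step, the following is a minimal sketch of greedy minimum-distance subsampling in plain NumPy/SciPy (not CloudCompare's own code; the function name and spacing value are invented for the example).

```python
# Minimal sketch: keep a point only if no previously kept point lies within
# the requested spacing, yielding sparse, evenly spaced core points.
import numpy as np
from scipy.spatial import cKDTree

def subsample_core_points(points: np.ndarray, spacing: float) -> np.ndarray:
    """points: (N, 3) XYZ array; returns the retained core points."""
    tree = cKDTree(points)
    kept = np.zeros(len(points), dtype=bool)
    suppressed = np.zeros(len(points), dtype=bool)
    for i in range(len(points)):
        if suppressed[i]:
            continue
        kept[i] = True
        # Suppress all neighbours closer than the requested spacing.
        for j in tree.query_ball_point(points[i], r=spacing):
            if j != i:
                suppressed[j] = True
    return points[kept]

# Example usage with random data standing in for a real scan:
cloud = np.random.rand(10000, 3) * 10.0
core_points = subsample_core_points(cloud, spacing=0.5)
print(len(cloud), "->", len(core_points), "core points")
```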
  After a suitable set of core points has been selected, for any given core point $P_{core}$ a plane is fitted to the neighbouring points within a radius of $D/2$, and the local normal vector $N$ at that point is obtained from this fit. As shown in Figure 1(b), the distances of all points within radius $D/2$ of $P_{core}$ to the best-fit plane are recorded, and their standard deviation is used as the roughness $\sigma(D)$, that is

$$\sigma(D)=\sqrt{\frac{\sum_{k=1}^{M}(a_k-\bar{a})^2}{M}}\tag{1}$$

In formula (1), $a_k$ is the distance from the $k$-th point within radius $D/2$ to the best-fit plane; $\bar{a}$ is the average distance of all points within radius $D/2$ to the best-fit plane; and $M$ is the total number of points within radius $D/2$.
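
  A minimal sketch of this step, using a standard PCA plane fit in NumPy (illustrative only, not the plugin's implementation; the function name is invented for the example):

```python
# Fit a plane to the neighbours of a core point within radius D/2, take its
# normal N, and compute the roughness sigma(D) of formula (1).
import numpy as np

def local_normal_and_roughness(neighbours: np.ndarray):
    """neighbours: (M, 3) array of points within D/2 of the core point."""
    centroid = neighbours.mean(axis=0)
    # Best-fit plane normal = eigenvector of the covariance matrix with the
    # smallest eigenvalue (standard PCA plane fit).
    cov = np.cov((neighbours - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    normal = eigvecs[:, 0]
    # Signed distances a_k of each neighbour to the best-fit plane.
    a = (neighbours - centroid) @ normal
    sigma_D = np.sqrt(np.mean((a - a.mean()) ** 2))  # formula (1)
    return normal, sigma_D
```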
  Starting from $P_{core}$, a cylinder is then constructed with projection radius $d/2$, passing through $P_{core}$ and with the normal vector $N$ as its axis, intersecting the two epochs of point clouds. All the points of each epoch contained in the cylinder, $n_1$ and $n_2$ in number, are found, and the average position of each epoch along the normal vector is computed; these two average positions, $M_1$ and $M_2$, differ by the distance $L_{M3C2}$, which is the change of the point cloud at $P_{core}$. Repeating this operation over the whole set of core points until all points have been traversed yields the change over the entire target area, as shown in Figure 1(c). During the computation, the choice of parameters directly affects the result: the projection radius $d/2$, the normal radius $D/2$ and the maximum search depth $H$ are the three main parameters that govern accuracy and efficiency.
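
  The per-core-point distance just described can be sketched as follows (plain NumPy, illustrative only; the names `m3c2_distance`, `d` and `max_depth` are chosen for this example):

```python
# Project both epochs into a cylinder of radius d/2 oriented along the local
# normal N, average their positions along N, and take the difference as L_M3C2.
import numpy as np

def m3c2_distance(core_point, normal, cloud1, cloud2, d, max_depth):
    """normal is assumed to be a unit vector; cloud1/cloud2 are (N, 3) arrays."""
    def along_normal(cloud):
        rel = cloud - core_point
        axial = rel @ normal                       # position along the cylinder axis
        radial = np.linalg.norm(rel - np.outer(axial, normal), axis=1)
        inside = (radial <= d / 2.0) & (np.abs(axial) <= max_depth)
        return axial[inside]
    a1, a2 = along_normal(cloud1), along_normal(cloud2)
    if len(a1) == 0 or len(a2) == 0:
        return np.nan, len(a1), len(a2)            # no equivalent points: NaN distance
    return a2.mean() - a1.mean(), len(a1), len(a2)
```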
  As shown in Figure 1(d), after the point-cloud distance has been computed, a spatial confidence interval is further determined in order to estimate the measurement accuracy of the local change and to reduce the probability of misclassifying changes because of various errors. Under the assumption that the measurement errors follow independent Gaussian distributions, when $n_1, n_2 \geq 30$ a two-tailed z-test is used to compute the confidence interval at the 0.95 confidence level, i.e. the level of detection (LOD)

$$LOD_{0.95}(d)=\pm1.96\left(\sqrt{\frac{\sigma_1(d)^2}{n_1}+\frac{\sigma_2(d)^2}{n_2}}+REG\right)\tag{2}$$

In formula (2), $n_1$ and $n_2$ are the numbers of points of the two epochs of point clouds found within the cylinder of projection radius $d/2$; $REG$ is the registration error between the two epochs; and $LOD_{0.95}(d)$ is the minimum detectable change at the 0.95 confidence level for projection radius $d/2$. The registration error $REG$ of the two point clouds is computed as

$$REG=\sqrt{(RMSE_1)^2+(RMSE_2)^2}\tag{3}$$

In formula (3), $RMSE_1$ is the root mean square error of the reference point cloud and $RMSE_2$ is the root mean square error of the compared point cloud.
  When $4 < n_1, n_2 < 30$, the confidence interval is computed with a two-tailed t-test instead, and the degrees of freedom $DF$ are obtained from

$$DF=\left(\frac{\sigma_1(d)^2}{n_1}+\frac{\sigma_2(d)^2}{n_2}\right)\Big/\left(\frac{\sigma_1(d)^4/(n_1-1)}{n_1}+\frac{\sigma_2(d)^4/(n_2-1)}{n_2}\right)\tag{4}$$

When $n_1$ or $n_2$ is less than 4, no confidence interval is computed.
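
  A minimal sketch of formulas (2) and (3), assuming the z-test case ($n_1, n_2 \geq 30$); the function names are invented for the example and this is not the plugin's code:

```python
from math import sqrt

def registration_error(rmse1: float, rmse2: float) -> float:
    """Formula (3): combine the RMSE of both epochs in quadrature."""
    return sqrt(rmse1 ** 2 + rmse2 ** 2)

def lod95(sigma1: float, sigma2: float, n1: int, n2: int, reg: float) -> float:
    """Formula (2): minimum detectable change at 95% confidence (z-test case)."""
    if n1 < 4 or n2 < 4:
        return float("nan")          # too few points: no confidence interval
    # For 4 < n < 30, a t quantile with the DF of formula (4) would replace 1.96.
    return 1.96 * (sqrt(sigma1 ** 2 / n1 + sigma2 ** 2 / n2) + reg)

# Example: a change is significant only if |L_M3C2| exceeds this value.
print(lod95(sigma1=0.02, sigma2=0.03, n1=50, n2=60,
            reg=registration_error(0.005, 0.005)))
```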

3. Operation process

1. First select the two epochs of point clouds to be compared
[Screenshot: selecting the two point clouds to compare]

2. Find the M3C2 Distance function
[Screenshot: locating the M3C2 Distance function]

3. Set parameters
[Screenshot: the M3C2 dialog, Main parameters tab]

Main parameters

  As with the CANUPO plugin, to speed up the computation, distances can be computed only at specific points (called core points). The main idea is that, although terrestrial laser-scanning point clouds are usually very dense, it is not necessary (and in practice would be very slow) to measure the distance at such a high density. This is why the user has to choose which 'core points' to use (there are three options at the bottom of the dialog):

  • the entire point cloud

  • a subsampled version of the cloud

  • a custom (user-provided) point cloud

  • Normal scale: the diameter of the spherical neighbourhood extracted around each core point and used to compute the local normal. This normal is used to orient the cylinder inside which equivalent points are searched for in the other cloud. More advanced options for the normals can be set in the Normals tab (see below).

  • Projection scale: the diameter of the above cylinder;

  • Max depth: corresponds to the cylinder height.
    Note: the larger these scales are, the less the result is affected by local surface roughness (and noise), but the slower the computation.

  Finally, if the point clouds were stitched together from several stations or the global registration is imperfect, an appropriate value can be entered in the 'registration error' field; it will be taken into account when the confidence (i.e. whether the measured displacement is significant) is computed for each point.

Normals
  It is very important to use accurate normal information in M3C2. The second tab is used to specify the calculation method of the normal vector.
[Screenshot: the Normals tab]

  • default: the normals are computed with the normal scale parameter defined in the previous tab;
  • multi-scale: for each core point, normals are computed at several scales and the 'flattest' one is kept;
  • vertical: no normal computation; purely vertical normals are used (for essentially 2-D problems);
  • horizontal: the normals are constrained to the (XY) plane.

  When the point cloud itself already contains normals, it is not necessary to recompute them: checking the 'use cloud #1 normals' box on the first tab makes the plugin use the normals of cloud #1 directly. In addition, the 'orientation' options help the plugin orient the normals consistently:

  • By specifying a global orientation (relative to a given axis or a specific point)
  • Or specify a point cloud containing all sensor locations

Precision maps

[Screenshot: the Precision maps tab]

  The Precision maps tab allows detectable changes to be computed from measured precision values stored in scalar fields of the point clouds, instead of being estimated from the roughness. If such scalar fields are available, the 'precision maps' checkbox enables this variant of M3C2. In this case the uncertainty estimate is no longer derived from the roughness at the projection scale set in the Main parameters tab, but from the 3-D precision estimates stored per point. Make sure to select, for each of the two clouds, the scalar fields describing the measurement precision in X, Y and Z (sigmaX, sigmaY and sigmaZ). The scale factor handles precision values expressed in a different unit than the point coordinates: if both the coordinates and the precision values are in metres, the scale is 1.000, whereas if the coordinates are in metres but the precision scalar fields are in millimetres, the scale should be set to 0.001.
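
  As a rough illustration of the idea (not the plugin's exact formula), the per-point precisions can be projected onto the local normal and combined; the sketch below assumes uncorrelated X/Y/Z components and combines the registration error in quadrature, both of which are assumptions made for this example:

```python
import numpy as np

def normal_precision(sigmas_xyz: np.ndarray, normal: np.ndarray) -> float:
    """Average per-point precision projected onto the unit normal.
    sigmas_xyz: (n, 3) precisions of the points found in the cylinder."""
    per_point = np.sqrt((sigmas_xyz ** 2) @ (normal ** 2))  # uncorrelated components assumed
    return float(per_point.mean())

def lod95_precision_maps(sig1: np.ndarray, sig2: np.ndarray,
                         normal: np.ndarray, reg: float = 0.0) -> float:
    """Illustrative 95% level of detection from per-point precisions (M3C2-PM idea)."""
    return 1.96 * np.sqrt(normal_precision(sig1, normal) ** 2
                          + normal_precision(sig2, normal) ** 2
                          + reg ** 2)      # registration error in quadrature (assumption)
```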

Advanced
[Screenshot: the Advanced tab]

   The name of the Advanced tab says it all: these options can generally be left at their default values.

Output
[Screenshot: the Output tab]

   There is an option to generate additional scalar fields, and also to choose onto which point cloud the computed measurements are projected, which is especially useful if you used core points different from the first input cloud.


NOTE : Parameters can be saved (and reloaded) via a dedicated text file. Use the two icons in the lower left corner of the dialog to do this.
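
A parameter file saved this way can also, in recent versions, be reused from CloudCompare's command-line mode via the qM3C2 plugin's `-M3C2` switch. The sketch below only illustrates the idea; the file names are placeholders, so check the command-line documentation of your CloudCompare version.

```python
# Hypothetical batch run: call CloudCompare headlessly and reuse an M3C2
# parameter file previously saved from the dialog. File names are examples.
import subprocess

subprocess.run([
    "CloudCompare", "-SILENT",
    "-O", "epoch1.las",            # reference cloud (cloud #1)
    "-O", "epoch2.las",            # compared cloud (cloud #2)
    "-M3C2", "m3c2_params.txt",    # parameters saved from the M3C2 dialog
], check=True)
```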

Calculate Distance
When you are ready, just click the "OK" button. When the computation is done, the dialog closes. The input point clouds must be hidden to view the results, as the results are generated in a new point cloud.

  Note that in addition to distance, the M3C2 plugin generates several other scalar fields:

  • Distance uncertainty (the closer to zero the better)
  • Significance of change (is it likely that the distance corresponds to an actual change)
  • As well as the standard deviation and number of neighbors for each core point (specified in the "Output" tab).

  Note also that points that do not have any corresponding points in the other cloud stay "grey" (they are associated with NaN - not a number - distances). This means that no points of the other cloud could be found inside the search cylinder. Grey points therefore mean either that some parts of the cloud have no equivalent in the other cloud (because of occlusions or other holes in the dataset), or simply that the maximum cylinder length is not long enough. The final output can be customised to your exact needs with the color manager.
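
  A small sketch of how these output scalar fields are typically interpreted (array names and values are invented for the example; in CloudCompare itself this filtering is done interactively on the scalar fields):

```python
import numpy as np

distances = np.array([0.02, -0.15, np.nan, 0.30])   # M3C2 distance per core point
uncertainty = np.array([0.05, 0.06, 0.05, 0.07])    # distance uncertainty (LOD) per core point

valid = ~np.isnan(distances)                          # NaN = grey points with no equivalent in the other cloud
significant = valid & (np.abs(distances) > uncertainty)
print("core points with significant change:", np.flatnonzero(significant))
```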

Original article: blog.csdn.net/qq_36686437/article/details/131788645