(02) Cartographer Source Code In-Depth Analysis - (39) PoseExtrapolator → ExtrapolatePose(), ExtrapolatePosesWithGravity() and Other Member Functions

A summary of all articles in my SLAM series can be found in the post 史上最全slam从零开始 (the complete SLAM-from-scratch collection). The table of contents for this (02) Cartographer source-code series is here:
(02) Cartographer Source Code In-Depth Analysis - (00) Table of Contents: https://blog.csdn.net/weixin_43013761/article/details/127350885
 
 

1. Preface

In earlier posts, the following functions in src/cartographer/cartographer/mapping/pose_extrapolator.cc have already been discussed:

void PoseExtrapolator::AddPose()          // add pose data
void PoseExtrapolator::AddImuData()       // add IMU data
void PoseExtrapolator::AddOdometryData()  // add odometry data

All three kinds of data carry orientation and position information. The first question that comes to mind is: with several different data sources all describing the pose, which should be treated as primary and which as auxiliary? That discussion is left for later posts; for now, note only that the source code appears to take the pose obtained from the IMU as its reference.

With those three functions covered, the remaining functions still to be analyzed are the following:

// Predict the pose of the tracking frame in the local frame at time
transform::Rigid3d PoseExtrapolator::ExtrapolatePose(const common::Time time) { ...... }

// Predict the orientation of the tracking frame in the local frame at time
Eigen::Quaterniond PoseExtrapolator::EstimateGravityOrientation(const common::Time time) { ...... }

void PoseExtrapolator::AdvanceImuTracker(const common::Time time, ImuTracker* const imu_tracker) { ...... }

Eigen::Quaterniond PoseExtrapolator::ExtrapolateRotation(const common::Time time, ImuTracker* const imu_tracker) { ...... }

Eigen::Vector3d PoseExtrapolator::ExtrapolateTranslation(common::Time time) { ...... }

// Get the predicted poses over a span of time
PoseExtrapolator::ExtrapolationResult PoseExtrapolator::ExtrapolatePosesWithGravity(const std::vector<common::Time>& times) { ...... }

That may look intimidating: why are there still so many functions left? In fact, these functions have a lot in common; once one or two are understood, the rest are easy to follow. They also call one another.
 

2. ExtrapolatePose()

ExtrapolatePose()
================================================================================
Function: given a time point, return the pose of the robot (tracking frame) in the local frame at that time.
Input: [param ① time] → the query time point.
Returns: [transform::Rigid3d] the pose of the robot (tracking frame) in the local frame at the specified time.
================================================================================

 
The main flow of this function is as follows:

(1): Take the last entry in the timed_pose_queue_ queue and check that the time argument is not earlier than that entry's time; otherwise the check fails.

(2): Check whether the pose at time has already been extrapolated. If so, there is no need to extrapolate again: the pose cached in cached_extrapolated_pose_ is returned directly. Otherwise the extrapolation proceeds.

(3): Translation prediction → first call ExtrapolateTranslation() to predict the translation of time relative to newest_timed_pose.time, then add the translation at newest_timed_pose.time.

(4): Rotation prediction → first call ExtrapolateRotation() to predict the rotation of time relative to newest_timed_pose.time, then left-multiply by the rotation at newest_timed_pose.time. Besides time, ExtrapolateRotation() is also passed extrapolation_imu_tracker_.get(); this is discussed in detail below.

(5): The predicted translation and rotation are stored in cached_extrapolated_pose_ and returned.

The annotated code is as follows:

// Predict the pose of the tracking frame in the local frame at time
transform::Rigid3d PoseExtrapolator::ExtrapolatePose(const common::Time time) {
  const TimedPose& newest_timed_pose = timed_pose_queue_.back();
  CHECK_GE(time, newest_timed_pose.time);
  // If the query time equals the time of the last computation, do not recompute
  if (cached_extrapolated_pose_.time != time) {
    // Predict the position of the tracking frame in the local frame at time
    const Eigen::Vector3d translation =
        ExtrapolateTranslation(time) + newest_timed_pose.pose.translation();
    // Predict the orientation of the tracking frame in the local frame at time
    const Eigen::Quaterniond rotation =
        newest_timed_pose.pose.rotation() *
        ExtrapolateRotation(time, extrapolation_imu_tracker_.get());
    cached_extrapolated_pose_ =
        TimedPose{time, transform::Rigid3d{translation, rotation}};
  }
  return cached_extrapolated_pose_.pose;
}
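To make steps (3)-(5) concrete, here is a minimal, self-contained Eigen sketch (not Cartographer code; all numeric values are hypothetical) showing how the newest cached pose is composed with the predicted translation and rotation deltas:

#include <cmath>
#include <iostream>
#include <Eigen/Geometry>

int main() {
  // Hypothetical newest pose in the local frame (stand-in for timed_pose_queue_.back()).
  const Eigen::Vector3d latest_translation(1.0, 2.0, 0.0);
  const Eigen::Quaterniond latest_rotation(
      Eigen::AngleAxisd(0.1, Eigen::Vector3d::UnitZ()));

  // Hypothetical deltas, standing in for ExtrapolateTranslation() and ExtrapolateRotation().
  const Eigen::Vector3d delta_translation(0.05, 0.0, 0.0);
  const Eigen::Quaterniond delta_rotation(
      Eigen::AngleAxisd(0.02, Eigen::Vector3d::UnitZ()));

  // Same composition as in ExtrapolatePose(): add the translation delta, left-multiply the rotation.
  const Eigen::Vector3d translation = delta_translation + latest_translation;
  const Eigen::Quaterniond rotation = latest_rotation * delta_rotation;

  std::cout << "predicted translation: " << translation.transpose() << std::endl;
  std::cout << "predicted yaw (rad): "
            << 2.0 * std::atan2(rotation.z(), rotation.w()) << std::endl;
  return 0;
}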

3. ExtrapolateTranslation()

As shown above, ExtrapolatePose() does not perform the extrapolation itself; it delegates to ExtrapolateTranslation() and ExtrapolateRotation(). Let us look at ExtrapolateTranslation() first.

ExtrapolateTranslation()
================================================================================
Function: given a time point, return the translation of the robot (tracking frame) at that time relative to the time of the last entry in timed_pose_queue_, expressed in the local frame.
Input: [param ① time] → the query time point.
Returns: [Eigen::Vector3d] the translation of the robot relative to the last entry in timed_pose_queue_.
================================================================================

 
Code logic: in short, take the time span from newest_timed_pose.time to time and multiply it by a linear velocity. There are two linear velocities to choose from: ① the one derived from the scan-matched poses; ② the one derived from the odometry. From the code, once there are at least two odometry measurements the odometry velocity is preferred; in other words, the odometry translation is considered more accurate.

The annotated code follows. The formula is simple enough that no derivation is needed: displacement is just velocity multiplied by time.

// Return the translation of the tracking frame in the local frame from the
// time of the last pose to the given time
Eigen::Vector3d PoseExtrapolator::ExtrapolateTranslation(common::Time time) {
  const TimedPose& newest_timed_pose = timed_pose_queue_.back();
  const double extrapolation_delta =
      common::ToSeconds(time - newest_timed_pose.time);

  // Fewer than two odometry measurements: use the linear velocity derived from poses
  if (odometry_data_.size() < 2) {
    return extrapolation_delta * linear_velocity_from_poses_;
  }
  // Otherwise use the linear velocity derived from the odometry
  return extrapolation_delta * linear_velocity_from_odometry_;
}
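A tiny, self-contained sketch (not Cartographer code; the velocity and time delta are made-up values) of the same displacement = time × velocity computation:

#include <iostream>
#include <Eigen/Core>

int main() {
  // Hypothetical values: 0.1 s elapsed since the newest pose, moving at 0.5 m/s along x.
  const double extrapolation_delta = 0.1;                // seconds
  const Eigen::Vector3d linear_velocity(0.5, 0.0, 0.0);  // m/s, in the local frame
  // Predicted displacement relative to the newest pose: (0.05, 0, 0).
  const Eigen::Vector3d predicted_translation = extrapolation_delta * linear_velocity;
  std::cout << predicted_translation.transpose() << std::endl;
  return 0;
}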

4. ExtrapolateRotation()

ExtrapolateRotation() is called from ExtrapolatePose() as follows:

    const Eigen::Quaterniond rotation = newest_timed_pose.pose.rotation() * ExtrapolateRotation(time, extrapolation_imu_tracker_.get());

So the argument passed to ExtrapolateRotation() is the raw pointer extrapolation_imu_tracker_.get(). Recall the following code from the earlier discussion of PoseExtrapolator::AddPose():

  // Used for orientation prediction when computing the linear velocity from odometry data
  odometry_imu_tracker_ = absl::make_unique<ImuTracker>(*imu_tracker_);
  // Used for orientation prediction during pose extrapolation
  extrapolation_imu_tracker_ = absl::make_unique<ImuTracker>(*imu_tracker_);

In other words, odometry_imu_tracker_ and extrapolation_imu_tracker_ are both copies of the imu_tracker_ object.
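A minimal sketch (not Cartographer code; Tracker is a hypothetical stand-in for ImuTracker) illustrating the copy semantics: each unique_ptr starts from the current state of the source object and then evolves independently:

#include <memory>

struct Tracker { double time = 0.0; };  // hypothetical stand-in for ImuTracker

int main() {
  auto imu_tracker = std::make_unique<Tracker>();
  imu_tracker->time = 1.0;
  // Both copies are constructed from the current state of *imu_tracker ...
  auto odometry_imu_tracker = std::make_unique<Tracker>(*imu_tracker);
  auto extrapolation_imu_tracker = std::make_unique<Tracker>(*imu_tracker);
  // ... but advancing one copy does not affect the other copy or the original.
  extrapolation_imu_tracker->time = 2.0;
  return 0;
}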
 
ExtrapolateRotation()
================================================================================
Function: given a time point, advance the supplied imu_tracker to that time and return the resulting orientation relative to the latest orientation of the member imu_tracker_, i.e. the change in rotation over that interval.
Input: [param ① time] → the query time point. [param ② imu_tracker] → the ImuTracker object used for orientation extrapolation.
Returns: [Eigen::Quaterniond] the change in rotation at time relative to the latest orientation of imu_tracker_.
================================================================================

The logic of this function is fairly simple:

(1): First call PoseExtrapolator::AdvanceImuTracker() so that imu_tracker predicts the orientation at time, denoted here as $\mathbf{orientation}^{init}_{time}$.

(2): Obtain, via imu_tracker_->orientation(), the orientation at the last calibration, denoted $\mathbf{orientation}^{init}_{time\_}$.

(3): From $\mathbf{orientation}^{init}_{time}$ and $\mathbf{orientation}^{init}_{time\_}$, the change in orientation from time_ to time can be derived. In matrix form (note that the inverse is applied to the orientation at time_, matching last_orientation.inverse() in the code):
$$\mathbf{rotation}^{time}_{time\_} = [\mathbf{orientation}^{init}_{time\_}]^{-1}\,\mathbf{orientation}^{init}_{time} \tag{01}$$

The annotated code is as follows:

// Compute the change in orientation from imu_tracker's last state to time
Eigen::Quaterniond PoseExtrapolator::ExtrapolateRotation(
    const common::Time time, ImuTracker* const imu_tracker) const {
  CHECK_GE(time, imu_tracker->time());

  // Advance the state of imu_tracker to time
  AdvanceImuTracker(time, imu_tracker);

  // Get, from imu_tracker_, the orientation at the last pose calibration
  const Eigen::Quaterniond last_orientation = imu_tracker_->orientation();
  // Delta from the last frame to the current prediction:
  // inverse of the last orientation quaternion times the predicted orientation at time
  return last_orientation.inverse() * imu_tracker->orientation();
}
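A self-contained Eigen sketch (not Cartographer code; the angles are made up) verifying equation (01): the returned delta satisfies q_now = q_last * delta, which is exactly the form consumed by the left-multiplication in ExtrapolatePose():

#include <cmath>
#include <iostream>
#include <Eigen/Geometry>

int main() {
  // Hypothetical orientations: q_last, then a further 30-degree yaw to reach q_now.
  const Eigen::Quaterniond q_last(Eigen::AngleAxisd(0.2, Eigen::Vector3d::UnitZ()));
  const Eigen::Quaterniond q_now(
      Eigen::AngleAxisd(0.2 + M_PI / 6.0, Eigen::Vector3d::UnitZ()));

  // Delta rotation, computed the same way as in ExtrapolateRotation().
  const Eigen::Quaterniond delta = q_last.inverse() * q_now;

  // Re-composing recovers q_now (angular distance ~ 0 up to numerical error).
  const Eigen::Quaterniond recomposed = q_last * delta;
  std::cout << "angular distance: " << recomposed.angularDistance(q_now) << std::endl;
  return 0;
}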

5. AdvanceImuTracker()

ExtrapolateRotation() above calls AdvanceImuTracker(), whose main job is to advance the state of the imu_tracker argument to time, i.e. to predict the orientation at that moment.

AdvanceImuTracker()
================================================================================
Function: given a time point, advance imu_tracker so that its state reflects the orientation predicted at that time.
Input: [param ① time] → the query time point. [param ② imu_tracker] → the ImuTracker object used for orientation extrapolation.
Returns: void → the result is left in imu_tracker, whose state after the call is the orientation predicted at time.
================================================================================

 
This function is actually fairly involved, but after the previous post's analysis of ImuTracker it is much easier to follow.

(1): If imu_data_ is currently empty, or time is earlier than the time of the first entry in imu_data_, then imu_tracker is updated using the angular velocities derived from poses or odometry (angular_velocity_from_poses_ and angular_velocity_from_odometry_), and the function returns immediately. This update consists of the following three steps.

(1.1): Call imu_tracker->Advance(time) to predict the orientation and gravity direction at the current time. This prediction relies mainly on the orientation_ and gravity_vector_ members of ImuTracker, both of which are initialized in the constructor.

(1.2): Call imu_tracker->AddImuLinearAccelerationObservation() to add a linear-acceleration observation. Since there is no IMU data, there is no real linear acceleration either, so a fake gravity vector Eigen::Vector3d::UnitZ() is used instead.

(1.3): Call imu_tracker->AddImuAngularVelocityObservation() to update the angular velocity. As above, the angular velocity comes from the poses or from the odometry, with the odometry angular velocity taking priority.

(2): If imu_data_ is not empty, check whether imu_tracker->time() is earlier than the time of the first entry in imu_data_. If so, first advance imu_tracker to that entry's time.

(3): Call std::lower_bound to find, in the imu_data_ queue, the first entry whose time is not earlier than imu_tracker->time(); the iterator it points to that entry.

(4): Use a while loop to walk through all entries in imu_data_ whose time is earlier than time; for each one, advance imu_tracker to that entry's time and then use the entry's linear acceleration and angular velocity to correct the predicted orientation.

(5): Finally advance imu_tracker to time, leaving in it the orientation predicted for that moment.

Summary: this function uses all the data in imu_data_ to correct imu_tracker, and then uses the corrected imu_tracker to predict the orientation at time. The annotated code is as follows:

/**
 * @brief Update the state of imu_tracker and advance it to the given time
 *
 * @param[in] time the time to advance to
 * @param[in] imu_tracker the tracker holding the prior state
 */
void PoseExtrapolator::AdvanceImuTracker(const common::Time time,
                                         ImuTracker* const imu_tracker) const {
  // Check that the requested time is not earlier than the ImuTracker's time
  CHECK_GE(time, imu_tracker->time());

  // Case where the IMU is not used, or there is no IMU data before the requested time
  if (imu_data_.empty() || time < imu_data_.front().time) {
    // There is no IMU data until 'time', so we advance the ImuTracker and use
    // the angular velocities from poses and fake gravity to help 2D stability.

    // Predict the orientation and gravity direction at the current time
    imu_tracker->Advance(time);
    // Update the acceleration observation with a fake gravity vector
    imu_tracker->AddImuLinearAccelerationObservation(Eigen::Vector3d::UnitZ());
    // The angular-velocity observation can only come from the other sources
    imu_tracker->AddImuAngularVelocityObservation(
        odometry_data_.size() < 2 ? angular_velocity_from_poses_
                                  : angular_velocity_from_odometry_);
    return;
  }

  // If imu_tracker's time is earlier than the first entry in the IMU data queue,
  // first advance it to that entry's time
  if (imu_tracker->time() < imu_data_.front().time) {
    // Advance to the beginning of 'imu_data_'.
    imu_tracker->Advance(imu_data_.front().time);
  }

  // C++11: std::lower_bound() returns the position of the first element in the
  // range that is not less than the value, or end() if no such element exists.
  // The fourth argument supplies a custom comparison; the search finds the first
  // element for which the comparator returns false.

  // Find the first IMU sample whose time is not earlier than imu_tracker->time()
  auto it = std::lower_bound(
      imu_data_.begin(), imu_data_.end(), imu_tracker->time(),
      [](const sensor::ImuData& imu_data, const common::Time& time) {
        return imu_data.time < time;
      });

  // Advance through the IMU samples, adding their observations, until the
  // sample time reaches or passes the requested time
  while (it != imu_data_.end() && it->time < time) {
    // Predict the orientation and gravity direction at the sample's time
    imu_tracker->Advance(it->time);
    // Use the linear-acceleration observation to update the gravity direction
    // and correct the previously predicted orientation accordingly
    imu_tracker->AddImuLinearAccelerationObservation(it->linear_acceleration);
    // Update the angular-velocity observation
    imu_tracker->AddImuAngularVelocityObservation(it->angular_velocity);
    ++it;
  }
  // Finally advance imu_tracker's state to the requested time
  imu_tracker->Advance(time);
}
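Since the behaviour of std::lower_bound with a custom comparator is easy to misread, here is a self-contained sketch (not Cartographer code; Sample and the values are hypothetical) of the same search pattern:

#include <algorithm>
#include <iostream>
#include <vector>

struct Sample { double time; };  // hypothetical stand-in for sensor::ImuData

int main() {
  const std::vector<Sample> data = {{1.0}, {2.0}, {3.0}, {4.0}};
  const double query = 2.5;
  // Returns the first element for which the comparator is false,
  // i.e. the first sample with sample.time >= query.
  const auto it = std::lower_bound(
      data.begin(), data.end(), query,
      [](const Sample& sample, double time) { return sample.time < time; });
  std::cout << "first sample not earlier than query: " << it->time << std::endl;  // 3.0
  return 0;
}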

6. EstimateGravityOrientation()

At this point, ExtrapolatePose() and the functions it calls internally have all been covered. Next, let us look at EstimateGravityOrientation().

EstimateGravityOrientation()
================================================================================
Function: given a time point, return the orientation at that time as predicted by imu_tracker_ (from which the gravity direction is inferred).
Input: [param ① time] → the query time point.
Returns: [Eigen::Quaterniond] the predicted orientation at time (the inferred gravity direction).
================================================================================

It is very similar to ExtrapolateRotation(): ExtrapolateRotation() predicts a change in orientation and lets the caller choose the ImuTracker object through a parameter, whereas EstimateGravityOrientation() predicts the orientation at time itself and always starts from (a copy of) imu_tracker_. The code is simple; the annotated version follows.

// Predict the orientation of the tracking frame in the local frame at time
Eigen::Quaterniond PoseExtrapolator::EstimateGravityOrientation(
    const common::Time time) {
  ImuTracker imu_tracker = *imu_tracker_;
  // Advance the copied imu_tracker to time
  AdvanceImuTracker(time, &imu_tracker);
  // Return the orientation predicted by imu_tracker for time
  return imu_tracker.orientation();
}
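For illustration only, a self-contained Eigen sketch (not Cartographer code; the tilt angle and point are made up) of what such a gravity-orientation quaternion can be used for, namely rotating data from the tracking frame into a gravity-aligned frame:

#include <cmath>
#include <iostream>
#include <Eigen/Geometry>

int main() {
  // Hypothetical gravity orientation: the tracking frame is pitched by 10 degrees.
  const Eigen::Quaterniond gravity_orientation(
      Eigen::AngleAxisd(10.0 * M_PI / 180.0, Eigen::Vector3d::UnitY()));
  const Eigen::Vector3d point_in_tracking(1.0, 0.0, 0.0);
  // Rotating by the gravity orientation expresses the point in a gravity-aligned frame.
  const Eigen::Vector3d point_gravity_aligned = gravity_orientation * point_in_tracking;
  std::cout << point_gravity_aligned.transpose() << std::endl;
  return 0;
}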

7. ExtrapolatePosesWithGravity()

ExtrapolatePosesWithGravity()
================================================================================
Function: given n time points, return a PoseExtrapolator::ExtrapolationResult containing the poses predicted at those time points, together with the current linear velocity and the estimated gravity direction.
Input: [param ① times] → multiple time points stored in a vector.
Returns: [PoseExtrapolator::ExtrapolationResult] the poses predicted at the n time points, the current linear velocity, and the estimated gravity direction.
================================================================================

This function is straightforward: it just calls the functions discussed above. The linear velocity it returns prefers the odometry source, i.e. the odometry linear velocity is considered more accurate than the one derived from poses.

The annotated code is as follows:

// Get the predicted poses over a span of time
PoseExtrapolator::ExtrapolationResult
PoseExtrapolator::ExtrapolatePosesWithGravity(
    const std::vector<common::Time>& times) {
  std::vector<transform::Rigid3f> poses;

  // C++11: std::prev returns an iterator n positions before the given iterator
  // without modifying it. n defaults to 1; for positive n the returned iterator
  // lies before it, for negative n it lies after it.

  // Predict the poses at every time point except the last
  for (auto it = times.begin(); it != std::prev(times.end()); ++it) {
    poses.push_back(ExtrapolatePose(*it).cast<float>());
  }

  // Pick the current linear velocity
  const Eigen::Vector3d current_velocity = odometry_data_.size() < 2
                                               ? linear_velocity_from_poses_
                                               : linear_velocity_from_odometry_;
  return ExtrapolationResult{poses, ExtrapolatePose(times.back()),
                             current_velocity,
                             EstimateGravityOrientation(times.back())};
}
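A self-contained sketch (not Cartographer code; the integer "times" are placeholders) of the std::prev loop pattern used above: iterate over every element except the last, then handle the last element separately:

#include <iostream>
#include <iterator>
#include <vector>

int main() {
  const std::vector<int> times = {10, 20, 30, 40};
  // Visits 10, 20, 30 -- everything except the last element.
  for (auto it = times.begin(); it != std::prev(times.end()); ++it) {
    std::cout << "intermediate: " << *it << std::endl;
  }
  // The last element is handled on its own, like times.back() above.
  std::cout << "last: " << times.back() << std::endl;
  return 0;
}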

8. Conclusion

At this point the member functions of PoseExtrapolator have essentially all been covered, but their concrete usage and role may still feel somewhat hazy. For example, the three functions mentioned at the beginning:

void PoseExtrapolator::AddPose()          // add pose data
void PoseExtrapolator::AddImuData()       // add IMU data
void PoseExtrapolator::AddOdometryData()  // add odometry data

can feed various kinds of data into the extrapolator. How does it weigh them, and which source takes precedence? Those questions will be answered in the next post.

 
 
 
