3. How to implement dense mapping in ORB-SLAM3 (1): before entering the tracking thread

Here I will use my own code as an example. Interested readers can ask me for the code.

Table of contents

1 How to enable dense mapping

1.1 Input parameters select pure localization / dense mapping / normal SLAM mode

1.2 System constructor System::System

2 Structure of the dense mapping thread

2.1 The structure of the tracking thread

2.2 ImageGrabber::GrabRGBD


1 How to enable dense mapping

1.1 Input parameters select pure localization / dense mapping / normal SLAM mode

        Since my code relies on dense mapping for navigation, the program takes two additional command-line arguments:

/**
 * @brief Main function
 * @param argc
 * @param argv argument 3: dense mapping, argument 4: pure localization mode
 * @return
 */
int main(int argc, char **argv)
{
    ros::init(argc, argv, "RGBD");
    ros::start();
    bool mode1 = false;
    bool mode2 = false;
    // dense mapping
    std::string dense(argv[3]);
    // pure localization mode
    std::string purelocation(argv[4]);

    if(dense == "true")
        mode1 = true;
    else
        mode1 = false;
    if(purelocation == "true")
        mode2 = true;
    else
        mode2 = false;
    // Create SLAM system. It initializes all system threads and gets ready to process frames.
    ORB_SLAM3::System SLAM(argv[1],argv[2],mode1, mode2, ORB_SLAM3::System::RGBD,true);

    ImageGrabber igb(&SLAM);

    ros::NodeHandle nh;
    // pub track_image
    image_transport::ImageTransport image_transport(nh);
    message_filters::Subscriber<sensor_msgs::Image> rgb_sub(nh, "/camera/rgb/image_raw", 100);
    message_filters::Subscriber<sensor_msgs::Image> depth_sub(nh, "/camera/depth/image_raw", 100);
    typedef message_filters::sync_policies::ApproximateTime<sensor_msgs::Image, sensor_msgs::Image> sync_pol;
    message_filters::Synchronizer<sync_pol> sync(sync_pol(10), rgb_sub,depth_sub);
    sync.registerCallback(boost::bind(&ImageGrabber::GrabRGBD,&igb,_1,_2));


    CamPose_Pub = nh.advertise<geometry_msgs::PoseStamped>("/Camera_Pose",100);
    Camodom_Pub = nh.advertise<geometry_msgs::PoseWithCovarianceStamped>("/Camera_Odom", 100);
    tracking_img_pub = image_transport.advertise( "/ORB_SLAM3/tracking_image", 1);
    tracked_mappoints_pub = nh.advertise<sensor_msgs::PointCloud2>( "/ORB_SLAM3/tracked_points", 1);
    all_mappoints_pub = nh.advertise<sensor_msgs::PointCloud2>( "ORB_SLAM3/all_points", 1);
    ros::ServiceServer service = nh.advertiseService("deactivate_localization", deactivateLocalizationCallback);
    ros::spin();

    // Stop all threads
    SLAM.Shutdown();
    SLAM.SaveTrajectoryTUM("FrameTrajectory.txt");
    SLAM.SaveKeyFrameTrajectoryTUM("KeyFrameTrajectory.txt");
    // Save camera trajectory
    ros::shutdown();

    return 0;
}

        If input argument 3 is "true", dense mapping is enabled; if input argument 4 is "true", pure localization mode is enabled.

        Here we run the following command, which enables dense mapping mode:

rosrun akm_location Akm_RGBD Vocabulary/ORBvoc.bin Examples/RGB-D/akm_slam.yaml true false

        At this point, mode1 is true and mode2 is false.
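        For comparison, pure localization mode with dense mapping disabled would be requested by swapping the two flags. This command is my own illustration, following the same argument order:

rosrun akm_location Akm_RGBD Vocabulary/ORBvoc.bin Examples/RGB-D/akm_slam.yaml false true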

        We then enter the SLAM constructor, covered in Section 1.2.

        Now let's look at the ImageGrabber class:

class ImageGrabber
{
public:
    ros::NodeHandle nh;
    ros::Publisher pub_camerapath;
    nav_msgs::Path  camerapath;

    ImageGrabber(ORB_SLAM3::System* pSLAM):mpSLAM(pSLAM),nh("~")
    {
        pub_camerapath= nh.advertise<nav_msgs::Path> ("Path", 100);
    }

    void GrabRGBD(const sensor_msgs::ImageConstPtr& msgRGB,const sensor_msgs::ImageConstPtr& msgD);

    ORB_SLAM3::System* mpSLAM;
};
  1. Member variables

    • nh: A handle to a ROS node, used to communicate with the ROS system.
    • pub_camerapath: A ROS publisher (Publisher), used to publish messages of type nav_msgs::Path.
    • camerapath: A nav_msgs::Path message holding the camera path.
  2. Constructor:

    • ImageGrabber(ORB_SLAM3::System* pSLAM): The constructor accepts a pointer to the ORB_SLAM3::System object pSLAM as a parameter. Some initialization operations are completed in the constructor:
      • mpSLAM(pSLAM): Assign the value of the constructor parameter pSLAM to the class member variable mpSLAM.
      • nh("~"): Use ROS node handle for initialization, tilde "~" indicates use of private namespace.
      • pub_camerapath= nh.advertise<nav_msgs::Path> ("Path", 100): Initializespub_camerapath, setting it as a publisher that publishes messages of type named "Path", The buffer size is 100. nav_msgs::Path
  3. Joint function:

    • void GrabRGBD(const sensor_msgs::ImageConstPtr& msgRGB, const sensor_msgs::ImageConstPtr& msgD): Declared here and defined elsewhere; it receives an RGB image and a depth image from the sensors.
  4. Member variable

    • mpSLAM: A pointer to an object of type ORB_SLAM3::System.

        The purpose of this code is to create a class that communicates with ROS, receives RGB and depth image data, and processes it with the ORB_SLAM3 system. The pub_camerapath member can be used to publish the camera path.

        I wrote the GrabRGBD function analysis in Section 2.2.

    image_transport::ImageTransport image_transport(nh);
    message_filters::Subscriber<sensor_msgs::Image> rgb_sub(nh, "/camera/rgb/image_raw", 100);
    message_filters::Subscriber<sensor_msgs::Image> depth_sub(nh, "/camera/depth/image_raw", 100);
    typedef message_filters::sync_policies::ApproximateTime<sensor_msgs::Image, sensor_msgs::Image> sync_pol;
    message_filters::Synchronizer<sync_pol> sync(sync_pol(10), rgb_sub,depth_sub);
    sync.registerCallback(boost::bind(&ImageGrabber::GrabRGBD,&igb,_1,_2));

        Here the RGB and depth images are time-synchronized, and the synchronized image pairs are passed to the GrabRGBD callback for tracking.
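        A note on the policy: ApproximateTime pairs messages whose timestamps are close but not identical. If your camera driver stamps the RGB and depth images with exactly the same time, an ExactTime policy also works; below is a minimal sketch of that variant (my own example, not part of the code above):

#include <message_filters/sync_policies/exact_time.h>   // in addition to the node's existing includes

    // Exact-time variant: the callback only fires when the RGB and depth
    // messages carry identical timestamps.
    typedef message_filters::sync_policies::ExactTime<sensor_msgs::Image, sensor_msgs::Image> exact_pol;
    message_filters::Synchronizer<exact_pol> exact_sync(exact_pol(10), rgb_sub, depth_sub);
    exact_sync.registerCallback(boost::bind(&ImageGrabber::GrabRGBD, &igb, _1, _2));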

    CamPose_Pub = nh.advertise<geometry_msgs::PoseStamped>("/Camera_Pose",100);
    Camodom_Pub = nh.advertise<geometry_msgs::PoseWithCovarianceStamped>("/Camera_Odom", 100);
    tracking_img_pub = image_transport.advertise( "/ORB_SLAM3/tracking_image", 1);
    tracked_mappoints_pub = nh.advertise<sensor_msgs::PointCloud2>( "/ORB_SLAM3/tracked_points", 1);
    all_mappoints_pub = nh.advertise<sensor_msgs::PointCloud2>( "ORB_SLAM3/all_points", 1);

        Several publishers are created here.

        They publish the camera pose, the camera odometry, the tracking image, the currently tracked map points, and all map points. The corresponding publish_* helper functions are called later in GrabRGBD; a sketch of one of them follows.
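        As an illustration, here is a minimal sketch of publish_tracked_points, assuming MapPoint::GetWorldPos() returns an Eigen::Vector3f (as in ORB-SLAM3 v1.0); my actual helper may differ in details such as the frame name and filtering:

// Sketch only: converts the tracked MapPoints into a PointCloud2 and publishes it.
// Includes below are in addition to the node's existing headers.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl_conversions/pcl_conversions.h>

void publish_tracked_points(const std::vector<ORB_SLAM3::MapPoint*>& map_points, const ros::Time& stamp)
{
    pcl::PointCloud<pcl::PointXYZ> cloud;
    for (ORB_SLAM3::MapPoint* pMP : map_points)
    {
        if (!pMP || pMP->isBad())
            continue;
        const Eigen::Vector3f pos = pMP->GetWorldPos();   // world coordinates of the map point
        cloud.points.push_back(pcl::PointXYZ(pos(0), pos(1), pos(2)));
    }
    sensor_msgs::PointCloud2 msg;
    pcl::toROSMsg(cloud, msg);
    msg.header.stamp = stamp;
    msg.header.frame_id = "/odom";   // chosen to match the frame used by the pose publishers
    tracked_mappoints_pub.publish(msg);
}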

        Finally, the system shuts down:

    // Stop all threads
    SLAM.Shutdown();
    SLAM.SaveTrajectoryTUM("FrameTrajectory.txt");
    SLAM.SaveKeyFrameTrajectoryTUM("KeyFrameTrajectory.txt");
    // Save camera trajectory
    ros::shutdown();

1.2 System constructor System::System

        Since this is mostly the same as stock ORB-SLAM3, I will not go through the whole constructor again; I will only point out what has changed.

        1. Constructor:

System::System(const string &strVocFile, const string &strSettingsFile, const bool mode1, const bool mode2, const eSensor sensor,
               const bool bUseViewer, const int initFr, const string &strSequence):
    mSensor(sensor), mpViewer(static_cast<Viewer*>(NULL)), mbReset(false), mbResetActiveMap(false),
    mbActivateLocalizationMode(false), mbDeactivateLocalizationMode(false), mbShutDown(false)

        Two flags are added here: const bool mode1 and const bool mode2 indicate whether to run dense mapping and pure localization, respectively.

        2. Read a previously created map (the first and second posts describe how to save and load the created map):

    // Check the settings file version; different versions are handled differently
    cv::FileNode node = fsSettings["File.version"];
    if(!node.empty() && node.isString() && node.string() == "1.0")
    {
        settings_ = new Settings(strSettingsFile,mSensor);

        // Names of the map files to save and load
        mStrLoadAtlasFromFile = settings_->atlasLoadFile();
        mStrSaveAtlasToFile = settings_->atlasSaveFile();

        cout << (*settings_) << endl;
    }

        3. Start the dense mapping thread:

    if(mode1)
    {
        densemapping = true;
        mpPointCloudMapping = make_shared<PointCloudMapping> (0.01);
        cout << "选择为进行ROS下RGBD稠密重建!正在运行,请等待!"  << endl;
    }
    else
    {
        densemapping = false;
        cout << "选择为不进行稠密重建!正在运行,请等待!" << endl;
    }

        If mode1 is true, the densemapping flag is set to true and the shared pointer to the dense mapping thread, mpPointCloudMapping, is created (here with a point-cloud resolution of 0.01).
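        The PointCloudMapping class itself is covered in a later post; for orientation, its public interface looks roughly like the sketch below (an outline of my class, so the exact members may differ):

// Rough outline of the dense mapping class created above (a sketch, not the full
// implementation; namespace and some members omitted).
#include <memory>
#include <thread>
#include <opencv2/core/core.hpp>

class KeyFrame;   // ORB-SLAM3 keyframe type

class PointCloudMapping
{
public:
    // resolution: voxel-grid leaf size used when down-sampling the fused cloud (0.01 above)
    explicit PointCloudMapping(double resolution);

    // Called from the tracking thread for every new keyframe, together with the
    // color and depth images that produced it
    void insertKeyFrame(KeyFrame* kf, const cv::Mat& color, const cv::Mat& depth);

    // Ask the mapping thread to finish and join it
    void shutdown();

protected:
    // Thread body: pops queued keyframes, back-projects their depth images and
    // fuses the resulting point clouds into a global map
    void viewer();

    double resolution;
    std::shared_ptr<std::thread> viewerThread;
};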

        If dense mapping is enabled, the following part runs:

    if(densemapping)
    {
        //Initialize the Tracking thread
        //(it will live in the main thread of execution, the one that called this constructor)
        // Create the tracking thread (main thread). It is not started immediately; it runs in the main thread after the images (and IMU data) have been preprocessed
        cout << "Seq. Name: " << strSequence << endl;
        mpTracker = new Tracking(this,
                                 mpVocabulary,
                                 mpFrameDrawer,
                                 mpMapDrawer,
                                 mpAtlas,
                                 mpKeyFrameDatabase,
                                 strSettingsFile,
                                 mSensor,
                                 mpPointCloudMapping,
                                 settings_,
                                 strSequence);
        //Initialize the Local Mapping thread and launch
        // Create and launch the local mapping thread
        mpLocalMapper = new LocalMapping(this, mpAtlas, mSensor==MONOCULAR || mSensor==IMU_MONOCULAR,
                                         mSensor==IMU_MONOCULAR || mSensor==IMU_STEREO || mSensor==IMU_RGBD, strSequence);
        mptLocalMapping = new thread(&ORB_SLAM3::LocalMapping::Run,mpLocalMapper);
        mpLocalMapper->mInitFr = initFr;

        // Set the depth threshold for the farthest 3D map points; points beyond it are likely poorly triangulated and are discarded
        if(settings_)
            mpLocalMapper->mThFarPoints = settings_->thFarPoints();
        else
            mpLocalMapper->mThFarPoints = fsSettings["thFarPoints"];
        // ? A question here: is an exact comparison of a float against 0 reliable in C++?
        if(mpLocalMapper->mThFarPoints!=0)
        {
            cout << "Discard points further than " << mpLocalMapper->mThFarPoints << " m from current camera" << endl;
            mpLocalMapper->mbFarPoints = true;
        }
        else
            mpLocalMapper->mbFarPoints = false;

        //Initialize the Loop Closing thread and launch
        // mSensor!=MONOCULAR && mSensor!=IMU_MONOCULAR
        // Create and launch the loop closing thread
        mpLoopCloser = new LoopClosing(mpAtlas, mpKeyFrameDatabase, mpVocabulary, mSensor!=MONOCULAR, activeLC); // mSensor!=MONOCULAR);
        mptLoopClosing = new thread(&ORB_SLAM3::LoopClosing::Run, mpLoopCloser);

        //Set pointers between threads
        // Set the pointers between threads
        mpTracker->SetLocalMapper(mpLocalMapper);
        mpTracker->SetLoopClosing(mpLoopCloser);

        mpLocalMapper->SetTracker(mpTracker);
        mpLocalMapper->SetLoopCloser(mpLoopCloser);

        mpLoopCloser->SetTracker(mpTracker);
        mpLoopCloser->SetLocalMapper(mpLocalMapper);

        //usleep(10*1000*1000);

        //Initialize the Viewer thread and launch
        // Create and launch the viewer thread
        if(false)
            //if(false) // TODO
        {
            mpViewer = new Viewer(this, mpFrameDrawer,mpMapDrawer,mpTracker,strSettingsFile,settings_);
            mptViewer = new thread(&Viewer::Run, mpViewer);
            mpTracker->SetViewer(mpViewer);
            mpLoopCloser->mpViewer = mpViewer;
            mpViewer->both = mpFrameDrawer->both;
        }
        // Choose to use slam mode
//        if(mode2){
//            ActivateLocalizationMode();
//            mpTracker->mbActivateLocalizationMode = true;
//        } else
        cout << "正常SLAM模式!"  << endl;
        mpTracker->mbActivateLocalizationMode = false;
        // Fix verbosity
        // Verbosity for intermediate printouts: set to quiet mode
        Verbose::SetTh(Verbose::VERBOSITY_QUIET);
    }

        First, the Tracking object is constructed (see Section 2.1); this is the only part that differs from stock ORB-SLAM3, the rest is identical.

2 Structure of the dense mapping thread

2.1 The structure of the tracking thread

        The pointer to the dense mapping thread, mpPointCloudMapping, is passed to the Tracking constructor here:

        mpTracker = new Tracking(this,
                                 mpVocabulary,
                                 mpFrameDrawer,
                                 mpMapDrawer,
                                 mpAtlas,
                                 mpKeyFrameDatabase,
                                 strSettingsFile,
                                 mSensor,
                                 mpPointCloudMapping,
                                 settings_,
                                 strSequence);

        Let's look at the difference in this tracking thread: 

shared_ptr<PointCloudMapping> mpPointCloudMapping;

        The only change is that the dense mapping pointer is handed to the tracking thread as a member; there is no other difference. A sketch of the corresponding additions to Tracking.h is shown below.
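        Concretely, the additions to Tracking.h amount to one extra constructor parameter and one member, roughly as follows (a sketch; the parameter order follows the call shown in Section 1.2, and everything else in the class is unchanged):

// Sketch of the additions to Tracking.h.
class Tracking
{
public:
    Tracking(System* pSys, ORBVocabulary* pVoc, FrameDrawer* pFrameDrawer,
             MapDrawer* pMapDrawer, Atlas* pAtlas, KeyFrameDatabase* pKFDB,
             const std::string &strSettingPath, const int sensor,
             shared_ptr<PointCloudMapping> pPointCloud,      // new: handle to the dense mapping thread
             Settings* settings, const std::string &_nameSeq = std::string());

    // ... rest of the class as in stock ORB-SLAM3 ...

    // Stored handle; keyframes and their RGB-D images are passed to the dense
    // mapping thread through this pointer after a new keyframe is created.
    shared_ptr<PointCloudMapping> mpPointCloudMapping;
};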

2.2 ImageGrabber::GrabRGBD

        This part deserves a detailed look! The main job of the function is to convert the ROS-format color and depth images into cv::Mat, feed them into tracking, obtain the tracking result (before optimization), and publish it for downstream nodes to use.

/**
 * @brief
 * @param msgRGB ROS-format image -- the color image
 * @param msgD ROS-format image -- the depth image
 */
void ImageGrabber::GrabRGBD(const sensor_msgs::ImageConstPtr& msgRGB,const sensor_msgs::ImageConstPtr& msgD)
{
    // 1. Convert the ROS-format color and depth images to cv::Mat
    cv_bridge::CvImageConstPtr cv_ptrRGB;
    try
    {
        cv_ptrRGB = cv_bridge::toCvShare(msgRGB);
    }
    catch (cv_bridge::Exception& e)
    {
        ROS_ERROR("cv_bridge exception: %s", e.what());
        return;
    }

    cv_bridge::CvImageConstPtr cv_ptrD;
    try
    {
        cv_ptrD = cv_bridge::toCvShare(msgD);
    }
    catch (cv_bridge::Exception& e)
    {
        ROS_ERROR("cv_bridge exception: %s", e.what());
        return;
    }

    // 2. Call the tracking thread to obtain the pose
    geometry_msgs::Pose sensor_pose;
    Sophus::SE3f sophus_Tcw;
    sophus_Tcw = mpSLAM->TrackRGBD(cv_ptrRGB->image,cv_ptrD->image,cv_ptrRGB->header.stamp.toSec());
    cv::Mat Tcw = cv::Mat(4, 4, CV_32F);
    cv::eigen2cv(sophus_Tcw.matrix(), Tcw);

    orb_slam_broadcaster = new tf::TransformBroadcaster;

    // 3. Publish the tracking image (the one with the feature points drawn)
    publish_tracking_img(mpSLAM->GetCurrentFrame(), cv_ptrRGB->header.stamp);

    // 4. Publish the currently tracked map points
    publish_tracked_points(mpSLAM->GetTrackedMapPoints(), cv_ptrRGB->header.stamp);

    // 5. Publish all map points
    publish_all_points(mpSLAM->GetAllMapPoints(), cv_ptrRGB->header.stamp);
    if (!Tcw.empty())
    {
        cv::Mat Twc =Tcw.inv();
        cv::Mat RWC= Twc.rowRange(0,3).colRange(0,3);
        cv::Mat tWC=  Twc.rowRange(0,3).col(3);
        cv::Mat twc(3,1,CV_32F);
        twc = tWC;
        Eigen::Matrix<double,3,3> eigMat ;
        eigMat <<RWC.at<float>(0,0),RWC.at<float>(0,1),RWC.at<float>(0,2),
                RWC.at<float>(1,0),RWC.at<float>(1,1),RWC.at<float>(1,2),
                RWC.at<float>(2,0),RWC.at<float>(2,1),RWC.at<float>(2,2);
        Eigen::Quaterniond q(eigMat);

        Pose_quat[0] = q.x(); Pose_quat[1] = q.y();
        Pose_quat[2] = q.z(); Pose_quat[3] = q.w();

        Pose_trans[0] = twc.at<float>(0);
        //Pose_trans[1] = twc.at<float>(1);
        Pose_trans[1] = 0;
        Pose_trans[2] = twc.at<float>(2);

        // 6. sensor_pose stores the camera pose
        sensor_pose.position.x = twc.at<float>(0);
        sensor_pose.position.y = twc.at<float>(1);
        sensor_pose.position.z = twc.at<float>(2);
        sensor_pose.orientation.x = q.x();
        sensor_pose.orientation.y = q.y();
        sensor_pose.orientation.z = q.z();
        sensor_pose.orientation.w = q.w();

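        // The component shuffle below remaps the pose from the camera optical frame
        // (x right, y down, z forward) into the ROS body convention used for /odom
        // (x forward, y left, z up): x_odom = z_cam, y_odom = -x_cam, z_odom = -y_cam.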
        orb_slam.setOrigin(tf::Vector3(Pose_trans[2], -Pose_trans[0], -Pose_trans[1]));
        orb_slam.setRotation(tf::Quaternion(q.z(), -q.x(), -q.y(), q.w()));
        orb_slam_broadcaster->sendTransform(tf::StampedTransform(orb_slam, ros::Time::now(), "/odom", "/orb_cam_link"));

        Cam_Pose.header.stamp =ros::Time::now();
        //Cam_Pose.header.seq = msgRGB->header.seq;
        Cam_Pose.header.frame_id = "/odom";
        tf::pointTFToMsg(orb_slam.getOrigin(), Cam_Pose.pose.position);
        tf::quaternionTFToMsg(orb_slam.getRotation(), Cam_Pose.pose.orientation);

        Cam_odom.header.stamp = ros::Time::now();
        Cam_odom.header.frame_id = "/odom";
        tf::pointTFToMsg(orb_slam.getOrigin(), Cam_odom.pose.pose.position);
        tf::quaternionTFToMsg(orb_slam.getRotation(), Cam_odom.pose.pose.orientation);
        Cam_odom.pose.covariance = {0.01, 0, 0, 0, 0, 0,
                                    0, 0.01, 0, 0, 0, 0,
                                    0, 0, 0.01, 0, 0, 0,
                                    0, 0, 0, 0.01, 0, 0,
                                    0, 0, 0, 0, 0.01, 0,
                                    0, 0, 0, 0, 0, 0.01};

        // 7. Publish the single-frame camera pose
        CamPose_Pub.publish(Cam_Pose);
        Camodom_Pub.publish(Cam_odom);
        std_msgs::Header header ;
        header.stamp =msgRGB->header.stamp;
        header.seq = msgRGB->header.seq;
        header.frame_id="/odom";
        camerapath.header =header;
        camerapath.poses.push_back(Cam_Pose);

        // 8. Publish the camera trajectory
        pub_camerapath.publish(camerapath);

    }

}
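        As with publish_tracked_points earlier, publish_tracking_img and publish_all_points are small helpers defined elsewhere in the node. As an example, here is a minimal sketch of publish_tracking_img, assuming GetCurrentFrame() returns a cv::Mat with the tracked features already drawn (which is how it is used above); my actual helper may differ slightly:

// Sketch only: wraps the tracking image in a sensor_msgs::Image and publishes it.
#include <cv_bridge/cv_bridge.h>

void publish_tracking_img(const cv::Mat& image, const ros::Time& stamp)
{
    std_msgs::Header header;
    header.stamp = stamp;
    header.frame_id = "/odom";
    sensor_msgs::ImagePtr msg = cv_bridge::CvImage(header, "bgr8", image).toImageMsg();
    tracking_img_pub.publish(msg);
}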
