ORB_SLAM3: System class constructor review and my own understanding

ORB_SLAM3::System SLAM(argv[1], argv[2], ORB_SLAM3::System::IMU_MONOCULAR, true);    // here initFr = 0

1. Read File.version from the yaml configuration file; if that key is absent, set settings_ to nullptr. Then read System.LoadAtlasFromFile and System.SaveAtlasToFile from the yaml configuration file and assign them to the member variables mStrLoadAtlasFromFile and mStrSaveAtlasToFile respectively. activeLC indicates whether loop closing is active; it is on by default, but is overridden by the loopClosing value in the configuration file;
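The default-handling logic above can be sketched without the OpenCV FileStorage machinery. This is a hypothetical, simplified stand-in (SystemConfig and ParseConfig are illustrative names, not ORB_SLAM3 API): missing keys leave the atlas paths empty and keep loop closing active.

```cpp
#include <map>
#include <string>

// Hypothetical stand-in for the yaml lookups in the System constructor:
// absent keys fall back to defaults, mirroring how mStrLoadAtlasFromFile
// stays empty and loop closing stays active when keys are missing.
struct SystemConfig {
    std::string strLoadAtlasFromFile;  // analogue of mStrLoadAtlasFromFile
    std::string strSaveAtlasToFile;    // analogue of mStrSaveAtlasToFile
    bool activeLC = true;              // loop closing on by default
};

SystemConfig ParseConfig(const std::map<std::string, std::string>& yaml) {
    SystemConfig cfg;
    auto it = yaml.find("System.LoadAtlasFromFile");
    if (it != yaml.end()) cfg.strLoadAtlasFromFile = it->second;
    it = yaml.find("System.SaveAtlasToFile");
    if (it != yaml.end()) cfg.strSaveAtlasToFile = it->second;
    it = yaml.find("loopClosing");
    if (it != yaml.end()) cfg.activeLC = (it->second != "0");
    return cfg;
}
```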

2. Create a new ORB dictionary mpVocabulary. This initializes a k-branch tree of depth d, which can hold up to k^d words, and sets the word-weighting scheme (TF_IDF) and the scoring scheme for the difference between two image vectors (L1_NORM);

3. Read the pre-trained ORB dictionary ORBvoc.txt. Loading this file takes a while; a common optimization found online converts it to a binary file so it can be read faster. The loading process mainly reads the trained dictionary into the containers m_nodes and m_words; the root node's id is 0;

      The first line of the ORBvoc.txt file: number of branches (k=10), depth (d=6), scoring type (0), weighting type (0)

      Each subsequent line of the ORBvoc.txt file: the parent node id, whether the node is a leaf (word) node, the node's descriptor, and the node's weight (meaningful only for words)

std::vector<Node*> m_words; 
std::vector<Node> m_nodes;      // Node is a struct containing the node id, weight, children, parent, descriptor, word_id, etc.
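The k^d capacity mentioned above is easy to verify with a tiny helper (a sketch for illustration; DBoW2 does not expose such a function). For ORBvoc.txt, k = 10 and d = 6 gives 10^6 = 1,000,000 words.

```cpp
// Number of leaf words a complete k-ary vocabulary tree of depth d can hold.
long long VocabCapacity(int k, int d) {
    long long words = 1;
    for (int i = 0; i < d; ++i)
        words *= k;   // one more level multiplies the leaf count by k
    return words;
}
```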

4. Create a keyframe database:

mpKeyFrameDatabase = new KeyFrameDatabase(*mpVocabulary);    // Reserves one slot in the 2D container mvInvertedFile per word in the dictionary; its contents are filled in by the loop-closing thread.
std::vector< list<KeyFrame*> > mvInvertedFile;               // The container's index is the word's index in the dictionary; each list holds pointers to the keyframes that contain that word.

      Then create the atlas mpAtlas:

      1) When the member variable mStrLoadAtlasFromFile is empty, create an atlas, initialize mpCurrentMap, and set it as the active map (its member mIsInUse is true). All maps are stored in the container mspMaps, and the atlas has exactly one active map at a time;

      2) When the member variable mStrLoadAtlasFromFile is not empty, load the atlas with bool isRead = LoadAtlas(FileType::BINARY_FILE); and then create a new map with mpAtlas->CreateNewMap().

5. Create the mpFrameDrawer class for displaying frames and set the viewer's mState to SYSTEM_NOT_READY, meaning the system is not ready (the usual state while the configuration and dictionary files are being loaded at startup). Initialize the left and right display images mIm and mImRight as black 480x640 images. Create the mpMapDrawer class for displaying the map, which reads the Viewer-related parameters from the yaml configuration file;

6. Create the tracking thread (the main thread). It is not started immediately; it runs in the main thread after the image and IMU data have been preprocessed. Initialize some parameters, set the tracking mState to NO_IMAGES_YET, read the Camera and ORBextractor parameters from the yaml configuration file, and finally read the IMU parameters.

b_parse_imu = ParseIMUParamFile(fSettings);    // Create the mpImuCalib object and set the noise covariance Cov and the random-walk covariance CovWalk.

Finally, create the mpImuPreintegratedFromLastKF object, which copies the noise covariance Cov and random-walk covariance CovWalk of mpImuCalib into Nga and NgaWalk and initializes the preintegration state (Initialize(b_);), i.e. sets dR to the identity matrix and everything else to zero;
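The reset described above can be sketched with a stripped-down, hypothetical struct (the real Preintegrated in ImuTypes also zeroes Jacobians, covariance, and the measurement buffer):

```cpp
#include <array>

// Hypothetical miniature of the IMU preintegration reset: dR becomes the
// identity rotation; all other accumulated quantities are zeroed.
struct MiniPreintegrated {
    std::array<std::array<double, 3>, 3> dR{};  // rotation delta
    std::array<double, 3> dV{};                 // velocity delta
    std::array<double, 3> dP{};                 // position delta
    double dT = 0.0;                            // integrated time

    void Initialize() {
        for (int r = 0; r < 3; ++r)
            for (int c = 0; c < 3; ++c)
                dR[r][c] = (r == c) ? 1.0 : 0.0;  // dR = identity
        dV = {0.0, 0.0, 0.0};
        dP = {0.0, 0.0, 0.0};
        dT = 0.0;
    }
};
```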

7. Creating the LocalMapping thread just initializes some parameters and then starts the LocalMapping thread;

8. Creating the LoopClosing thread likewise initializes some parameters and then starts the LoopClosing thread;
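Steps 7 and 8 follow the same pattern: construct the object, then hand its Run method to a std::thread (ORB_SLAM3 launches them as, e.g., new thread(&ORB_SLAM3::LocalMapping::Run, mpLocalMapper)). A minimal sketch with a stand-in worker class:

```cpp
#include <atomic>
#include <thread>

// Stand-in for LocalMapping/LoopClosing: Run() loops until asked to finish.
struct MiniWorker {
    std::atomic<bool> finishRequested{false};
    std::atomic<int> iterations{0};

    void Run() {
        while (!finishRequested.load()) {
            ++iterations;            // one pass of "work" per loop iteration
            std::this_thread::yield();
        }
    }
    void RequestFinish() { finishRequested.store(true); }
};

int RunWorkerOnce() {
    MiniWorker w;
    std::thread t(&MiniWorker::Run, &w);  // same launch style as ORB_SLAM3
    w.RequestFinish();                    // ask the loop to exit
    t.join();                             // wait for the thread to finish
    return w.iterations.load();
}
```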

9. Setting pointers between threads lets each thread call the other threads' variables and functions, making it easier to share data;
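The pointer wiring in step 9 amounts to each component storing raw pointers to the others via setters. A sketch of that pattern (class and setter names mirror ORB_SLAM3, but the bodies here are hypothetical stubs):

```cpp
// Each component keeps pointers to its peers so they can call each other.
struct LocalMapping;
struct LoopClosing;

struct Tracking {
    LocalMapping* mpLocalMapper = nullptr;
    LoopClosing*  mpLoopClosing = nullptr;
    void SetLocalMapper(LocalMapping* p) { mpLocalMapper = p; }
    void SetLoopClosing(LoopClosing* p)  { mpLoopClosing = p; }
};

struct LocalMapping {
    Tracking*    mpTracker = nullptr;
    LoopClosing* mpLoopCloser = nullptr;
    void SetTracker(Tracking* p)       { mpTracker = p; }
    void SetLoopCloser(LoopClosing* p) { mpLoopCloser = p; }
};

struct LoopClosing {
    Tracking*     mpTracker = nullptr;
    LocalMapping* mpLocalMapper = nullptr;
    void SetTracker(Tracking* p)         { mpTracker = p; }
    void SetLocalMapper(LocalMapping* p) { mpLocalMapper = p; }
};
```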

10. Finally, create the viewer thread: initialize some parameters, read the Camera- and Viewer-related parameters from the yaml configuration file, and start the viewer thread. At this point three threads have been started, while the tracking thread runs in the main thread.


Origin blog.csdn.net/qq_44530706/article/details/130187541