Using the non-linear optimization library Ceres (2)

Refer to the official website for tutorials: http://www.ceres-solver.org/nnls_tutorial.html

Hello World!

Consider the following optimization problem: minimize f(x) = 1/2 * (10 - x)^2. The minimum is clearly at x = 10, which makes it easy to check the solver's answer.

Step 1: Write a cost function

struct CostFunctor {
  template <typename T>
  bool operator()(const T* const x, T* residual) const {
    residual[0] = T(10.0) - x[0];
    return true;
  }
};

operator() is a templated method, so the input and output can take different types (templates are covered in "C++ Primer Plus"). The templating is what allows Ceres to call the same functor body with its own automatic-differentiation type to obtain derivatives.

Step 2: With the residual functor defined, we can construct the nonlinear optimization problem.

int main(int argc, char** argv) {
  google::InitGoogleLogging(argv[0]); // Initializes Google's logging library (glog). The same line appears in Gao Xiang's "Fourteen Lectures on Visual SLAM"; omitting it does not affect the optimization itself.

  // The variable to solve for with its initial value.
  double initial_x = 5.0;
  double x = initial_x;

  // Build the problem.
  Problem problem;

  // Set up the only cost function (also known as residual). This uses
  // auto-differentiation to obtain the derivative (jacobian).

  CostFunction* cost_function =
      new AutoDiffCostFunction<CostFunctor, 1, 1>(new CostFunctor);
  problem.AddResidualBlock(cost_function, NULL, &x); // cost function, loss function (kernel; NULL means none), and the variable to optimize

  // Run the solver!
  Solver::Options options;
  options.linear_solver_type = ceres::DENSE_QR; // solve the linear system with dense QR decomposition
  options.minimizer_progress_to_stdout = true;  // print progress to stdout
  Solver::Summary summary;                      // holds optimization statistics
  Solve(options, &problem, &summary);           // run the solver

  std::cout << summary.BriefReport() << "\n";
  std::cout << "x : " << initial_x
            << " -> " << x << "\n";
  return 0;
}

The whole code looks like this:

#include "ceres/ceres.h"
#include "glog/logging.h"
using ceres::AutoDiffCostFunction;
using ceres::CostFunction;
using ceres::Problem;
using ceres::Solver;
using ceres::Solve;
// A templated cost functor that implements the residual r = 10 -
// x. The method operator() is templated so that we can then use an
// automatic differentiation wrapper around it to generate its
// derivatives.

struct CostFunctor {
  template <typename T> bool operator()(const T* const x, T* residual) const {
    residual[0] = 10.0 - x[0];
    return true;
  }
};
int main(int argc, char** argv) {
  google::InitGoogleLogging(argv[0]);
  // The variable to solve for with its initial value. It will be
  // mutated in place by the solver.

  double x = 0.5;
  const double initial_x = x;
  // Build the problem.
  Problem problem;
  // Set up the only cost function (also known as residual). This uses
  // auto-differentiation to obtain the derivative (jacobian).

  CostFunction* cost_function =
      new AutoDiffCostFunction<CostFunctor, 1, 1>(new CostFunctor);
  problem.AddResidualBlock(cost_function, NULL, &x);
  // Run the solver!
  Solver::Options options;
  options.minimizer_progress_to_stdout = true;
  Solver::Summary summary;
  Solve(options, &problem, &summary);
  std::cout << summary.BriefReport() << "\n";
  std::cout << "x : " << initial_x
            << " -> " << x << "\n";
  return 0;

}

The CMakeLists.txt file is as follows:

cmake_minimum_required(VERSION 2.8)
project(ceres)
#set(CMAKE_MODULE_PATH ${PROJECT_SOURCE_DIR}/cmake_modules)
find_package(Ceres REQUIRED)
include_directories(${CERES_INCLUDE_DIRS})
add_executable(use_ceres main.cpp)
target_link_libraries(use_ceres ${CERES_LIBRARIES})
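Assuming Ceres and its dependencies are already installed system-wide, a typical out-of-source build and run looks like this (a build-recipe sketch; it obviously requires a working Ceres installation, so it cannot be run standalone):

```shell
mkdir -p build && cd build
cmake ..
make
./use_ceres
```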

When run, the solver converges from the initial value x = 0.5 to the optimum x = 10 within a few iterations, as shown in the BriefReport and the final "x : 0.5 -> 10" line.