Using the Nonlinear Optimization Library Ceres (2)

Reference: the official tutorial at http://www.ceres-solver.org/nnls_tutorial.html

Hello World!

Consider the following optimization problem:

    min_x  1/2 * (10 - x)^2

Step 1: write a cost functor.

struct CostFunctor {
  template <typename T>
  bool operator()(const T* const x, T* residual) const {
    residual[0] = T(10.0) - x[0];
    return true;
  }
};

operator() is a templated method, so its input and output can have different scalar types; this is what lets Ceres call it both with plain double and with its automatic-differentiation type. (Function templates are covered in C++ Primer Plus.)

Step 2: with the residual functor in place, we can build a nonlinear least-squares problem.

int main(int argc, char** argv) {
  google::InitGoogleLogging(argv[0]);  // Initializes glog, which Ceres uses for logging.
                                       // The example still runs without it, which is likely why
                                       // it does not appear in Gao Xiang's "14 Lectures on Visual SLAM".

  // The variable to solve for with its initial value.
  double initial_x = 5.0;
  double x = initial_x;

  // Build the problem.
  Problem problem;

  // Set up the only cost function (also known as residual). This uses
  // auto-differentiation to obtain the derivative (jacobian).
  CostFunction* cost_function =
      new AutoDiffCostFunction<CostFunctor, 1, 1>(new CostFunctor);
  problem.AddResidualBlock(cost_function, NULL, &x);  // cost function, loss function (NULL = none), parameter to optimize

  // Run the solver!
  Solver::Options options;
  options.linear_solver_type = ceres::DENSE_QR;  // solve the linear system via QR decomposition
  options.minimizer_progress_to_stdout = true;   // print iteration progress to stdout
  Solver::Summary summary;                       // optimization report
  Solve(options, &problem, &summary);            // run the optimization

  std::cout << summary.BriefReport() << "\n";
  std::cout << "x : " << initial_x
            << " -> " << x << "\n";
  return 0;
}

The complete code is shown below:

#include "ceres/ceres.h"
#include "glog/logging.h"
using ceres::AutoDiffCostFunction;
using ceres::CostFunction;
using ceres::Problem;
using ceres::Solver;
using ceres::Solve;
// A templated cost functor that implements the residual r = 10 -
// x. The method operator() is templated so that we can then use an
// automatic differentiation wrapper around it to generate its
// derivatives.

struct CostFunctor {
  template <typename T> bool operator()(const T* const x, T* residual) const {
    residual[0] = 10.0 - x[0];
    return true;
  }
};
int main(int argc, char** argv) {
  google::InitGoogleLogging(argv[0]);
  // The variable to solve for with its initial value. It will be
  // mutated in place by the solver.

  double x = 0.5;
  const double initial_x = x;
  // Build the problem.
  Problem problem;
  // Set up the only cost function (also known as residual). This uses
  // auto-differentiation to obtain the derivative (jacobian).

  CostFunction* cost_function =
      new AutoDiffCostFunction<CostFunctor, 1, 1>(new CostFunctor);
  problem.AddResidualBlock(cost_function, NULL, &x);
  // Run the solver!
  Solver::Options options;
  options.minimizer_progress_to_stdout = true;
  Solver::Summary summary;
  Solve(options, &problem, &summary);
  std::cout << summary.BriefReport() << "\n";
  std::cout << "x : " << initial_x
            << " -> " << x << "\n";
  return 0;
}

The CMakeLists.txt file is as follows:

cmake_minimum_required(VERSION 2.8)
project(ceres)
#set(CMAKE_MODULE_PATH ${PROJECT_SOURCE_DIR}/cmake_modules)
find_package(Ceres REQUIRED)
include_directories(${CERES_INCLUDE_DIRS})
add_executable(use_ceres main.cpp)
target_link_libraries(use_ceres ${CERES_LIBRARIES})

Running it, the solver prints its iteration progress and a brief report, converging from the initial value x = 0.5 to x ≈ 10 in a few iterations.
Reprinted from blog.csdn.net/qq_27806947/article/details/80228218