OpenCV image affine transformation

The rotation of an OpenCV image is achieved through an affine transformation of the image, and it is divided into three steps (a minimal code sketch follows the steps):

Step 1: Determine the rotation angle and rotation center.

Step 2: Compute the rotation matrix. It is calculated with the getRotationMatrix2D function.

Step 3: Rotate through the affine transformation. It is carried out by the warpAffine function.
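
A minimal sketch of the three steps (the image path and the 30-degree angle are placeholders chosen only for illustration):

#include <opencv2/opencv.hpp>
using namespace cv;

int main()
{
	Mat img = imread("test.png", IMREAD_COLOR);        // placeholder path
	if (img.empty()) return -1;

	// Step 1: rotation angle (degrees) and rotation center (the image center)
	double angle = 30.0;
	Point2f center(img.cols / 2.0f, img.rows / 2.0f);

	// Step 2: 2x3 rotation matrix
	Mat M = getRotationMatrix2D(center, angle, 1.0);

	// Step 3: apply the affine transformation
	Mat dst;
	warpAffine(img, dst, M, img.size());

	imwrite("rotated.png", dst);
	return 0;
}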

1. getRotationMatrix2D function

Prototype:

CV_EXPORTS_W Mat getRotationMatrix2D(Point2f center, double angle, double scale);

Function: calculates the affine matrix of a two-dimensional rotation.

The function computes the following matrix:
$$\begin{bmatrix} \alpha & \beta & (1-\alpha) \cdot \texttt{center.x} - \beta \cdot \texttt{center.y} \\ -\beta & \alpha & \beta \cdot \texttt{center.x} + (1-\alpha) \cdot \texttt{center.y} \end{bmatrix}$$
where:
$$\alpha = \texttt{scale} \cdot \cos\texttt{angle}, \qquad \beta = \texttt{scale} \cdot \sin\texttt{angle}$$
The transformation maps the rotation center to itself. If this is not the desired behavior, adjust the shift (the last column of the matrix).
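
For example, with angle = 90° and scale = 1 (values chosen here only to illustrate the formula), α = 0 and β = 1, so the matrix becomes

$$\begin{bmatrix} 0 & 1 & \texttt{center.x} - \texttt{center.y} \\ -1 & 0 & \texttt{center.x} + \texttt{center.y} \end{bmatrix}$$

and one can check that the point (center.x, center.y) is indeed mapped to itself.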

Parameters

  • center: the center of the rotation in the source image.
  • angle: the rotation angle in degrees. A positive value means a counterclockwise rotation (the coordinate origin is assumed to be the top-left corner).
  • scale: isotropic scale factor for both axes; 1 means no scaling.

See also: getAffineTransform, warpAffine, transform

2. warpAffine function

Prototype:

CV_EXPORTS_W void warpAffine( InputArray src, OutputArray dst,
                              InputArray M, Size dsize,
                              int flags = INTER_LINEAR,
                              int borderMode = BORDER_CONSTANT,
                              const Scalar& borderValue = Scalar());

Function: applies an affine transformation to an image.

When the flag WARP_INVERSE_MAP is set, the function warpAffine transforms the image using the following formula:
$$\texttt{dst}(x,y) = \texttt{src}\big(\texttt{M}_{11} x + \texttt{M}_{12} y + \texttt{M}_{13},\ \texttt{M}_{21} x + \texttt{M}_{22} y + \texttt{M}_{23}\big)$$
Otherwise, the transformation is first inverted with invertAffineTransform and the result is then substituted for M in the formula above. The function cannot operate in-place.

Parameters

  • src: input image.
  • dst: output image; it has the size dsize and the same type as src.
  • M: 2×3 transformation matrix.
  • dsize: size of the output image.
  • flags: combination of an interpolation method (see InterpolationFlags) and the optional flag WARP_INVERSE_MAP, which means that M is the inverse transformation (dst → src); a short sketch after this list demonstrates the flag.
  • borderMode: pixel extrapolation method (see BorderTypes); when borderMode = BORDER_TRANSPARENT, the pixels in the destination image that correspond to "outliers" in the source image are not modified by the function.
  • borderValue: value used in case of a constant border; by default it is 0.

See also: warpPerspective, resize, remap, getRectSubPix, transform
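
A short sketch of the optional parameters (the image path and the 30-degree rotation are assumptions made for illustration). The first call fills the area outside the source image with gray through borderValue; the second call passes the same matrix with WARP_INVERSE_MAP, so warpAffine treats it as the dst → src mapping and undoes the rotation:

#include <opencv2/opencv.hpp>
using namespace cv;

int main()
{
	Mat src = imread("test.png", IMREAD_COLOR);        // placeholder path
	if (src.empty()) return -1;

	Point2f center(src.cols / 2.0f, src.rows / 2.0f);
	Mat M = getRotationMatrix2D(center, 30, 1.0);

	// Forward mapping: pixels outside the source are filled with gray (128,128,128)
	Mat rotated;
	warpAffine(src, rotated, M, src.size(), INTER_LINEAR, BORDER_CONSTANT, Scalar(128, 128, 128));

	// Inverse mapping: WARP_INVERSE_MAP tells warpAffine that M maps dst to src,
	// so this call undoes the rotation without calling invertAffineTransform explicitly
	Mat restored;
	warpAffine(rotated, restored, M, src.size(), INTER_LINEAR | WARP_INVERSE_MAP);

	imwrite("rotated.png", rotated);
	imwrite("restored.png", restored);
	return 0;
}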

3. Source code example

1. Rotate

#include <opencv2/opencv.hpp>
#include <iostream>

using namespace cv;
using namespace std;

int main()
{
	Mat src = imread("D:\\OpenCVtest\\images\\juice.png", 1);
	if (src.empty())
	{
		cout << "could not load image..." << endl;
		return -1;
	}

	double rotate_angle = 15;                                                    // rotation angle in degrees
	Point2f rotate_center(src.cols / 2.0f, src.rows / 2.0f);                     // rotation center (x = column, y = row)
	Mat rotate_matrix = getRotationMatrix2D(rotate_center, rotate_angle, 1);     // compute the rotation matrix
	Size dstSize(src.cols, src.rows);
	Mat image_dst;
	warpAffine(src, image_dst, rotate_matrix, dstSize);                          // affine transformation
	imshow("source image", src);
	imshow("affine transform 0", image_dst);

	waitKey();
	return 0;
}

Running result: (screenshot omitted)
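
As noted in the getRotationMatrix2D section, the rotation center is mapped to itself, so with dstSize equal to the source size the corners of the rotated image are cut off. A possible sketch (an extension, not part of the original example) that adjusts the offset in the last column of the matrix so the whole rotated image fits into an enlarged output:

#include <opencv2/opencv.hpp>
using namespace cv;

int main()
{
	Mat src = imread("D:\\OpenCVtest\\images\\juice.png", 1);
	if (src.empty()) return -1;

	double angle = 15.0;
	Point2f center(src.cols / 2.0f, src.rows / 2.0f);
	Mat M = getRotationMatrix2D(center, angle, 1.0);

	// Bounding box of the source rectangle after rotation
	Rect bbox = RotatedRect(center, src.size(), (float)angle).boundingRect();

	// Shift the transform so the rotated image lies inside the enlarged canvas
	M.at<double>(0, 2) += bbox.width / 2.0 - center.x;
	M.at<double>(1, 2) += bbox.height / 2.0 - center.y;

	Mat dst;
	warpAffine(src, dst, M, bbox.size());
	imshow("full rotated image", dst);
	waitKey();
	return 0;
}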

2. Three-point mapping

#include <opencv2/opencv.hpp>
#include <iostream>

using namespace cv;
using namespace std;

int main()
{
	Mat src = imread("D:\\OpenCVtest\\images\\juice.png", 1);
	if (src.empty())
	{
		cout << "could not load image..." << endl;
		return -1;
	}
	Size dstSize(src.cols, src.rows);

	// three points in the source image (x = column, y = row)
	Point2f srcPoints[3];
	Point2f dstPoints[3];
	srcPoints[0] = Point2f(0, 0);                                           // top-left corner
	srcPoints[1] = Point2f((float)(src.cols - 1), 0);                       // top-right corner
	srcPoints[2] = Point2f((float)(src.cols - 1), (float)(src.rows - 1));   // bottom-right corner

	// the three corresponding points after the transformation
	dstPoints[0] = Point2f((float)(src.cols) * 0.25f, (float)(src.rows) * 0.15f);
	dstPoints[1] = Point2f((float)(src.cols) * 0.65f, (float)(src.rows) * 0.25f);
	dstPoints[2] = Point2f((float)(src.cols) * 0.70f, (float)(src.rows) * 0.70f);

	// compute the affine matrix from the point correspondences
	Mat rotate_matrix_1 = getAffineTransform(srcPoints, dstPoints);
	Mat image_warp;
	warpAffine(src, image_warp, rotate_matrix_1, dstSize);
	imshow("source image", src);
	imshow("three-point affine transform", image_warp);

	waitKey();
	return 0;
}

Running result: (screenshot omitted)

Origin: blog.csdn.net/jndingxin/article/details/120670489