(OpenCV) Image geometric transformations - affine transformation

An affine transformation of an image maps one two-dimensional coordinate to another two-dimensional coordinate in a Cartesian coordinate system. It is a linear transformation (a matrix multiplication) combined with a translation, and it is mainly used to implement geometric operations such as translation, scaling, flipping, rotation and shearing.

An affine transformation maps the original image coordinates (x, y) to new coordinates (x', y'). The common affine transformations are as follows:
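For reference, the mapping can be written as a 2x2 linear part A plus a translation vector t; the familiar special cases correspond to particular choices of the entries (a standard textbook sketch, not specific to OpenCV):

\begin{pmatrix} x' \\ y' \end{pmatrix} =
\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}
\begin{pmatrix} x \\ y \end{pmatrix} +
\begin{pmatrix} t_x \\ t_y \end{pmatrix}

% typical special cases of the linear part A:
% translation:     A = I,                       t = (t_x, t_y)
% scaling:         A = \mathrm{diag}(s_x, s_y), t = 0
% rotation:        A = \begin{pmatrix}\cos\theta & -\sin\theta \\ \sin\theta & \cos\theta\end{pmatrix}
% horizontal flip: A = \mathrm{diag}(-1, 1)
% shear:           A = \begin{pmatrix}1 & k \\ 0 & 1\end{pmatrix}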

Affine transformation can therefore implement most of the common geometric transformation operations on images.

When performing an affine transformation with OpenCV, an affine transformation matrix is computed first. The getRotationMatrix2D function converts the angle into radians and then builds the rotation matrix:

Mat getRotationMatrix2D(Point2f center, double angle, double scale)
{
	// convert the angle from degrees to radians
	angle *= CV_PI / 180;
	// rotation matrix entries: cosine and sine scaled by the zoom factor
	double alpha = cos(angle) * scale;
	double beta = sin(angle) * scale;
	Mat M(2, 3, CV_64F);
	double* m = (double*)M.data;
	// fill in the 2x3 rotation matrix
	m[0] = alpha;
	m[1] = beta;
	m[2] = (1 - alpha) * center.x - beta * center.y;
	m[3] = -beta;
	m[4] = alpha;
	m[5] = beta * center.x + (1 - alpha) * center.y;
	return M;
}
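The matrix this function fills in keeps center fixed: it is the composition "translate the center to the origin, rotate and scale, translate back", which in closed form is

M = \begin{pmatrix} \alpha & \beta & (1-\alpha)\,c_x - \beta\,c_y \\ -\beta & \alpha & \beta\,c_x + (1-\alpha)\,c_y \end{pmatrix},
\qquad \alpha = s\cos\theta,\; \beta = s\sin\theta

where (c_x, c_y) is the rotation center, theta the angle (positive values rotate counter-clockwise) and s the scale factor.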

Once the two-dimensional rotation matrix has been computed, the affine transformation of the source image can be carried out with the warpAffine function.

Application example (the getRotationMatrix2D implementation shown above is part of OpenCV and does not need to be copied into the following code):

#include<opencv2/imgproc/imgproc.hpp>
#include<opencv2/core/core.hpp>
#include<opencv2/highgui/highgui.hpp>
#include<iostream>
using namespace std;
using namespace cv;


int main()
{
	Mat src = imread("C:\\Users\\32498\\Pictures\\16.png");
	if (src.empty())  // image failed to load
	{
		return -1;
	}
	imshow("src", src);
	// ① affine transformation defined by three point correspondences
	int nRows = src.rows;
	int nCols = src.cols;
	// point arrays defining the affine transformation:
	// three corresponding points in the source and destination images
	Point2f srcPoint[3];
	Point2f resPoint[3];

	srcPoint[0] = Point2f(0, 0);
	srcPoint[1] = Point2f(nCols - 1, 0);
	srcPoint[2] = Point2f(0, nRows - 1);
	resPoint[0] = Point2f(nCols * 0, nRows * 0.33);
	resPoint[1] = Point2f(nCols * 0.85, nRows * 0.25);
	resPoint[2] = Point2f(nCols * 0.15, nRows * 0.7);

	// 2 x 3 affine transformation matrix
	Mat warpMat(2, 3, CV_64F);
	Mat result = Mat::zeros(nRows, nCols, src.type());
	// compute the affine transformation matrix from the point correspondences
	warpMat = getAffineTransform(srcPoint, resPoint);
	// apply the affine transformation to the image
	warpAffine(src, result, warpMat, result.size());
	imshow("result", result);
	waitKey();

	// ② rotation (angle-based) affine transformation
	

	// set the affine transformation parameters
	Point2f centerPoint = Point2f(nCols / 2.0f, nRows / 2.0f);
	double angle = -50;   // negative angle rotates clockwise (positive is counter-clockwise)
	double scale = 0.7;

	// get the rotation matrix
	warpMat = getRotationMatrix2D(centerPoint, angle, scale);

	// apply the affine transformation to the source image
	warpAffine(src, result, warpMat, result.size());
	imshow("result", result);
	waitKey();
	return 0;

}

The results are shown below: Figure 2 is the three-point affine transformation and Figure 3 is the rotation-based affine transformation (part of the image falls outside the output canvas and is therefore lost).
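The information loss in the rotation case happens because warpAffine keeps the original canvas size, so the corners of the rotated image fall outside the output. A minimal sketch of a common workaround, assuming scale = 1 (the function name is illustrative, not part of the article's code): enlarge the output canvas to the rotated bounding box and shift the matrix accordingly.

#include<opencv2/imgproc/imgproc.hpp>
#include<opencv2/core/core.hpp>
using namespace cv;

// Rotate src about its center by `angle` degrees without cropping the corners
// (sketch assuming scale = 1; "rotateWithoutCropping" is an illustrative name).
Mat rotateWithoutCropping(const Mat& src, double angle)
{
	Point2f center(src.cols / 2.0f, src.rows / 2.0f);
	Mat rot = getRotationMatrix2D(center, angle, 1.0);
	// bounding box of the image after rotating it about its own center
	Rect bbox = RotatedRect(Point2f(), src.size(), (float)angle).boundingRect();
	// shift the transform so the rotated image is centered in the enlarged canvas
	rot.at<double>(0, 2) += bbox.width / 2.0 - center.x;
	rot.at<double>(1, 2) += bbox.height / 2.0 - center.y;
	Mat dst;
	warpAffine(src, dst, rot, bbox.size());
	return dst;
}

Called on the same src with angle = -50, this keeps the whole rotated image visible at the cost of a larger output; with scale != 1 the bounding box would have to be scaled as well.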

 


Origin: blog.csdn.net/yangSHU21/article/details/131166431