Machine Learning Notes - Determinant

I. Overview

        A matrix can be thought of as a linear transformation of space. The determinant of a matrix A is a number that gives the factor by which this transformation scales areas (or volumes in higher dimensions). A negative determinant means that the orientation of space has been flipped, so the transformation is not just a rescaling and/or rotation; in a 2D plane, for example, a change of orientation is a mirroring.

        Here is an example that differentiates between positive and negative determinants:

[Figure: the determinant of a matrix describes the transformation associated with the matrix]

        It can be seen that the transformation corresponding to the negative determinant cannot be obtained by rotation and scaling.

        Also, the determinant gives you the amount of transformation. If you take the n-dimensional unit cube and apply matrix A to it, the absolute value of the determinant is the volume of the transformed shape (its area, in two dimensions).
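        As a quick illustration of the determinant as a scaling factor, here is a minimal sketch (the matrix M below is a made-up example, not one used later in these notes) comparing the 2x2 formula a*d - b*c with NumPy's np.linalg.det:

import numpy as np

# Hypothetical 2x2 matrix, used only to illustrate the 2x2 determinant formula.
M = np.array([[3., 1.],
              [2., 4.]])

# For a matrix [[a, b], [c, d]] the determinant is a*d - b*c.
a, b = M[0]
c, d = M[1]
print(a * d - b * c)     # 10.0
print(np.linalg.det(M))  # ~10.0 (same value, up to floating-point rounding)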

II. Example 1

        To see how a transformation changes areas, we will use a simple square in two dimensions: the unit square spanned by the two unit vectors i and j, whose lengths can be computed with the Pythagorean theorem.

        The lengths of i and j are 1, so the area of the unit square is 1.
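        As a small numerical check (a sketch assuming the standard basis vectors defined just below), the lengths can be computed with np.linalg.norm, which is exactly this Pythagorean formula:

import numpy as np

i = np.array([1, 0])
j = np.array([0, 1])

# Pythagorean theorem: ||v|| = sqrt(v_x**2 + v_y**2)
print(np.linalg.norm(i))  # 1.0
print(np.linalg.norm(j))  # 1.0

# The square spanned by i and j therefore has area 1 * 1 = 1.
print(np.linalg.norm(i) * np.linalg.norm(j))  # 1.0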

        First, let's create a function plotVectors() to plot vectors:

import numpy as np
import matplotlib.pyplot as plt

def plotVectors(vecs, cols, alpha=1):
    """Plot 2D vectors as arrows starting at the origin."""
    plt.figure()
    # Draw the coordinate axes in gray.
    plt.axvline(x=0, color='#A9A9A9', zorder=0)
    plt.axhline(y=0, color='#A9A9A9', zorder=0)

    for i in range(len(vecs)):
        # [x_start, y_start, x_end, y_end] for the i-th vector.
        x = np.concatenate([[0, 0], vecs[i]])
        plt.quiver([x[0]],
                   [x[1]],
                   [x[2]],
                   [x[3]],
                   angles='xy', scale_units='xy', scale=1, color=cols[i],
                   alpha=alpha)

        Let's start by creating the two unit vectors in Python and plotting them:

orange = '#FF9A13'
blue = '#1190FF'

i = [1, 0]
j = [0, 1]

plotVectors([i, j], [[blue], [orange]])
plt.xlim(-0.5, 3)
plt.ylim(-0.5, 3)
plt.show()

        We apply the matrix A = [[2, 0], [0, 2]] to i and j. This is a diagonal matrix, so it rescales our space without rotating it. More precisely, it rescales each dimension by the same factor, since the diagonal values are equal. Let's create matrix A and apply it:

A = np.array([[2, 0], [0, 2]])

new_i = A.dot(i)
new_j = A.dot(j)
plotVectors([new_i, new_j], [[blue], [orange]])
plt.xlim(-0.5, 3)
plt.ylim(-0.5, 3)
plt.show()

        As expected, we can see that i and j are not rotated, but their lengths are doubled.

         We will now compute the determinant of A (details about determinant computation are not covered here):

np.linalg.det(A)
# Result: 4.0

        As you can see, the transformation multiplies the area of the unit square by 4: the new i and the new j each have length 2, so 2⋅2=4.
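        We can verify this relationship directly (a small check reusing A, new_i and new_j from the code above): the transformed vectors have length 2, and the area of the square they span equals the absolute value of the determinant.

# The transformed basis vectors each have length 2...
print(np.linalg.norm(new_i), np.linalg.norm(new_j))  # 2.0 2.0

# ...and the area of the parallelogram they span, |x1*y2 - y1*x2|, equals |det(A)|.
area = abs(new_i[0] * new_j[1] - new_i[1] * new_j[0])
print(area)              # 4
print(np.linalg.det(A))  # 4.0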

III. Example 2

        Now let's look at an example of a negative determinant.

        We will transform the unit square with the matrix B = [[-2, 0], [0, 2]], whose determinant is -4:

B = np.array([[-2, 0], [0, 2]])
np.linalg.det(B)
# Result: -4.0

        Now let's plot the transformed vectors:

new_i_1 = B.dot(i)
new_j_1 = B.dot(j)
plotVectors([new_i_1, new_j_1], [[blue], [orange]])
plt.xlim(-3, 0.5)
plt.ylim(-0.5, 3)
plt.show()

        We can see that the matrices with determinants 4 and -4 modify the area of the unit square in the same way.

        The absolute value of the determinant shows that, as in the first example, the area of the new square is 4 times the area of the unit square. But this time it is not only a change of size: the transformation also mirrors the square.
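        A quick numerical check (reusing A and B from above): the two determinants have the same absolute value, and only the sign records the flip in orientation.

print(np.linalg.det(A))  # 4.0  -> scales areas by 4, keeps orientation
print(np.linalg.det(B))  # -4.0 -> scales areas by 4, flips orientation
print(np.isclose(abs(np.linalg.det(A)), abs(np.linalg.det(B))))  # True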

        The mirroring is hard to see with unit vectors alone, so let's use a few sample points and transform them with the matrix C = [[-1, 0], [0, 1]]:

points = np.array([[1, 3], [2, 2], [3, 1], [4, 7], [5, 4]])

C = np.array([[-1, 0], [0, 1]])
np.linalg.det(C)
# Result: -1.0

        Since the determinant is -1, its absolute value is 1 and areas do not change. However, because it is negative, we get a transformation that cannot be obtained by rotation and scaling alone:

newPoints = points.dot(C)

plt.figure()
plt.plot(points[:, 0], points[:, 1])
plt.plot(newPoints[:, 0], newPoints[:, 1])
plt.show()
        We can see that the transformation mirrored the shape across the vertical (y) axis.
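        We can confirm this numerically with the shoelace formula (the helper below is a small sketch, not part of the original code): a linear map multiplies the signed shoelace value of the closed polygon through the points by det(C), so with det(C) = -1 the magnitude stays the same and only the sign, i.e. the orientation, flips.

def shoelace(pts):
    # Signed shoelace value of the closed polygon through the rows of pts.
    x, y = pts[:, 0], pts[:, 1]
    return 0.5 * np.sum(x * np.roll(y, -1) - y * np.roll(x, -1))

print(shoelace(points))     # 0.5
print(shoelace(newPoints))  # -0.5 (same magnitude, opposite orientation)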

        We have seen that the determinant of a matrix is a special value that summarizes how the transformation associated with the matrix scales space and whether it flips its orientation.


Origin: blog.csdn.net/bashendixie5/article/details/124302324