OpenGL rendering pipeline and the mutual transformation between 2D and 3D coordinates

A recent project required me to understand the OpenGL pipeline and the transformation between two-dimensional and three-dimensional coordinates, so I summarize both here.

Rendering pipeline

The OpenGL rendering pipeline is the most basic and most important part of learning OpenGL. Although I have written OpenGL for a while, most of that time was spent writing fragment and vertex shaders, and I never summarized the overall pipeline, so I do that here. Note: the summary below is based on the programmable pipeline.

  1. Vertex Input
    The first step of every OpenGL program is to read in the model data, including vertex coordinates, texture coordinates, normal coordinates, and material/texture information. After reading the vertex data, allocate space on the graphics card, use glBindBuffer to transfer the vertex data to the graphics card, and use glVertexAttribPointer and glEnableVertexAttribArray to describe the memory layout (a buffer-setup sketch follows this list).
  2. Vertex Shading
    Vertex shading is the first stage computed on the graphics card: the vertex shader we write. In this stage we transform the position of each vertex to get its correct final position: the model transform takes it to world coordinates, the view transform to camera coordinates, and the projection transform to projected coordinates. In the fixed pipeline we only need to set the corresponding matrices and the transformation happens automatically. The output of the vertex shader is gl_Position, the final position of the vertex.
  3. Tessellation Shading / Geometry Shading
    This stage runs the tessellation and geometry shaders, which can further transform vertices, create new primitives, subdivide primitives, and so on. I have not written these two shaders myself and don't know much about them; see the "OpenGL Programming Guide".
  4. Primitive Assembly
    The vertex shading stage gives us vertices, but what we actually render are primitives such as triangles, so the vertices need to be assembled into the chosen primitive type: triangles, lines, or points.
  5. Clipping
    After the projection transformation, the model has been transformed into NDC (normalized device coordinates; the visible range is the [-1,1] cube). Anything outside the cube is not visible and is clipped away.
  6. Rasterization
    Rasterization is the core of raster rendering. The screen can be understood as a grid of pixels, and rasterization determines which pixels each primitive covers. During rasterization, the data output by the vertex shader (coordinates, normals, etc.) is interpolated across the primitive. After rasterization, positions have been transformed to screen coordinates.
  7. Fragment Shading
    Fragment shading computes the final color of each fragment. Rasterization produces many fragments (note: one pixel can correspond to multiple fragments). Using the data passed down from the vertex shader (interpolated during rasterization), such as vertex coordinates, normals, and texture coordinates, we compute the final shading of each fragment.
  8. Post-Processing
    The post-processing stage includes depth testing, stencil testing, blending, and other per-fragment operations (the state sketch after this list shows the corresponding GL calls).
    • Depth test: every fragment has a depth value, but only one fragment per pixel can be displayed. By default the depth buffer keeps the fragment with the smallest depth, i.e., fragments hidden behind others are discarded and only the closest one is kept.
    • Stencil test: perform the configured stencil operation and update according to the stencil settings, and discard fragments that fail the test.
    • Blending: blending is mainly used to render transparent objects. Instead of keeping only the closest fragment as the depth test does, the color buffer accumulates contributions from several fragments. Blending and depth testing therefore conflict to some extent, so they must be combined carefully.
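
To make step 1 concrete, here is a minimal sketch of uploading vertex data, assuming a modern (3.x+) context and a hypothetical interleaved layout of position plus texture coordinate per vertex:

```cpp
// Hypothetical interleaved vertex data: position (3 floats) + texcoord (2 floats).
float vertices[] = {
    // x,     y,    z,    u,    v
    -0.5f, -0.5f, 0.0f, 0.0f, 0.0f,
     0.5f, -0.5f, 0.0f, 1.0f, 0.0f,
     0.0f,  0.5f, 0.0f, 0.5f, 1.0f,
};

GLuint vao, vbo;
glGenVertexArrays(1, &vao);
glBindVertexArray(vao);
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);                 // bind the buffer on the GPU
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices),
             vertices, GL_STATIC_DRAW);             // upload the vertex data

// Describe the memory layout: attribute 0 = position, attribute 1 = texcoord.
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)0);
glEnableVertexAttribArray(0);
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(float),
                      (void*)(3 * sizeof(float)));
glEnableVertexAttribArray(1);
```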
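
The post-processing tests are fixed-function but configured through GL state. A sketch of typical state calls for the three operations above, including one common way to reconcile blending with depth testing:

```cpp
// Typical state for the three post-processing operations above.
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LESS);       // keep the fragment with the smallest depth
glEnable(GL_STENCIL_TEST);  // stencil test (needs a stencil buffer attached)
glEnable(GL_BLEND);         // standard "over" blending for transparency
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

// One common way to reconcile blending with depth testing: draw opaque
// objects first, then draw transparent objects back-to-front with depth
// writes disabled so they do not occlude each other.
glDepthMask(GL_FALSE);
// ... draw transparent objects ...
glDepthMask(GL_TRUE);
```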


Coordinate transformation

Transformation from 3D coordinates to 2D coordinates

3D to 2D is the forward process of raster rendering. Note that p is updated in place at every step below; a GLM sketch of the whole chain follows the list.

  1. Start from local coordinates and convert to homogeneous coordinates: p = glm::vec4(p_input, 1.0)
  2. Model transformation, to world coordinates: p = model * p
  3. View transformation, to camera space: p = view * p
  4. Projection transformation, to projection (clip) space: p = project * p
  5. Perspective division, removing the homogeneous coordinate: p /= p.w
  6. Clipping: keep only the range within [-1,1]
  7. Viewport transform: map x,y from [-1,1] to [0,0]->[w,h] (the window size), and map z from [-1,1] to [0,1]
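
Putting the seven steps together, a minimal GLM sketch (the function name and parameters are illustrative, not from any particular library):

```cpp
#include <glm/glm.hpp>

// Forward transform of one point from local space to window coordinates,
// following steps 1-7 above. model, view, project and the window size
// (w, h) are assumed to be supplied by the application.
glm::vec3 localToScreen(const glm::vec3& p_input,
                        const glm::mat4& model, const glm::mat4& view,
                        const glm::mat4& project, float w, float h)
{
    glm::vec4 p = glm::vec4(p_input, 1.0f); // 1. homogeneous coordinates
    p = model * p;                          // 2. world coordinates
    p = view * p;                           // 3. camera space
    p = project * p;                        // 4. projection (clip) space
    p /= p.w;                               // 5. perspective division -> NDC
    // 6. clipping: p would be discarded here if any component left [-1,1]
    float sx = (p.x * 0.5f + 0.5f) * w;     // 7. viewport transform for x,y
    float sy = (p.y * 0.5f + 0.5f) * h;
    float sz =  p.z * 0.5f + 0.5f;          //    z mapped from [-1,1] to [0,1]
    return glm::vec3(sx, sy, sz);
}
```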

Transformation from 2D coordinates to 3D coordinates

2D to 3D is the inverse process of raster rendering, used to recover the 3D position corresponding to each rendered pixel (see the sketch after this list).

  1. Given screen coordinates [x,y], use glReadPixels to read the depth z at that position. Note that the y direction needs to be flipped, because the OpenGL origin is in the lower-left corner.
  2. Transform to NDC: map the screen coordinates [x,y] to [-1,1], and map z from [0,1] to [-1,1].
  3. Add the homogeneous coordinate: p = glm::vec4(x, y, z, 1.0).
  4. Apply the inverse transformation, where P, V, M are the projection, view, and model matrices respectively: p = glm::inverse(P * V * M) * p
  5. Perform the perspective division to remove the homogeneous coordinate, p /= p.w, and obtain the original three-dimensional coordinates.
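
A GLM sketch of the whole inverse path, assuming the input pixel coordinate has a top-left origin (e.g. from mouse input); the function name is illustrative:

```cpp
#include <GL/gl.h>
#include <glm/glm.hpp>

// Inverse transform: recover the 3D local-space position under a pixel,
// following steps 1-5 above. P, V, M are the projection, view, and model
// matrices that were used for rendering; winW/winH is the window size.
glm::vec3 screenToLocal(int x, int y, int winW, int winH,
                        const glm::mat4& P, const glm::mat4& V,
                        const glm::mat4& M)
{
    int glY = winH - 1 - y;                 // 1. flip y: GL origin is bottom-left
    float z;
    glReadPixels(x, glY, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &z);

    glm::vec4 p(2.0f * x / winW - 1.0f,     // 2. x,y to [-1,1]
                2.0f * glY / winH - 1.0f,
                2.0f * z - 1.0f,            //    z from [0,1] to [-1,1]
                1.0f);                      // 3. homogeneous coordinate
    p = glm::inverse(P * V * M) * p;        // 4. inverse of the whole MVP chain
    return glm::vec3(p / p.w);              // 5. perspective division
}
```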

Note:
1. The transformations here assume a programmable pipeline, i.e., you perform the model-view-projection transformation yourself, so you only need to keep the corresponding transformation matrices around.
2. With the fixed pipeline, use glGetDoublev to obtain the corresponding matrices, and then use gluUnProject for the inverse transformation; it is a function from the glu library (a sketch follows).
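
For note 2, a sketch of the fixed-pipeline path; the function name is illustrative, and a compatibility-profile context with the GLU library is assumed:

```cpp
#include <GL/glu.h>

// Fixed-pipeline version: fetch the matrices from GL state and let
// gluUnProject do the inverse transformation. (x, y) is the queried pixel
// with a top-left origin, as in the steps above.
void screenToWorldFixed(int x, int y, GLdouble out[3])
{
    GLdouble modelview[16], projection[16];
    GLint viewport[4];
    glGetDoublev(GL_MODELVIEW_MATRIX, modelview);
    glGetDoublev(GL_PROJECTION_MATRIX, projection);
    glGetIntegerv(GL_VIEWPORT, viewport);

    GLint glY = viewport[3] - 1 - y;        // flip y as before
    GLfloat z;
    glReadPixels(x, glY, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &z);

    gluUnProject(x, glY, z, modelview, projection, viewport,
                 &out[0], &out[1], &out[2]);
}
```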

If there are any mistakes, please correct me~
