OpenGL pipeline overview

   The OpenGL pipeline looks like this:

                   Read vertex data
                   -> execute vertex shader
                   -> primitive assembly
                   -> rasterize primitives
                   -> execute fragment shader
                   -> per-fragment operations
                   -> framebuffer operations

    1. Read vertex data. Specify the vertices of the primitive through glDrawArrays or glDrawElements.
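As a minimal sketch of this step, assuming a current GL context and an already linked shader program (how you include GL headers and load function pointers — glad, GLEW, etc. — varies by platform), a triangle's vertex data might be handed to the pipeline like this:

        /* Three 2D vertices of a triangle, interleaved x,y. */
        static const GLfloat triangle[] = {
            -0.5f, -0.5f,
             0.5f, -0.5f,
             0.0f,  0.5f,
        };

        void draw_triangle(GLuint program)   /* 'program' is a linked shader program */
        {
            GLuint vbo;
            glGenBuffers(1, &vbo);
            glBindBuffer(GL_ARRAY_BUFFER, vbo);
            glBufferData(GL_ARRAY_BUFFER, sizeof triangle, triangle, GL_STATIC_DRAW);

            /* Attribute 0 is assumed to be the shader's position input. */
            glEnableVertexAttribArray(0);
            glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, (const void *)0);

            glUseProgram(program);
            glDrawArrays(GL_TRIANGLES, 0, 3);   /* read 3 vertices, starting at 0 */
        }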

    2. Execute the vertex shader. The shader runs once per vertex; the work that can be done at this stage includes:

        1. Vertex transformation: transform each vertex by the model-view and projection matrices.

        2. Lighting calculation and normal transformation (the normal matrix is the inverse transpose of the upper-left 3×3 of the model-view matrix), plus renormalization of the normals.

        3. Texture coordinate transformation (the texture matrix).

        4. Material state: texture coordinate generation.

This stage receives the original data of each vertex and outputs the transformed vertex data.
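A minimal vertex shader covering the first two of these jobs might look like this, stored as a C string ready for glShaderSource (the uniform, attribute, and varying names are illustrative, not from the original article):

        /* Minimal GLSL vertex shader as a C string: model-view/projection
           transform plus normal transformation. Names are illustrative. */
        static const char *vertex_shader_src =
            "uniform mat4 u_modelView;\n"
            "uniform mat4 u_projection;\n"
            "uniform mat3 u_normalMatrix; /* inverse transpose of mat3(modelView) */\n"
            "attribute vec3 a_position;\n"
            "attribute vec3 a_normal;\n"
            "attribute vec2 a_texCoord;\n"
            "varying vec3 v_normal;\n"
            "varying vec2 v_texCoord;\n"
            "void main() {\n"
            "    v_normal    = normalize(u_normalMatrix * a_normal);\n"
            "    v_texCoord  = a_texCoord;\n"
            "    gl_Position = u_projection * u_modelView * vec4(a_position, 1.0);\n"
            "}\n";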

     3. Primitive assembly. At this stage the vertex data has been determined, and vertices are assembled into primitives according to the primitive type, such as GL_POINTS, GL_TRIANGLES, etc. In addition, each primitive and its corresponding vertices are clipped (for a perspective projection, the projection matrix has to be taken into account; the key is the value of w), followed by perspective division and the viewport transformation. There are a lot of details here, which I will explain in future articles; a sketch of the last two steps follows.
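Perspective division and the viewport transformation are performed by fixed-function hardware, but the math can be sketched in a few lines (the viewport and depth-range parameters mirror glViewport and glDepthRange):

        /* Clip space -> normalized device coordinates -> window coordinates.
           A sketch of the fixed-function math that follows the vertex shader. */
        typedef struct { float x, y, z, w; } Vec4;

        void clip_to_window(Vec4 clip,
                            float vx, float vy, float vw, float vh,  /* glViewport */
                            float depth_near, float depth_far,       /* glDepthRange */
                            float *win_x, float *win_y, float *win_z)
        {
            /* Perspective division: divide by w to reach NDC in [-1, 1]. */
            float ndc_x = clip.x / clip.w;
            float ndc_y = clip.y / clip.w;
            float ndc_z = clip.z / clip.w;

            /* Viewport transformation: NDC -> window (pixel) coordinates. */
            *win_x = vx + (ndc_x + 1.0f) * 0.5f * vw;
            *win_y = vy + (ndc_y + 1.0f) * 0.5f * vh;
            *win_z = depth_near + (ndc_z + 1.0f) * 0.5f * (depth_far - depth_near);
        }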

      4. Rasterize primitives. After vertex transformation and primitive clipping, the rasterizer takes individual primitives (such as triangles, line segments, and points) and generates the corresponding fragments for each primitive. Each fragment is usually mapped directly to one pixel on the screen, so a fragment corresponds to a pixel. However, this is not always the case: on ultra-high-resolution devices, larger fragments, each covering several pixels, may be used to reduce the workload on the GPU.

     5. Execute the fragment shader. The main task at this stage is to determine a fragment's color and depth values. This breaks down into the following cases (a shader sketch follows the list):

           1. Texture color. Fetch the corresponding color from the sampler using the texture coordinates.

           2. Fog color. Computed from the fragment's position and the current viewpoint position.

           3. Color sum. This is a completely different concept from blending: the texture color, the primary (per-vertex) color, the fog color, and the color computed in the lighting stage are combined together.

         In addition to the three points above, you can perform two operations here that are not about setting the fragment's color: the scissor test (glScissor) and the alpha test (glAlphaFunc).
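A minimal fragment shader covering the texture-color case might look like this, again as a C string (the sampler and varying names are illustrative; the fog and color-sum cases would add further terms to the output color):

        /* Minimal GLSL fragment shader as a C string: determine the
           fragment's color by sampling a texture. Names are illustrative. */
        static const char *fragment_shader_src =
            "precision mediump float;\n"
            "uniform sampler2D u_texture;\n"
            "varying vec2 v_texCoord;\n"
            "void main() {\n"
            "    /* Texture color: fetch from the sampler via the coords. */\n"
            "    gl_FragColor = texture2D(u_texture, v_texCoord);\n"
            "}\n";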

      6. Per-fragment operations. This stage occurs after the fragment shader, whose output is the fragment's color and depth values. A series of operations is then performed fragment by fragment before the result is applied to the framebuffer (a setup sketch follows the list). The operations are as follows:

           1. Scissor test (specifies that only pixels inside a rectangular area may be modified)

           2. Stencil test

           3. Depth test

           4. Multisampling

           5. Blending

           6. Dithering
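These stages live in global GL state rather than in the shader; a sketch of how they are typically switched on (the specific values are illustrative):

        /* Typical per-fragment operation setup; values are illustrative. */
        void setup_per_fragment_ops(void)
        {
            glScissor(0, 0, 256, 256);         /* scissor rectangle */
            glEnable(GL_SCISSOR_TEST);

            glEnable(GL_STENCIL_TEST);
            glStencilFunc(GL_EQUAL, 1, 0xFF);  /* pass where stencil == 1 */

            glEnable(GL_DEPTH_TEST);
            glDepthFunc(GL_LESS);              /* keep the nearest fragment */

            glEnable(GL_BLEND);                /* standard "over" alpha blending */
            glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

            glEnable(GL_DITHER);               /* dithering is enabled by default */
        }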

    7. Framebuffer operations. Before anything is actually rendered to the screen, the rendered content is first written into a buffer. The system provides a default framebuffer, but some special effects cannot be achieved with the default buffer alone. In that case we create another framebuffer, draw into a texture attached to it, and then draw that texture into the default framebuffer, whose contents are what get displayed on the screen.
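A minimal render-to-texture setup along these lines might look as follows (the size, format, and the absence of a depth attachment are simplifications):

        /* Create a framebuffer whose color attachment is a texture, so the
           scene can be rendered into the texture and later drawn into the
           default framebuffer. Error checking omitted for brevity. */
        GLuint create_texture_framebuffer(int width, int height, GLuint *out_tex)
        {
            GLuint fbo, tex;

            glGenTextures(1, &tex);
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, NULL);  /* empty storage */
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

            glGenFramebuffers(1, &fbo);
            glBindFramebuffer(GL_FRAMEBUFFER, fbo);
            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                   GL_TEXTURE_2D, tex, 0);

            /* Draw the scene while this FBO is bound, then bind framebuffer 0
               (the default framebuffer) and draw a quad sampling 'tex'. */
            glBindFramebuffer(GL_FRAMEBUFFER, 0);

            *out_tex = tex;
            return fbo;
        }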
