- 3ds Max exports a model file containing vertex coordinates, normals, edge relations, etc. (Model)
- A camera exists in the scene; through the view transformation it can be placed at the origin (View)
- The model in three-dimensional space is transformed into its projection on the screen (Projection)
- Rasterization is performed by sampling (Rasterization)
The next step is shading.
1 Shading: Definition
The process of introducing light, shade, and different colors is called shading.
The definition in this lesson: shading is the process of applying materials to objects.

2 Blinn-Phong Reflectance Model
A simple shading model.
Regarding the Blinn-Phong shading model,
I previously explained its concept in my LearnOpenGL study notes - Lighting 02: Lighting Basics/Advanced Lighting.
There I also used Teacher Yan's PPT and video as explanation material, and made a simple implementation (using a point light as an example),
so I won't repeat it here.

3 Shading Frequencies
3.1 Flat Shading
- Shading is done once per triangle (using its face normal), so the whole face gets one color
3.2 Vertex Shading
Shading is done once per vertex, and the color values are interpolated across the triangle
3.3 Pixel Shading
The "Phong Shading" here refers to a shading frequency; the Blinn-Phong mentioned earlier is a shading model.
Normals are computed at the triangle's three vertices, interpolated for the pixels inside the triangle, and shading is then done once per pixel.

3.4 Comparison
- When the geometry is detailed enough, even a simple shading frequency can achieve good results
3.5 Computing vertex normals
- A vertex normal is the (area-weighted) average of the normals of its adjacent faces
- Normals inside a triangle transition smoothly via barycentric interpolation
- Remember to normalize the interpolated normals
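As a rough sketch of this idea (my own illustration, not code from the course), vertex normals can be computed by summing the un-normalized face normals around each vertex, which naturally weights each face by its area, and normalizing at the end:

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def normalize(v):
    l = math.sqrt(v[0]**2 + v[1]**2 + v[2]**2)
    return (v[0]/l, v[1]/l, v[2]/l)

def vertex_normals(vertices, triangles):
    """Average the face normals around each vertex, then normalize."""
    acc = [[0.0, 0.0, 0.0] for _ in vertices]
    for i, j, k in triangles:
        # Cross product of two edges: direction = face normal, length = 2 * area,
        # so summing the un-normalized cross products gives an area-weighted average.
        n = cross(sub(vertices[j], vertices[i]), sub(vertices[k], vertices[i]))
        for idx in (i, j, k):
            for c in range(3):
                acc[idx][c] += n[c]
    return [normalize(tuple(a)) for a in acc]
```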
4 Graphics (Real-time Rendering) Pipeline
With the knowledge learned so far, we are ready to learn graphics APIs (such as OpenGL and DirectX).
Recommended here: ShaderToy, a website where we only need to focus on how to write shaders.
The PPT also shows the current real-time rendering pipeline.
5 Texture Mapping
Regarding the content of texture mapping,
I have explained it in LearnOpenGL study notes - Getting Started 05: Texture.
- The schematic diagram in the PPT represents this mapping process well
- The texture itself can be designed to connect seamlessly (tileable); one method for designing such textures is Wang Tiling
6 Barycentric Coordinates
- The coefficients can be determined using areas
- The coefficients can be determined using coordinates
- Barycentric coordinates may change under projection, so interpolation must use the barycentric coordinates computed at the corresponding stage; they cannot be reused arbitrarily
- They are used to interpolate vertex attributes across the triangle
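As a minimal sketch of the area-based formulation (my own illustration, using 2D points for simplicity), each coefficient is the signed area of the sub-triangle opposite the corresponding vertex, divided by the total area:

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates of 2D point p w.r.t. triangle (a, b, c), via signed areas."""
    def cross2(o, u, v):
        # Twice the signed area of triangle (o, u, v)
        return (u[0]-o[0]) * (v[1]-o[1]) - (u[1]-o[1]) * (v[0]-o[0])
    total = cross2(a, b, c)
    alpha = cross2(p, b, c) / total  # area of the sub-triangle opposite vertex a
    beta  = cross2(p, c, a) / total
    gamma = cross2(p, a, b) / total
    return alpha, beta, gamma

def interpolate(bary, va, vb, vc):
    """Interpolate a scalar vertex attribute with barycentric weights."""
    alpha, beta, gamma = bary
    return alpha * va + beta * vb + gamma * vc
```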
7 Applying Textures
- For each point, the interpolated UV coordinates are computed, the texture is queried at those coordinates, and the result is applied as needed (in the simple application in the figure, the value is assigned directly)
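A minimal sketch of the lookup step (conventions such as the v-axis origin and the [0, 1] UV range are my own simplifying assumptions; real APIs differ in these details):

```python
def sample_nearest(texture, u, v):
    """Nearest-neighbor texture lookup.
    texture is a 2D list indexed [row][col]; u, v are in [0, 1],
    with v = 0 at the first row (a simplifying assumption)."""
    h, w = len(texture), len(texture[0])
    x = min(int(u * w), w - 1)  # clamp so u = 1.0 stays in range
    y = min(int(v * h), h - 1)
    return texture[y][x]
```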
8 Texture Magnification
The texture resolution is too low, so multiple pixels map to the same texel (texture pixel)
- Workaround → interpolation
8.1 Bilinear Interpolation
Bilinear interpolation: horizontal interpolation plus vertical interpolation.
lerp is short for linear interpolation.
The figure below shows interpolation over the nearest four points:
- Find the four texels surrounding the target point
- Take the horizontal and vertical distances of the sample point (red) from the lower-left of the four texels
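The two horizontal lerps followed by one vertical lerp can be sketched as follows (texel-center and border conventions are simplifying assumptions of mine):

```python
def lerp(t, a, b):
    """Linear interpolation between a and b."""
    return a + t * (b - a)

def sample_bilinear(texture, u, v):
    """Bilinear texture lookup over a 2D list indexed [row][col].
    Assumes texel centers at integer coordinates and clamps at the border."""
    h, w = len(texture), len(texture[0])
    x = min(max(u * (w - 1), 0.0), w - 1.0)
    y = min(max(v * (h - 1), 0.0), h - 1.0)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    s, t = x - x0, y - y0
    # Two horizontal lerps, then one vertical lerp
    top = lerp(s, texture[y0][x0], texture[y0][x1])
    bottom = lerp(s, texture[y1][x0], texture[y1][x1])
    return lerp(t, top, bottom)
```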
8.2 Bicubic Interpolation
Bicubic interpolation takes the 16 surrounding texels and likewise interpolates horizontally and vertically,
but each step uses 4 points with cubic interpolation instead of linear interpolation.
Bicubic interpolation requires much more computation, but the result is better.
For more on this method, you can refer to an article introducing the bicubic interpolation algorithm and how it is computed.
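As an illustration, here is one common cubic kernel (Catmull-Rom; this specific kernel choice is my own assumption, not something the course mandates), applied row-wise and then column-wise over a 4×4 neighborhood:

```python
def cubic_interp(t, p0, p1, p2, p3):
    """Catmull-Rom cubic through 4 evenly spaced samples.
    t in [0, 1] interpolates between p1 and p2."""
    return p1 + 0.5 * t * (p2 - p0
                           + t * (2*p0 - 5*p1 + 4*p2 - p3
                                  + t * (3*(p1 - p2) + p3 - p0)))

def bicubic(patch, s, t):
    """Bicubic sampling: a cubic interpolation of 4 row-wise cubic
    interpolations over a 4x4 patch (list of 4 rows of 4 values)."""
    rows = [cubic_interp(s, *patch[i]) for i in range(4)]
    return cubic_interp(t, *rows)
```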
9 Problems with oversized textures
- When the texture is too large, one pixel corresponds to many texels → insufficient sampling frequency leads to moiré patterns and jaggies (aliasing)
A pixel on the far part of the ground plane covers a large block of texture; simple point sampling does not work.
Solutions:
- Supersampling: increase the sampling frequency, but too wasteful
- Since sampling causes aliasing, don't sample at all; instead take the average value over a range
This is the question of point query vs. range query:
- Point query: given a point, get the value at that point
- Range query: without sampling, given a region, get the (average) value over that region
9.1 Mipmap
Mipmap: allows fast range queries, but they are approximate and limited to square regions.
Regarding Mipmap, I explained it
in LearnOpenGL study notes - Getting Started 05: Texture, in the sections on texture wrapping/texture filtering.
- The following is the schematic diagram in the PPT
Schematic of computing the Mipmap level:
First, we need to find the texture area covered by a pixel and approximate it:
- When considering a pixel, also consider its neighboring pixels,
map their centers into UV space, and compute the distance between them;
- the irregular quadrilateral can then be approximated by a dotted square whose side length is that distance.
The level D is the base-2 logarithm of that side length L: D = log2 L.
For example:
- If the size of the area is 1×1 (D = 0), look it up on the original level-0 Mipmap.
- If the size of the area is 4×4 (D = 2), look it up on level 2: on level 2 this 4×4 region becomes 1×1, so querying that one texel on level 2 gives the average value of the region.
Simply snapping to integer levels produces visible discontinuities between layers,
so for a non-integer level we query the two adjacent layers, perform bilinear interpolation in each, and then interpolate once more between them;
this is trilinear interpolation, which gives the final result.
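The level computation and the lerp between levels can be sketched as follows (`value_at_level` is a placeholder of my own that stands in for a full bilinear lookup at one Mipmap level):

```python
import math

def mipmap_level(L):
    """D = log2(L), where L is the side length (in texels) of the square
    approximating the pixel's footprint in texture space.
    Footprints smaller than one texel clamp to level 0 (magnification case)."""
    return max(0.0, math.log2(L))

def trilinear(value_at_level, L):
    """Query the two integer levels around D and lerp between them.
    value_at_level(d) stands in for a bilinear lookup at level d;
    clamping at the top level is omitted for brevity."""
    D = mipmap_level(L)
    d0 = int(D)
    t = D - d0
    return (1 - t) * value_at_level(d0) + t * value_at_level(d0 + 1)
```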
9.2 Anisotropic Filtering
Mipmap blurs details in the distance,
because it can only query square regions, and the interpolation is only an approximation after all.
The solution to the shortcomings of trilinear interpolation is anisotropic filtering.
The square approximation is sometimes too forced: after texture mapping, a pixel's footprint in the figure below can be an elongated, irregular shape.
A Mipmap only stores square reductions (the images along the diagonal, starting from the upper-left of the figure) and cannot handle non-square compression. Anisotropic filtering additionally stores the non-square compressions (Ripmaps), allowing range queries on elongated axis-aligned regions, but still not on oblique regions.
The storage overhead of the anisotropically filtered images (Ripmaps) is three times that of the original.
"Anisotropic" means that behavior differs in different directions.
The "X" in X-times anisotropic filtering is the number of compression levels, i.e., how many layers are added from the upper-left corner toward the lower-right corner.
EWA filtering splits any irregular shape into many different circles that cover it;
multiple queries can naturally cover the shape, but this takes a lot of time.

10 Applications of Textures
From the previous content: given a mesh, we can perform various kinds of shading.
Next, on the basis of shading, we can apply various textures.
There are many types of texture maps.
10.1 Environment Map
Record the incoming light from all directions. Assuming the environment light comes from infinitely far away, only its direction is recorded.
That is to say, a texture can be used to represent the environment lighting.
Spherical Environment Map
- Can be compared to a globe
- Has stretching and distortion (near the poles)
Cube Map
- Records the environment light on the six faces of a cube
- Also has a problem: with a sphere it was easy to get the light in a given direction, but now we must determine which face of the cube that direction is recorded on

10.2 Bump Mapping
Bump mapping stores height offsets in a texture without changing the geometric information.
The texture is used to fabricate fake normals, thereby obtaining fake shading and producing a bumpy effect.
Through user-defined height differences, the normal direction is perturbed pixel by pixel, and the shading result changes accordingly.
Computing the normal direction: find the tangent first, then take the perpendicular direction.
In the UV case, the perturbed normal is computed from the height differences along u and v.
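A sketch of the finite-difference normal in the UV case, following the slides' form n = normalize((-c1·dh/du, -c2·dh/dv, 1)) in tangent space (integer texel indexing and the constants c1, c2 as plain parameters are simplifications of my own):

```python
import math

def perturbed_normal(height, u, v, c1=1.0, c2=1.0):
    """Tangent-space normal from a height map via forward differences.
    height is a 2D list indexed [v][u]; u, v are integer texel indices here
    for simplicity (a real implementation samples fractional UVs)."""
    dhdu = c1 * (height[v][u + 1] - height[v][u])
    dhdv = c2 * (height[v + 1][u] - height[v][u])
    n = (-dhdu, -dhdv, 1.0)
    l = math.sqrt(n[0]**2 + n[1]**2 + n[2]**2)
    return (n[0]/l, n[1]/l, n[2]/l)
```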
10.3 Displacement Mapping
Displacement mapping takes the same input as bump mapping, but it really changes the geometric information by displacing the vertices.
In comparison it is more realistic, because bump mapping gives itself away at silhouettes and boundaries.
Displacement mapping requires the model's triangles to be fine enough, so the amount of computation is higher.
DirectX has dynamic tessellation, which subdivides the model as needed, making it more detailed depending on the situation.

10.4 Procedural Textures
A three-dimensional texture defines a value at any point in space.
For this kind of texture, there is no actual image;
instead, a noise function over 3D space is defined, which after various processing can become whatever is needed.
10.5 Precomputed Shading
Trade space for time: precompute an ambient occlusion map in advance, and then apply it as a texture.
10.6 Solid Modeling & Volume Rendering
Three-dimensional textures are widely used in volume rendering.
For example, MRI scans produce volumetric information, which is then rendered to obtain the result.