When Unity is used for process simulation or other scenarios with high demands on model rendering, the edges of models often appear jagged. So where does this aliasing come from?
The root cause is that modern display screens are composed of pixel arrays. Shapes in the real world are continuous, but a screen can only represent them with discrete pixels. A diagonal line, for example, is drawn as a staircase of pixels, which looks jagged.
The following are some mainstream solutions to the anti-aliasing problem:

Solutions
1. Unity's Quality settings
The easiest way is to open the Edit menu, go to Project Settings -> Quality, find Anti Aliasing in the panel, and choose 4x Multi Sampling or higher.
After setting this property, aliasing is reduced, but the performance overhead increases accordingly.
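The same setting can also be changed from script at runtime via `QualitySettings.antiAliasing`, whose valid values are 0 (disabled), 2, 4, and 8 samples. A minimal sketch (the class name `MsaaConfig` is just an illustrative choice):

```csharp
using UnityEngine;

// Attach to any GameObject; applies the chosen MSAA level at startup.
// Note: MSAA only takes effect when the camera uses the Forward rendering path.
public class MsaaConfig : MonoBehaviour
{
    [Range(0, 8)]
    public int msaaSamples = 4; // 0 disables MSAA; 2, 4, and 8 are the valid sample counts

    void Start()
    {
        QualitySettings.antiAliasing = msaaSamples;
    }
}
```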
2. Change the camera's rendering path
Attach an anti-aliasing post-processing script to the camera, and set the camera's Rendering Path to Forward, because deferred rendering (Deferred Rendering) in Unity does not support multi-sample anti-aliasing (MSAA). MSAA also places hardware requirements on the graphics card.
After importing the package, an extra folder appears under Assets.
Find this script and attach it to the camera.
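The rendering-path switch can also be done from code instead of the Inspector. A minimal sketch, assuming the script is attached to the camera itself (the class name `ForceForwardPath` is a placeholder, not part of any imported package):

```csharp
using UnityEngine;

// Attach to the Camera. Forces the forward rendering path so MSAA can take effect;
// deferred rendering in Unity's built-in pipeline does not support MSAA.
[RequireComponent(typeof(Camera))]
public class ForceForwardPath : MonoBehaviour
{
    void Awake()
    {
        GetComponent<Camera>().renderingPath = RenderingPath.Forward;
    }
}
```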
3. Reduce overlapping and interpenetrating surfaces in the model
This method has to be done by an artist, so I won't cover it in detail.
4. Change the shader rendering method
I haven't tried this one yet; I'll add details later...