Unity Shader: Render Queue

Rendering Introduction

In the rendering phase, the engine's job is to render every object in the scene according to some strategy (an ordering). The earliest strategy is the painter's algorithm: as the name suggests, like a painter you draw the objects in the back first, and if an object lies in front, you paint it over the objects behind it. The problem is that this sorts whole objects, and objects can interpenetrate, so the results are often wrong. The more common approach today is the z-buffer algorithm. Similar to the color buffer that stores colors, the z-buffer stores a depth value for each pixel, so visibility is resolved per pixel rather than per object, guaranteeing that the occlusion relationships we draw are correct. The z-buffer is controlled through ZTest and ZWrite. Sometimes, however, we need finer control over the rendering order of different kinds of objects, and that is what the render queue is for. This post looks at the basic use of the render queue, ZTest, and ZWrite, and at an optimization Unity makes around Early-Z.
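In ShaderLab, both states are set per pass. As a minimal sketch (these are also the defaults an ordinary opaque pass gets, so writing them out is purely illustrative):

```shaderlab
Pass
{
    ZWrite On    // write this fragment's depth into the z-buffer
    ZTest LEqual // keep the fragment only if it is at least as close as the stored depth
    // ... CGPROGRAM block as usual ...
}
```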

Unity's built-in render queues

First, here are Unity's built-in render queues, listed in rendering order: the smaller the queue number, the earlier it is rendered; the larger the number, the later.

Background (1000): the first queue to be rendered; used for backgrounds such as skyboxes.
Geometry (2000): the queue for opaque objects. Most objects should use this queue; it is the default for Unity shaders.
AlphaTest (2450): for objects with an alpha-tested transparency channel; rendering them here is more efficient than in Geometry.
Transparent (3000): the queue for semi-transparent objects. Objects here generally do not write depth; alpha-blended objects and the like are rendered in this queue.
Overlay (4000): the last queue to be rendered; used for overlay effects such as lens flares and full-screen quads.

Setting the render queue in Unity is also very simple: we don't need to create anything manually or write any scripts, only to add a tag in the shader. If no tag is given, the default render queue, Geometry, is used. For example, if we want our object rendered in the Transparent queue, we can write:
Tags { "Queue" = "Transparent" }
We can then see the shader's render queue directly in the shader's Inspector window:
[Figure: the shader's render queue shown in the Inspector]
In addition, when writing shaders we often set a tag called RenderType. It is less commonly used than the render queue, but worth recording here:

Opaque: most shaders (normal, self-illuminated, reflective, and terrain shaders).
Transparent: semi-transparent shaders (transparent, particle, font, and terrain additive-pass shaders).
TransparentCutout: masked transparency shaders (Transparent Cutout, two-pass vegetation shaders).
Background: skybox shaders.
Overlay: shaders used for GUITexture, lens flares, screen flashes, and similar effects.
TreeOpaque: terrain engine tree bark.
TreeTransparentCutout: terrain engine tree leaves.
TreeBillboard: terrain engine billboarded trees.
Grass: terrain engine grass.
GrassBillboard: terrain engine billboarded grass.
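As a sketch of how RenderType and Queue typically pair up, here is what a minimal alpha-tested (cutout) fragment might look like; `_MainTex`, `_Cutoff`, and the `v2f` layout are illustrative assumptions, not from the original post:

```shaderlab
SubShader
{
    Tags { "RenderType" = "TransparentCutout" "Queue" = "AlphaTest" }
    Pass
    {
        CGPROGRAM
        // ... vertex shader and declarations as usual ...
        fixed4 frag(v2f i) : SV_Target
        {
            fixed4 col = tex2D(_MainTex, i.uv);
            clip(col.a - _Cutoff); // discard fragments below the alpha cutoff
            return col;
        }
        ENDCG
    }
}
```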

Rendering order of opaque objects in the same render queue

In Unity, create three cubes, all using the default Bumped Diffuse shader (so they share the same render queue), and give them three different materials (objects with few vertices that share the same material may be dynamically batched by the engine). Then use Unity's Frame Debugger to inspect the draw calls.
[Figure: Frame Debugger showing the draw order of the three opaque cubes]
As we can see, Unity renders opaque objects front to back. This way, after an opaque object's vertex stage and Z test, we already know whether it will ultimately be visible on screen: if objects rendered earlier have already written depth and the depth test fails, the fragment stage for the occluded object behind them is skipped entirely. (Note that the three objects need to be spaced somewhat apart. In my tests, when they are very close together the rendering order becomes erratic, and since we don't know what criterion Unity uses internally to decide which object is closer to the camera, I won't guess at it here.)

Rendering order of semi-transparent objects in the same render queue

Transparent objects have always been an awkward corner of graphics. They cannot be rendered with the same shortcut as opaque objects, because transparent objects do not write depth, which means the overlap relationships between transparent objects cannot be resolved by the z-buffer. For that reason semi-transparent objects are generally rendered back to front. And because they do not write depth, there is no depth-test optimization to cull hidden transparent objects: every transparent object goes through the fragment stage for every pixel it covers, which causes a great deal of overdraw. This is a big part of why particle effects are so expensive.
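A typical semi-transparent pass therefore disables depth writing while keeping the depth test, and blends over whatever is already in the color buffer; a minimal sketch:

```shaderlab
SubShader
{
    Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }
    Pass
    {
        ZWrite Off                      // transparent objects do not write depth
        ZTest LEqual                    // but they are still occluded by opaque geometry
        Blend SrcAlpha OneMinusSrcAlpha // standard alpha blending
        // ... CGPROGRAM block as usual ...
    }
}
```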

Let's experiment with the order in which Unity renders semi-transparent objects. Take the same three cubes as above, but switch their materials to the shader particles use most often, a Particles/Additive shader, and look at the rendering order in the Frame Debugger:
[Figure: Frame Debugger showing the draw order of the three additive-blended cubes]

Custom Render Queue

Unity lets us define custom render queues. For example, if we need to guarantee that objects of one type are rendered only after objects of other types, we can render them in a custom queue. It's also extremely convenient: we only need to change the queue tag when writing the shader. Say we want our object to be rendered after all of the default opaque objects have finished; then we can use
Tags { "Queue" = "Geometry+1" }
and any object using this shader will be rendered in that queue.
Take the same three cubes as above, but this time give them three different shaders with different render queues. From the experiment above we know that by default opaque objects render in the Geometry queue, so the three opaque cubes would be rendered in the order cube1, cube2, cube3. This time we want to reverse that order, so we give cube1 the largest queue value and cube3 the smallest. Here is one of the shaders:

Shader "Custom/RenderQueue1" {
 
	SubShader
	{
		Tags { "RenderType"="Opaque" "Queue" = "Geometry+1"}
	
		Pass
		{
			CGPROGRAM
			#pragma vertex vert
			#pragma fragment frag
			#include "UnityCG.cginc"
			struct v2f
			{
				float4 pos : SV_POSITION;
			};
 
			v2f vert(appdata_base v)
			{
				v2f o;
				o.pos = UnityObjectToClipPos(v.vertex); // mul(UNITY_MATRIX_MVP, v.vertex) in older Unity versions
				return o;
			}
 
			fixed4 frag(v2f i) : SV_Target
			{
				return fixed4(0,0,1,1);
			}
			ENDCG
		}
	}
	//FallBack "Diffuse"
}

The shader I built here with ASE (Amplify Shader Editor) is equivalent to the one above.

The other two shaders are similar; only the render queue and the output color differ.

[Figure: Frame Debugger showing the reversed draw order of the three cubes]
Through the render queue, we can freely control when objects using a given shader are rendered. For example, if some opaque object has an expensive fragment stage, we can push its render queue later, so that the depth written by the other opaque objects culls some of the pixels that object would otherwise shade.
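For example, an expensive opaque shader could declare a later queue so that depth written by the cheaper opaque objects rejects its hidden fragments first (the +10 offset is an arbitrary illustrative value):

```shaderlab
SubShader
{
    // Still opaque and depth-tested, but drawn after the default Geometry
    // queue, so depth already written by other opaque objects can
    // reject this shader's occluded fragments before the fragment stage.
    Tags { "RenderType" = "Opaque" "Queue" = "Geometry+10" }
    // ... passes as usual ...
}
```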

Note: when we modify a shader we can usually see the change immediately without any extra steps. After changing the render queue, however, I sometimes saw the new queue value in the shader file yet no change in the rendered result or in the Frame Debugger, and restarting Unity did not help; the change only took effect after I reassigned the shader to the material.


Origin blog.csdn.net/qq_42194657/article/details/104227728