"Introduction to Unity Shader Essentials" Reading Notes 1

Most of the illustrations are from the book

Rendering pipeline

[Figure: the three conceptual stages of the rendering pipeline]

  • Application stage: runs on the CPU; does the preparation work, such as setting the rendering state, which tells the GPU how the upcoming geometry should be rendered
  • Geometry stage: runs on the GPU; decides which primitives need to be drawn and how to draw them
  • Rasterization stage: runs on the GPU; produces the pixels that are finally displayed on the screen

Application stage

The application stage is the starting point of the rendering pipeline. It is mainly divided into three sub-stages:

  1. Load the data into video memory (VRAM)
  2. Set the rendering state
  3. Issue the Draw Call

Data loading path: hard disk (HDD) -> system memory (RAM) -> video memory (VRAM)
Data loaded: vertex positions, normal directions, vertex colors, texture coordinates, and so on

Set the rendering state: define how the mesh is rendered, including which vertex shader/fragment shader to use, light source properties, materials, etc.

After doing the above preparations, the CPU needs to call a rendering command to tell the GPU to render

Draw Call: a command initiated by the CPU and received by the GPU. The command itself only contains a pointer to the list of primitives that need to be rendered; material information and other rendering details have already been set up through the rendering state.

After receiving the Draw Call, the GPU performs calculations based on the rendering state and the input vertex data, and finally generates pixels on the screen. This process is the GPU pipeline

Geometry stage

In the two stages after the application stage, namely the geometry stage and the rasterization stage, the developer does not have absolute control; however, the GPU opens up a considerable amount of control to the developer.

[Figure: GPU pipeline stages and their configurability]
The colors in the figure indicate the configurability or programmability of the different stages: green means the stage is fully programmable, yellow means the stage is configurable but not programmable, and blue means the stage has a fixed implementation controlled by the GPU, over which developers have no control. A solid line indicates a shader that must be programmed by the developer; a dotted line indicates an optional shader.

For homogeneous clipping space, refer to this article: Computer Graphics Supplement 2: Homogeneous Space Clipping

[Figure]

Normalized Device Coordinates (NDC)

Note: the z component of NDC in OpenGL (and Unity) ranges over [-1, 1], while the z component of NDC in DirectX ranges over [0, 1].

Clipping

There are three cases of clipping:

  • completely out of view
  • completely in view
  • partly in view

For example, take a line segment consisting of two vertices: if one vertex is outside the view volume and the other is inside, the outside vertex should be replaced by a new vertex at the intersection of the line and the view-volume boundary.

[Figure: clipping a primitive against the view volume]

Screen mapping

Convert the x and y (not including z) of each primitive to the screen coordinate system.

The screen coordinates together with the z coordinate make up the window coordinate system, and these values are passed on to the rasterization stage.
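As a quick sketch of the arithmetic behind this mapping (assuming the viewport starts at pixel (0, 0) and covers the whole screen of resolution pixelWidth x pixelHeight):

screen_x = pixelWidth  * (ndc_x + 1) / 2
screen_y = pixelHeight * (ndc_y + 1) / 2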

[Figure: screen mapping]

Rasterization stage

Two most important goals:

  • Calculate which pixels are covered by the primitive
  • Calculate the color for these pixels

Triangle traversal

This stage checks whether each pixel is covered by a triangle mesh and, if so, generates a fragment. The process of finding which pixels are covered by the triangle mesh is triangle traversal; this stage is also called scan conversion.

Fragment shader

Also known as a pixel shader in DirectX.

Its most important task is texture sampling.

Its limitation is that it can only affect a single fragment; the exception is that fragment shaders can access derivative information.
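As a hedged illustration of that exception (not from the book), derivative instructions in a Cg/HLSL fragment shader look roughly like this; the v2f struct with a float2 uv field is assumed, as in a typical Unity shader:

fixed4 frag (v2f i) : SV_Target {
    // Screen-space rate of change of the UVs between neighbouring fragments
    float2 du = ddx(i.uv);
    float2 dv = ddy(i.uv);
    // How fast the UVs change per pixel; often used for anti-aliased edges (see also fwidth)
    float  width = length(du) + length(dv);
    return fixed4(width, width, width, 1.0);
}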

Per-fragment operations

This is the last step in the rendering pipeline and is highly configurable.

[Figure: the per-fragment operations stage]

The main work of the per-fragment operations stage:

  • Determine the visibility of each fragment, which involves a lot of testing work, such as the depth test and the stencil test
  • If a fragment passes all the tests, its color value needs to be merged, or blended, with the color already stored in the color buffer

The following figure shows the flow of the two tests

[Figure: stencil test and depth test flow]

The per-fragment operations may discard a fragment because it fails a test; in that case all the work done for that fragment is wasted.

Advanced usage of stencil testing: rendering shadows, outline rendering

Depth test: uses the Z-buffer to decide visibility. The technique of performing the depth test before the fragment shader runs is called Early-Z.

Opaque objects do not require blending, but translucent objects do. Blending is very similar to layer operations in Photoshop, such as the Multiply blend mode.
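As a rough ShaderLab sketch (not from the book), a typical alpha-blending setup for a translucent object looks like this:

Pass {
    ZWrite Off                        // translucent objects usually do not write depth
    Blend SrcAlpha OneMinusSrcAlpha   // result = src.a * src + (1 - src.a) * dst
    // ... shader code ...
}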

[Figure: blending]

Testing as early as possible (when feasible) avoids spending time on fragments that will be discarded anyway, so the fragment shader never has to compute their colors, saving resources. In short: "know which fragments will be discarded as early as possible".

A double-buffering strategy is used so that we never see primitives while they are still being rasterized.

About OpenGL and DirectX

Both are graphics application programming interfaces (APIs); their commands must be translated by the graphics card driver into a language the GPU can understand. The driver is also responsible for converting data such as textures into formats supported by the GPU; it can be thought of as the operating system of the GPU.

The whole flow is: the application issues rendering commands to the API, the API passes them to the graphics card driver, and the driver translates them for the GPU.

First transfer the configured data to the video memory, and then call Draw Call.

[Figure: from the application through the API and driver to the GPU]

Shader languages

DirectX uses HLSL (High Level Shading Language)
OpenGL uses GLSL (OpenGL Shading Language)
NVIDIA uses Cg (C for Graphics)

These languages are compiled into an assembly-like intermediate language (IL), which the graphics card driver then translates into the real machine code that the GPU understands.

The advantage of GLSL is its cross-platform nature, but since OpenGL is only a specification whose implementations differ, the compilation result of GLSL depends on the hardware vendor.

HLSL is controlled by Microsoft: even on different hardware, the same shader compiles to the same result (provided the same version is used).

Cg code can be ported to HLSL almost seamlessly. The disadvantage is that it may not be able to take full advantage of the latest features of OpenGL.

In short:

  • GLSL is cross-platform but depends on hardware implementation
  • HLSL does not rely on hardware implementation, but most of the platforms that support it are Microsoft's own products
  • Cg is truly cross-platform

Draw Call

  • The glDrawElements command in OpenGL
  • DrawIndexedPrimitive command in DirectX

A common misunderstanding is that the GPU is the culprit behind Draw Call performance problems, the assumption being that state switching on the GPU is time-consuming. That is not the case: the real bottleneck is the CPU.

Submitting a Draw Call requires a lot of preparatory work on the CPU, such as checking the rendering state. Only once the CPU has finished these preparations can the GPU start rendering. If there are too many Draw Calls, the CPU spends a large amount of time submitting them and becomes overloaded: 10 Draw Calls need 10 rounds of preparation, whereas merging them into a single Draw Call needs only one (the total amount of geometry processed stays the same).

So one way to reduce Draw Calls is to merge them. This is a form of batching and is best suited to static objects.

Merging meshes is one of the steps, provided that these meshes share the same rendering state.

In the game development process, in order to reduce the overhead of Draw Call, there are two points to note:

  1. Avoid using large numbers of very small meshes. When very small meshes are unavoidable, consider whether they can be merged.
  2. Avoid using too many materials. Try to share the same material between different meshes.

Parallelism of CPU and GPU

A command buffer is used to avoid stalls caused by the speed difference between the CPU and the GPU; the same idea appears in computer organization.

[Figure: the command buffer between the CPU and the GPU]
The command buffer contains many kinds of commands; Draw Call is one of them. Other commands change the rendering state (for example, switching the shader in use or binding different textures).

About the fixed-function pipeline

The fixed-function pipeline, also referred to simply as the fixed pipeline, usually refers to the rendering pipeline implemented on older GPUs. This kind of pipeline only offers developers some configuration operations; developers do not have full control over the pipeline stages.

A vivid metaphor is that when we use a fixed pipeline for rendering, it is like controlling multiple switches on the circuit. We can choose to turn on or off a switch, but we can never control the layout of the entire circuit.

3D API | Last version supporting the fixed-function pipeline | First version supporting the programmable pipeline
OpenGL | 1.5 | 2.0
OpenGL ES | 1.1 | 2.0
DirectX | 7.0 | 8.0

More details

Book: Real-Time Rendering

[Figure]

OpenGL: Rendering Pipeline Overview

DirectX: [MSDN] Graphics Pipeline

Unity Shader

Unity Shader is essentially a text file

The stages where shaders operate are part of the rendering pipeline; shaders are a highly programmable feature of it. By relying on shaders we can control the rendering details in the pipeline.

Unity, as an excellent editing tool, provides us with a place where we can easily write shaders and set the rendering state at the same time: Unity Shader

Unity Shader is very different from the Shader of the rendering pipeline mentioned above

Materials and Shaders complement each other, and materials should be used in combination with Shaders. One of the most common processes is:

  1. Create a material;
  2. Create a Unity Shader and assign it to the material created in the previous step;
  3. Assign the material to the object to be rendered;
  4. Adjust the properties of Unity Shader in the material panel to get satisfactory results.

[Figure: the relationship between material and Unity Shader]

Unity provides a total of 4 Unity Shader templates for us to choose from:

  • Standard Surface Shader will generate a surface shader template that includes the standard lighting model (using the newly added physically-based rendering method in Unity 5)
  • Unlit Shader will produce a basic vertex/fragment shader that does not include lighting (but includes fog effects)
  • Image Effect Shader provides a basic template for us to achieve various screen post-processing effects
  • Compute Shader will generate a special kind of shader file; this type of shader is designed to use the GPU's parallelism to perform computations unrelated to the regular rendering pipeline

Compute Shader is outside the scope of this book, but you can refer to the introduction of the official document: Compute Shader

Since this book focuses on how to write vertex/fragment shaders in Unity, in subsequent studies we will usually use Unlit Shader to generate a basic vertex/fragment shader template.
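For orientation, here is a stripped-down sketch of roughly what such a template contains (fog support omitted; based on the common Unity 5-era template, so names and details may differ in your Unity version):

Shader "Unlit/NewUnlitShader" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        LOD 100

        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

            sampler2D _MainTex;
            float4 _MainTex_ST;

            v2f vert (appdata v) {
                v2f o;
                // Transform the vertex from object space to clip space
                o.vertex = UnityObjectToClipPos(v.vertex);
                // Apply the texture's tiling and offset settings to the UVs
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target {
                // No lighting: simply sample the texture and return its color
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}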

[Figure: a Unity Shader's Import Settings panel]
Select a Unity Shader in the Project view to see the Import Settings panel of the Unity Shader

The default texture used by the Unity Shader can be specified in Default Maps. When any material uses the Unity Shader for the first time, these textures will be automatically assigned to the corresponding properties.

In the lower panel, Unity displays information about the Unity Shader, such as whether it is a Surface Shader, whether it is a Fixed Function Shader, and so on. Some of this information relates to the tag settings inside the Unity Shader, such as whether it casts shadows, the render queue it uses, its LOD value, and so on.

ShaderLab

In Unity, all Unity Shaders are written using ShaderLab.

ShaderLab is a declarative language provided by Unity for writing Unity Shaders. It uses semantic blocks nested inside curly braces to describe the structure of a Unity Shader file. These structures contain much of the data required for rendering; for example, the Properties block defines the various properties the shader needs, and these properties will appear in the material panel.

By design, ShaderLab is similar to the CgFX and Direct3D Effects (.FX) languages, both of which define everything needed to display a material, not just shader code.

Shader "ShaderName" {
    
    
    Properties {
    
    
        // 属性 
    }
    SubShader {
    
    
        // 显卡 A 使用的子着色器 
    }
    SubShader {
    
    
        // 显卡 B 使用的子着色器 
    }    
    Fallback "VertexLi"
}

Shader naming

The ShaderName above can be replaced with a name you specify, for example:

Shader "Unlit/partice1"{
    
    }

This way, when a Shader is selected for a material in its configuration panel, this shader can be found under the Unlit submenu of the drop-down menu.

[Figure: the Shader drop-down menu in the material panel]

Shader properties

Properties are the bridge between material and Unity Shader

The Properties semantic block contains a series of properties (properties), which will appear in the material panel.

The basic definition of properties is:

Properties {
    Name ("display name", PropertyType) = DefaultValue
    Name ("display name", PropertyType) = DefaultValue
    // More properties
}

The developers declare these properties for easy adjustment of various material properties in the material panel.

If we need to access them in the shader, we use each property's name.

In Unity, the names of these properties usually start with an underscore. The display name is the name that appears on the material panel.

We need to specify a type (PropertyType) for each property; the common property types are shown in the table below. In addition, we also need to specify a default value for each property. When the Unity Shader is assigned to a material for the first time, these default values are shown on the material panel.

Property type | Default value syntax | Example
Int | number | _Int ("Int", Int) = 2
Float | number | _Float ("Float", Float) = 1.5
Range(min, max) | number | _Range ("Range", Range(0.0, 5.0)) = 3.0
Color | (number,number,number,number) | _Color ("Color", Color) = (1,1,1,1)
Vector | (number,number,number,number) | _Vector ("Vector", Vector) = (2, 3, 6, 1)
2D | "defaulttexture" {} | _2D ("2D", 2D) = "" {}
Cube | "defaulttexture" {} | _Cube ("Cube", Cube) = "white" {}
3D | "defaulttexture" {} | _3D ("3D", 3D) = "black" {}

For properties of numeric types such as Int, Float, and Range, the default value is a single number

For properties such as Color and Vector, the default value is a four-dimensional vector surrounded by parentheses

For the three texture types 2D, Cube, and 3D, the definition of the default value is slightly more complicated: the default value is specified by a string followed by a pair of curly braces, where the string is either empty or the name of a built-in texture, such as "white", "black", "gray", or "bump".

The curly braces were originally used to specify some texture properties. For example, in versions prior to Unity 5.0, we could control the generation of texture coordinates for the fixed-function pipeline through options such as TexGen CubeReflect and TexGen CubeNormal. In Unity 5.0 and later these options were removed; if we need similar functionality, we have to write the code that calculates the corresponding texture coordinates in the vertex shader ourselves.
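For example, the old TexGen CubeReflect behaviour can be approximated by computing a reflection direction per vertex ourselves. A hedged Cg sketch (the field name reflDir and the presence of v.normal in the vertex input are assumptions for illustration):

// In the vertex shader: compute a cube-map reflection coordinate manually
float3 worldNormal = UnityObjectToWorldNormal(v.normal);
float3 worldPos    = mul(unity_ObjectToWorld, v.vertex).xyz;
float3 viewDir     = normalize(worldPos - _WorldSpaceCameraPos.xyz);
o.reflDir          = reflect(viewDir, worldNormal);   // sample a samplerCUBE with this in the fragment shader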

Here is an example showing how all properties are defined

Shader "Custom/ShaderLabProperties" {
    
    
    Properties {
    
    
    	_MainTex ("Texture", 2D) = "white" {
    
    }
        // Numbers and Sliders
        _Int ("Int", Int) = 2
        _Float ("Float", Float) = 1.5
        _Range("Range", Range(0.0, 5.0)) = 3.0
        // Colors and Vectors
        _Color ("Color", Color) = (1,1,1,1)
        _Vector ("Vector", Vector) = (2, 3, 6, 1)
        // Textures
        _2D ("2D", 2D) = "" {
    
    }
        _Cube ("Cube", Cube) = "white" {
    
    }
        _3D ("3D", 3D) = "black" {
    
    }
    }

    FallBack "Diffuse"
}

After adding the properties of the above example in the shader, you can see it directly in the material that uses the shader:

[Figure: the material panel showing the declared properties]

The role of the Properties semantic block is only to allow these properties to appear in the material panel.

Sometimes, we want to display more types of variables on the material panel, such as using Boolean variables to control which calculation is used in the Shader. Unity allows us to override the default material editing panel to provide more custom data types.

Refer to the official manual: ShaderLab: Specifying a custom editor

SubShader

Each Unity Shader file can contain multiple SubShader semantic blocks, but there must be at least one. When Unity needs to load this Unity Shader, Unity will scan all SubShader semantic blocks, and then select the first SubShader that can run on the target platform. If none are supported, Unity will use the Unity Shader specified by Fallback semantics.

The reason Unity provides this mechanism is that different graphics cards have different capabilities. For example, some old graphics cards can only support a limited number of instructions, while more advanced cards can support more. We therefore want to use shaders of lower computational complexity on old cards, and computationally more complex shaders on advanced cards to get better image quality.

Definitions contained within a SubShader semantic block typically look like this:

SubShader {
    // Optional tags
    [Tags]

    // Optional render states
    [RenderSetup]

    Pass {
    }
    // Other Passes
}

A SubShader defines a series of Passes together with optional render state and tag settings.

Tags

Tags are key-value pairs whose keys and values are both strings. These key-value pairs are the communication bridge between the SubShader and the rendering engine; they tell Unity's renderer how and when we want this object to be rendered.

The format is as follows:

Tags { "TagName1" = "Value1" "TagName2" = "Value2" }

The SubShader tag types are as follows:

Tag type | Description | Example
Queue | Controls the render order by specifying which render queue the object belongs to. This way, all transparent objects can be rendered after all opaque objects (see Chapter 8 for details). We can also define custom render queues to control the rendering order of objects | Tags { "Queue" = "Transparent" }
RenderType | Classifies the shader, e.g. whether it is an opaque shader or a transparent one. This can be used by the Shader Replacement feature | Tags { "RenderType" = "Opaque" }
DisableBatching | Some SubShaders have problems with Unity's batching, e.g. when doing vertex animation using model-space coordinates (see Section 11.3 for details). This tag indicates whether batching should be disabled for the SubShader | Tags { "DisableBatching" = "True" }
ForceNoShadowCasting | Controls whether objects using this SubShader cast shadows (see Section 8.4 for details) | Tags { "ForceNoShadowCasting" = "True" }
IgnoreProjector | If the value is "True", objects using this SubShader are not affected by Projectors. Usually used for translucent objects | Tags { "IgnoreProjector" = "True" }
CanUseSpriteAtlas | Set this tag to "False" when the SubShader is intended for sprites | Tags { "CanUseSpriteAtlas" = "False" }
PreviewType | Indicates how the material panel previews the material. By default the material is shown on a sphere; we can change the preview type by setting this tag to "Plane" or "SkyBox" | Tags { "PreviewType" = "Plane" }

Note that the tags above can only be declared on the SubShader, not inside a Pass block. A Pass can also define tags, but they are a different set of tag types from those of the SubShader.
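As a small sketch (not from the book) of how several of these tags are commonly combined for a translucent object:

SubShader {
    Tags { "Queue" = "Transparent" "RenderType" = "Transparent" "IgnoreProjector" = "True" }
    // ...
}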

State settings

State settings are used to set various states of the graphics card; the common options are as follows.

[Figure: common render state options]
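A hedged summary of the most common ShaderLab state commands (the full option lists are in the Unity documentation):

Cull Back                          // which faces to cull: Back, Front, or Off
ZTest LEqual                       // depth-test condition: Less, Greater, LEqual, GEqual, Equal, NotEqual, Always
ZWrite On                          // whether to write to the depth buffer: On or Off
Blend SrcAlpha OneMinusSrcAlpha    // enable blending and set the blend factors (blending is off by default)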

Pass

Each Pass defines one complete rendering pass. If the number of Passes is too large, rendering performance often suffers, so we should try to use as few Passes as possible.

Render states and tags can also be declared inside a Pass. The difference is that the tag types used in a Pass are different from those used in a SubShader. For render state settings the syntax is the same; however, if they are set at the SubShader level, they apply to all of its Passes.

Pass semantic block looks like this

Pass {
    [Name]
    [Tags]
    [RenderSetup]
    // Other code
}

The method of defining the Pass name is as follows:

Name "MyPassName"

Other Unity Shaders can reuse this Pass through a directive of the form UsePass "MyShader/MYPASSNAME".

When using the UsePass command, you must use the uppercase name, because Unity will convert all Pass names to uppercase letters.

Tags and RenderSetup are written in the same way as in a SubShader (though the available tag types differ). In addition, fixed-function shader commands can also be used inside a Pass.

The tags that can be used inside a Pass are as follows:

[Figure: tag types usable inside a Pass]
In addition to the ordinary Pass definition above, Unity Shader also supports some special Passes for code reuse or for achieving more complex effects.

  • UsePass: as mentioned before, this directive reuses a Pass from another Unity Shader
  • GrabPass: this Pass grabs the screen contents and stores the result in a texture for use by subsequent Passes (see the sketch below)
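A minimal sketch of both (not from the book; the shader and pass names here are made up for illustration):

// In a shader named "MyShader": a named Pass that other shaders can reuse
Pass {
    Name "MYPASSNAME"
    // ... rendering code ...
}

// In another shader's SubShader: reuse that Pass, then grab the screen into a texture
UsePass "MyShader/MYPASSNAME"
GrabPass { "_GrabTexture" }   // subsequent Passes can sample _GrabTexture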

Fallback

Similar to the default in the switch statement of the C language.

With a Fallback, if none of the SubShaders above can run on the graphics card, the shader named by the Fallback is used instead.

If no Fallback is written and none of the SubShaders can run on the graphics card, the object is simply not handled; we are in effect saying we do not care.

In fact, Fallback also affects shadow casting. When rendering shadow maps, Unity looks for a shadow-casting Pass in each Unity Shader. Usually we do not need to implement such a Pass ourselves, because the built-in shaders referenced by Fallback contain a general-purpose one. It is therefore very important to set Fallback correctly for every Unity Shader.
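The two common forms, as a sketch:

Fallback "VertexLit"   // fall back to a built-in shader, which also supplies a shadow caster Pass
Fallback Off           // explicitly state that no fallback is wanted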

Other

There are some other semantics like:

  • Use CustomEditor semantics to extend the editing interface
  • Use Category semantics to group commands in the Unity Shader
