Unity Shader Basics

1. How to make full use of Unity Shader to add luster to our game?

Materials and Unity Shaders;

In Unity, we need to use a Material and a Unity Shader together to achieve the desired effect. One of the most common workflows is:

(1) Create a material;

(2) Create a Unity Shader and assign it to the material created in the previous step;

(3) Assign the material to the object to be rendered;

(4) Adjust the properties of Unity Shader in the material panel to get satisfactory results.

2. Materials in Unity

A material in Unity must be combined with a GameObject's Mesh Renderer or Particle System component to take effect. The material determines what our game object looks like (this, of course, also requires the cooperation of a Unity Shader).

To create a material, we can select Assets -> Create -> Material from Unity's menu bar, or right-click -> Create -> Material in the Project view. Once created, a material can be assigned to an object.

3. Shader in Unity

(1) To distinguish them from the general shader concept discussed earlier, we collectively refer to the shader files in Unity as Unity Shaders. A Unity Shader is quite different from the rendering-pipeline shaders we mentioned before.

To create a new Unity Shader, we can select Assets -> Create -> Shader from Unity's menu bar, or right-click -> Create -> Shader in the Project view. Unity provides four Unity Shader templates to choose from: Standard Surface Shader, Unlit Shader, Image Effect Shader, and Compute Shader. The Standard Surface Shader generates a surface shader template that includes the standard lighting model; the Unlit Shader generates a basic vertex/fragment shader that contains no lighting; the Image Effect Shader provides a basic template for implementing various screen post-processing effects; finally, the Compute Shader generates a special kind of shader file designed to exploit the GPU's parallelism for computations unrelated to the normal rendering pipeline. In general, the Standard Surface Shader provides a typical implementation of a surface shader. Since we focus on how to write vertex/fragment shaders in Unity, we will usually use the Unlit Shader to generate a basic vertex/fragment shader template.

A Unity Shader on its own cannot do anything; it must be combined with a material to produce its magical "chemical reaction"! To do this, we select the Unity Shader we want to use in the drop-down menu at the top of the material panel. Once selected, the various properties exposed by that Unity Shader appear in the material panel. These properties can be colors, textures, floats, sliders, vectors, and so on. When we assign the material to an object in the scene, we can see the visual changes as we adjust these properties.

A Unity Shader is essentially a text file. Like many other assets in Unity, a Unity Shader has an Import Settings panel, which can be seen by selecting the Unity Shader in the Project view.

On this panel, Default Maps lets us specify the default textures used by this Unity Shader. When any material uses the Unity Shader for the first time, these textures are automatically assigned to the corresponding properties. In the panel below that, Unity displays information related to the Unity Shader, such as whether it is a Surface Shader or a Fixed Function Shader. Some of this information relates to the tag settings inside the Unity Shader, such as whether it casts shadows, the render queue used, LOD values, and so on.

In addition, the import panel of a Unity Shader also conveniently shows information such as the render queue (Render Queue) it uses, whether batching is disabled (Disable Batching), and its property list (Properties).

(2) The basis of Unity Shader: ShaderLab

"Any problem in computer science can be solved by adding a layer of abstraction." -----David Wheeler

Learning and writing shaders has always had a steep learning curve. Customizing a rendering effect usually means dealing with many files and settings, and the process can easily wear down a beginner's patience. Moreover, many details require developers to spend extra time to resolve.

To solve the above problems, Unity provides us with a layer of abstraction: Unity Shader. The way we deal with this abstraction layer is a language Unity provides specifically for Unity Shaders: ShaderLab.

What is ShaderLab?

Unity Shader is a high-level rendering abstraction layer provided by Unity for developers. Unity hopes to make it easier for developers to control rendering in this way.

In Unity, all Unity Shaders are written in ShaderLab. ShaderLab is a declarative language that Unity provides for writing Unity Shaders. It uses semantics (syntax) nested inside curly braces to describe the structure of a Unity Shader file. These structures contain much of the data required for rendering; for example, the Properties block defines the various properties required by the shader, and these properties appear in the material panel. By design, ShaderLab is similar to the CgFX and Direct3D Effects languages, which also define everything needed to display a material, not just the shader code.

The basic structure of a Unity Shader is as follows:

Shader "ShaderName"{
    Properties{
        //属性
    }
    SubShader{
        //显卡A使用的子着色器
    }
    SubShader{
        //显卡B使用的子着色器
    }
    Fallback "VertexLlit"
}

Behind the scenes, Unity compiles these structures into real code and shader files according to the target platform; developers only need to deal with the Unity Shader itself.

Structure of Unity Shader

Above we saw some ShaderLab semantics, such as Properties, SubShader, and Fallback. These semantics define the structure of a Unity Shader, helping Unity analyze the file for correct compilation. Below, we explain the meaning and usage of these basic semantics.

Give our Shader a name

The first line of each Unity Shader file needs to specify the name of the Unity Shader through Shader semantics. The name is defined by a string, such as "MyShader". These names appear in the drop-down list in the Material panel when selecting the Unity Shader to use for the material. By adding a slash ("/") to the string, you can control where the Unity Shader appears in the material panel. For example:
Shader "Custom/MyShader"{ }

Then the position of this Unity Shader in the material panel is: Shader -> Custom -> MyShader

The bridge between materials and Unity Shader: Properties

The Properties semantic block contains a series of properties (Property); these properties will appear in the material panel.

The definition of the Properties semantic block is usually as follows:

Properties {
    Name ("display name", PropertyType) = DefaultValue
    Name ("display name", PropertyType) = DefaultValue
    // more properties
}

Developers declare these properties so that the various material properties can be conveniently adjusted in the material panel. If we need to access a property in shader code, we use its name (Name); in Unity, these names conventionally start with an underscore. The display name is the name that appears in the material panel. We must specify a type (PropertyType) for each property; the common property types are shown in Table 3.1. We must also specify a default value for each property: when the Unity Shader is assigned to a material for the first time, these default values are shown in the material panel.

For numeric types such as Int, Float, and Range, the default value is a single number. For Color and Vector, the default value is a four-dimensional vector surrounded by parentheses. For the texture types 2D, Cube, and 3D, the default value is a bit more involved: it is specified by a string followed by curly braces, where the string is either empty or a built-in texture name such as "white", "black", "gray", or "bump". The curly braces were originally used to specify certain texture properties; we could control the texture-coordinate generation of the fixed pipeline through options such as TexGen CubeReflect and TexGen CubeNormal. In Unity 5.0 and later, however, these options were removed. If we need similar functionality, we have to write the corresponding texture-coordinate calculation in the vertex shader ourselves.

The following code gives an example showing all property types:

Shader "Custom/ShaderLabProperties"{
    Properties{
        //Numbers and  Sliders
        _Int ("Int",Int) =2
        _Float ("Float",Float) = 1.5
        _Range ("Range",Range(0.0,0.5)) = 3.0
        //Colors and Vectors
        _Color ("Color", Color) = (1,1,1,1)
        _Vector ("Vector", Vector) = (2,3,6,1)
        //Textures
        _2D ("2D",2D) = "" {}
        _Cube ("Cube", Cube) = "white" {}
        _3D ("3D", 3D) = "black" {}
    }
    FallBack "Diffuse"
}

To access these properties in shader code, we need to define variables in the CG snippet whose names and types match the properties. Note that even if we do not declare a property in the Properties semantic block, we can still define the variable directly in the CG snippet and pass its value to the shader through a script. Therefore, the role of the Properties semantic block is simply to make these properties appear in the material panel.
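As a sketch of this matching, the variables below pair with the properties declared in the example above (the CG types shown follow the usual convention; the exact type mapping is covered in detail later):

// Declared inside CGPROGRAM ... ENDCG; names must match the Properties block
fixed4 _Color;      // Color  -> fixed4/half4/float4
float4 _Vector;     // Vector -> float4
float _Float;       // Float/Range -> float/half/fixed
int _Int;           // Int    -> int
sampler2D _2D;      // 2D     -> sampler2D
samplerCUBE _Cube;  // Cube   -> samplerCUBE
sampler3D _3D;      // 3D     -> sampler3D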

(3) Heavyweight member: SubShader

Each Unity Shader file can contain multiple SubShader semantic blocks, but there must be at least one. When Unity needs to load the Unity Shader, it scans all SubShader semantic blocks and selects the first SubShader that can run on the target platform. If none of them is supported, Unity uses the Unity Shader specified by the Fallback semantic.

Unity provides this semantic because different graphics cards have different capabilities. For example, some older graphics cards only support a limited number of shader instructions, while more advanced cards support many more. We therefore want to use shaders of low computational complexity on the older cards, and more computationally complex shaders, with better visuals, on the advanced cards.

Definitions contained within a SubShader semantic block typically look like this:

SubShader {
    // optional
    [Tags]

    // optional
    [RenderSetup]

    Pass {
    }
    // other Passes
}

A SubShader defines a series of Passes along with optional state ([RenderSetup]) and tag ([Tags]) settings. Each Pass defines a complete rendering pass, but too many Passes will often hurt rendering performance, so we should use as few Passes as possible. States and tags can also be declared inside a Pass. The difference is that some tags in a SubShader are specific to it; in other words, the tag settings available in a SubShader differ from those used in a Pass. For state settings the syntax is the same, but settings made in the SubShader apply to all of its Passes.

State settings

ShaderLab provides a series of render-state setting commands that can set various states of the graphics card, such as whether to enable blending or depth testing. Table 3.2 shows the common render-state options in ShaderLab.

When these states are set in the SubShader block, they apply to all Passes. If that is not what we want (for example, in double-sided rendering we may want to render only the back faces by culling front faces in the first Pass, and render only the front faces by culling back faces in the second Pass), we can instead make these settings separately in each Pass semantic block.
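As a sketch of the double-sided case just mentioned, the Cull state can be set per Pass instead of in the SubShader (only the state-setting lines matter here; the shader code in each Pass is omitted):

SubShader {
    Pass {
        Cull Front   // cull front faces; this Pass renders only back faces
        // ...
    }
    Pass {
        Cull Back    // cull back faces; this Pass renders only front faces
        // ...
    }
}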

Tags for SubShaders

A SubShader's tags (Tags) are key-value pairs (Key/Value Pair); both the key and the value are strings. These key-value pairs are the communication bridge between the SubShader and the rendering engine: they tell Unity's rendering engine how and when the SubShader wants the object to be rendered.

The structure of the label is as follows:

Tags{"TagName1" = "Value" "TagName2" = "Value2" }

The label types supported by SubShader's label block are shown in Table 3.3.
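For example, a SubShader for a transparent object might set its tags as follows (Queue, IgnoreProjector, and RenderType are among the tag types listed in Table 3.3):

Tags { "Queue" = "Transparent" "IgnoreProjector" = "True" "RenderType" = "Transparent" }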

Note that the above tags can only be declared in a SubShader, not in a Pass block. A Pass block can also define tags, but those belong to a different tag type than the SubShader's. That is what we will cover next.

Pass semantic block

The semantics contained in the Pass semantic block are as follows:

Pass{
    [Name]
    [Tags]
    [RenderSetup]
    //other code
}

First, we can define the name of the Pass inside it, for example:

Name "MyPassName"

With this name, we can use ShaderLab's UsePass command to directly use Pass in other Unity Shaders. For example:

UsePass "MyShader/MYPASSNAME"

This improves code reuse. Note that because Unity converts all Pass names to uppercase, the name in a UsePass command must be given in uppercase.

Second, we can set render states for the Pass. The state settings of the SubShader also apply here. In addition to the state settings mentioned above, we can also use fixed-function shader commands in a Pass.

A Pass can also set tags, but they are different from the SubShader's tags. These tags likewise tell the rendering engine how we want to render the object. Table 3.4 shows the tag types used in a Pass.
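For example, a commonly used Pass tag is LightMode, which tells Unity what role this Pass plays in the rendering pipeline (ForwardBase is the base Pass of forward rendering):

Pass {
    Tags { "LightMode" = "ForwardBase" }
    // ...
}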

In addition to the ordinary Pass definitions above, Unity Shader also supports some special Passes for code reuse or more complex effects.

UsePass: As we mentioned before, you can use this command to reuse Pass in other Unity Shaders;

GrabPass: this Pass grabs the screen and stores the result in a texture for use by subsequent Passes.
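A minimal GrabPass sketch (the texture name _GrabTex is an arbitrary choice here; later Passes in the same shader can sample it):

SubShader {
    GrabPass { "_GrabTex" }   // grab the current screen into _GrabTex
    Pass {
        // sample _GrabTex here, e.g. for refraction-like effects
    }
}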

(4) Leave a way out: Fallback

After the SubShader semantic blocks there may be a Fallback instruction. It tells Unity: "If none of the SubShaders above can run on this graphics card, use this lowest-level shader instead!"

Its semantics are as follows:

Fallback "name"

//or

Fallback off

As mentioned above, we can tell Unity who this "lowest-level Unity Shader" is through a string. We can also turn Fallback off entirely, but doing so roughly means: "If a graphics card cannot run any of the SubShaders above, then don't bother with it!"

An example of using the Fallback statement is given below:

Fallback "VertexLit"

In fact, Fallback also affects shadow casting. When rendering shadow maps, Unity looks for a shadow-casting Pass in each Unity Shader. Usually we do not need to implement such a Pass ourselves, because the built-in shaders used by Fallback contain a general-purpose one. Therefore, setting Fallback correctly for each Unity Shader is very important.

(5) Does ShaderLab have other semantics?

In addition to the above semantics, there are some semantics that are not commonly used. For example, if we are not satisfied with Unity's built-in property types and want to customize the editing interface of the material panel, we can use the CustomEditor semantics to extend the editing interface. We can also use Category semantics to group commands in the Unity Shader.

4. The form of Unity Shader

Above, we talked about the structure of the Unity Shader file and the syntax of ShaderLab. Although Unity Shader can do many things (such as setting rendering state, etc.), its most important task is to specify the code required by various shaders. These shader codes can be written in the SubShader semantic block (the approach of the surface shader), or in the Pass semantic block (the approach of the vertex/fragment shader and the fixed function shader).

In Unity, we can use the following three forms to write Unity Shader. No matter which form is used, the real Shader code needs to be included in the ShaderLab semantic block, as follows:

Shader "MyShader" {
    Properties{
        //所需的各种属性    
    }
    SubShader{
        //真正意义上的Shader代码会出现在这里
        //表面着色器(Surface Shader)或者
        //顶点/片元着色器(Vertex/Fragment Shader)或者
        //固定函数着色器(Fixed Function Shader)
    }
    SubShader{
        //和上一个SubShader类似
    }
}

(1) Unity's darling: surface shaders

A surface shader (Surface Shader) is a type of shader code created by Unity itself. It requires very little code (Unity does a lot of work behind the scenes), but it is relatively expensive to render. It is essentially the same as the vertex/fragment shader described below: when a surface shader is provided to Unity, it is still converted to the corresponding vertex/fragment shader behind the scenes. We can think of the surface shader as Unity's higher-level abstraction over vertex/fragment shaders. Its value is that Unity handles a lot of lighting details for us, so that we do not need to worry about these "annoying things".

A very simple sample surface shader code is as follows:

Shader "Custom/Simple Surface Shader"{
    SubShader{
        Tags {"RenderType" =  "Opaque" }
        CGPROGRAM
        #pragma surface surf Lambert
        struct Input {
            float4 color :COLOR;
        };
        void surf (Input IN,inout SurfaceOutput o){
            o.Albedo = 1;
        }
        ENDCG
    }
    Fallback "Diffuse"
}

As can be seen from the above program, the surface shader is defined between CGPROGRAM and ENDCG in the SubShader semantic block (not the Pass semantic block). The reason is that the surface shader does not require developers to care about how many Passes are used, how to render each Pass, etc. Unity will do these things for us behind the scenes. All we have to do is tell it: "Hey, use these textures for color fill, use this normal texture for normal fill, use Lambert lighting model, and leave me alone!".

The code between CGPROGRAM and ENDCG is written in CG/HLSL; in other words, we embed the CG/HLSL language inside the ShaderLab language. Note that the CG/HLSL here is Unity's packaged version: its syntax is almost identical to standard CG/HLSL, but there are subtle differences. For example, some native functions and usages are not supported by Unity.

(2) The smartest child: vertex/fragment shader

In Unity we can use CG/HLSL language to write vertex/fragment shader (Vertex/Fragment Shader). They are more complex, but also more flexible.

A very simple vertex/fragment shader sample code is as follows:

Shader "Custom/Simple VertexFragment Shader" {
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            float4 vert(float4 v:POSITION):SV_POSITION{
                return mul(UNITY_MATRIX_MVP, v);
            }
            
            fixed4 frag():SV_Target {
                return fixed4(1.0,0.0,0.0,1.0);
            }
            ENDCG
        }
    }
}

Similar to the surface shader, the code of the vertex/fragment shader also needs to be defined between CGPROGRAM and ENDCG, but the difference is that the vertex/fragment shader is written in the Pass semantic block instead of SubShader. The reason is that we need to define the Shader code that each Pass needs to use. Although we may need to write more code, the benefit is a lot of flexibility. More importantly, we can control the implementation details of rendering. Also, the code between CGPROGRAM and ENDCG here is also written in CG/HLSL.

(3) Abandoned corner: fixed function shader

Both of the above forms of Unity Shader use the programmable pipeline. Some older devices (whose GPUs only support DirectX 7.0, OpenGL 1.5, or OpenGL ES 1.1), such as the iPhone 3, do not support programmable shaders, so we need fixed function shaders (Fixed Function Shader) to complete rendering on them. These shaders can usually only achieve very simple effects.

A very simple sample code for a fixed-function shader is as follows:

Shader "Tutorial/Basic" {
    Properties {    
        _Color ("Main color",Color ) = (1,0.5,0.5,1)
    }
    SubShader {
        Pass {
            Material {
                Diffuse [_Color]
            }
            Lighting On
        }
    }
}
    

It can be seen that the code of the fixed function shader is defined in the Pass semantic block, which is equivalent to some rendering settings in Pass, as we mentioned before.

Fixed function shaders must be written using ShaderLab's own syntax (that is, ShaderLab's render-state setting commands) instead of CG/HLSL.

Since most GPUs now support the programmable rendering pipeline, this fixed-pipeline style of programming has gradually been abandoned. In fact, since Unity 5.2, all fixed function shaders are compiled by Unity into corresponding vertex/fragment shaders behind the scenes, so fixed function shaders in the true sense no longer exist.

(4) Which Unity Shader form to choose

So, which one should we choose to write the Unity Shader? Here are some suggestions.

Unless you have a very specific need for fixed function shaders, such as running your game on very old devices (which are rare), use shaders with the programmable pipeline, that is, surface shaders or vertex/fragment shaders.

If you want to work with various light sources, you may prefer to use a surface shader, but you need to be careful about its performance on mobile platforms.

If you need to use very few lights, for example only one directional light, then using a vertex/fragment shader is a better choice.

Most importantly, if you have a lot of custom rendering effects, choose a vertex/fragment shader.

5. Common questions

Although a lot of the basics have been covered above, here we address some points that commonly confuse beginners and explain them.

(1) Unity Shader != real Shader

Note that a Unity Shader is not the same as the Shader discussed earlier, even though the name literally reads as "a Unity shader". In Unity, a Unity Shader actually refers to a ShaderLab file: a file on disk with the .shader extension.

In Unity Shader (or ShaderLab file), we can do much more than a traditional Shader.

  • In a traditional shader, we can only write one specific type of shader at a time, such as a vertex shader or a fragment shader. In a Unity Shader, we can include both the vertex shader and the fragment shader code in the same file.
  • In a traditional shader, we cannot configure render states such as whether to enable blending or depth testing; those are set by the developer elsewhere in the application code. In a Unity Shader, we can make these settings with a single command.
  • In a traditional shader, we need lengthy code to set up the shader's inputs and outputs and carefully handle how they correspond. In a Unity Shader, we only need to declare some properties in a specific block, and we can easily change them through the material. For the data carried by the model (such as vertex positions, texture coordinates, and normals), Unity Shader also provides direct access, so developers do not need to write code themselves to pass them to the shader.

Of course, along with the above advantages, Unity Shader also has some disadvantages. Due to its high level of encapsulation, the shader types and syntax we can write are limited. Unity's support for some shader types, such as tessellation shaders and geometry shaders, is poor, and some advanced shader syntax is not supported by Unity Shader either.

It can be said that Unity Shader provides a way for developers to control multiple stages in the rendering pipeline at the same time, not just providing Shader code. As developers, we only need to deal with the Unity Shader most of the time, and don't need to care about the underlying implementation details of the rendering engine.

(2) The relationship between Unity Shader and CG/HLSL

As we said before, Unity Shader is written in ShaderLab language, but for surface shader and vertex/fragment shader, we can nest CG/HLSL language inside ShaderLab to write these shader codes. This CG/HLSL code is nested between CGPROGRAM and ENDCG, just like the sample code we saw earlier. Since CG and DX9-style HLSL are almost the same language in terms of writing, CG and HLSL are equivalent in Unity. We can say that CG/HLSL code is another world different from ShaderLab.

Usually, the code fragment of CG is located inside the Pass semantic block, as shown below:

Pass {
    // tag and state settings for the Pass

    CGPROGRAM
    // compilation directives, for example:
    #pragma vertex vert
    #pragma fragment frag

    // CG code

    ENDCG
    // some other settings
}

Readers may wonder: "Didn't we say that in a surface shader, the CG/HLSL code is written in the SubShader semantic block, not in a Pass block?" Indeed it is. But remember that surface shaders are essentially vertex/fragment shaders: they only look different because Unity provides them as a layer of abstract encapsulation on top of vertex/fragment shaders. Behind the scenes, Unity still converts a surface shader into a multi-Pass vertex/fragment shader. We can click the Show generated code button in the Unity Shader's import settings panel to view the real generated vertex/fragment shader code. In essence, then, Unity Shader has only two forms: vertex/fragment shaders and fixed function shaders (and since Unity 5.2, fixed function shaders are also converted into vertex/fragment shaders behind the scenes, so essentially only vertex/fragment shaders exist in Unity).

Behind the convenience provided to programmers, the Unity editor compiles these CG snippets into low-level languages such as assembly. Usually, Unity compiles them automatically for all relevant platforms (here, "platforms" means rendering platforms such as Direct3D 9, OpenGL, Direct3D 11, and OpenGL ES). The compilation process is fairly involved, and Unity uses different compilers to convert the CG into code for each platform. This way, we do not need to recompile when switching platforms, and we get an error message immediately if the code fails on certain platforms.

But when the game is released, the game data file only contains the compiled code required by the target platform, and those parts of the code that are not required on the target platform are removed. For example, when releasing to the Mac OS X platform, the corresponding part of the DirectX code will be removed.

(3) Can we write in GLSL?

Sure. If you insist, "I don't want to write in CG/HLSL! I want to write in GLSL!", then the only platforms you can publish to are Mac OS X, platforms with OpenGL ES 2.0, and Linux; you give up PC, Xbox 360, and other platforms that only support DirectX.

Given that insistence on writing Unity Shaders in GLSL, how do you write one? Similar to how CG/HLSL is nested between CGPROGRAM and ENDCG, GLSL code is nested between GLSLPROGRAM and ENDGLSL.
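A minimal sketch of GLSL nested in ShaderLab (in Unity's GLSL snippets, the vertex and fragment parts share one block and are separated by the VERTEX and FRAGMENT macros):

Shader "Custom/GLSL Example" {
    SubShader {
        Pass {
            GLSLPROGRAM
            #ifdef VERTEX
            void main() {
                gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
            }
            #endif
            #ifdef FRAGMENT
            void main() {
                gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);  // solid red
            }
            #endif
            ENDGLSL
        }
    }
}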

Reference book "Introduction to Unity Shader Essentials"


Origin blog.csdn.net/weixin_43418880/article/details/127097320