[Unity Shader] (1) Implementing the Lambert diffuse lighting model with per-vertex and per-pixel lighting

Using per-vertex lighting to achieve diffuse reflection in a Unity Shader

  • Diffuse formula - Lambert model

    First of all, you need to understand the most basic lighting model: diffuse reflection (Diffuse).

    The Lambert model:

        c_diffuse = (c_light * m_diffuse) * max(0, n · l)

    The Lambert model has 4 parameters:

  • The color and intensity of the incident light - c_light

  • The diffuse reflectance of the material - m_diffuse

  • The surface normal - n

  • The light source direction - l

To avoid a negative value from the dot product of n and l, we need the max operation. In Unity Shader, we can use the saturate(x) function to achieve the same effect.

Here x can be a scalar or a vector (float, float2, float3, and so on).
saturate clamps the incoming x to the range [0, 1]. If x is a vector, each component is clamped individually.
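
As a quick illustration, here is what that clamping does (a minimal Cg sketch; the values are made up for this example and are not part of the shader built later in this article):

fixed3 n     = fixed3(0, 1, 0);                 // an example unit surface normal
fixed3 l     = normalize(fixed3(1, -1, 0));     // an example light direction below the horizon
fixed  nDotL = dot(n, l);                       // about -0.707: the surface faces away from the light
fixed  byMax = max(0.0, nDotL);                 // 0.0 - the max(0, n · l) form from the formula
fixed  bySat = saturate(nDotL);                 // 0.0 - same result, since n · l of unit vectors never exceeds 1
fixed3 v     = fixed3(-0.2, 0.5, 1.7);
fixed3 perV  = saturate(v);                     // (0.0, 0.5, 1.0): each component clamped individually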

  • Lighting principle

Macroscopically, rendering can be divided into two parts:
1. Determine whether the pixel is visible;
2. Determine the lighting calculation of the pixel.

In this blog, we mainly discuss the second point - the calculation of lighting.

  • Some common questions

There are many formulas for calculating lighting. We use the Lambert model, but how should we understand the values of its variables? Why is part of the object in shadow? Why are some parts colored differently than others?

  • Q: Why do different parts of the surface end up lit with different colors?
    A: In Unity, assume the spacing between adjacent rays from a directional (parallel) light is d. At noon, when the light shines straight down onto the surface, the rays also hit the surface d apart. But when the light arrives at an oblique angle (in the morning or at dusk), the rays are still d apart, yet on the surface they land farther apart than d, so with a limited number of rays each unit of surface area receives less light. The figure below (omitted here) shows this more intuitively; a worked example follows it.
    (figure: spacing of parallel light rays hitting a surface at noon vs. at an oblique angle)
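
To make the geometry concrete, here is a small worked example (the numbers are chosen purely for illustration). Let θ be the angle between the light direction and the surface normal. If adjacent rays are d apart, then on the surface they land

    d_surface = d / cos θ

apart, so the light received per unit of surface area is proportional to cos θ = n · l. At noon, θ = 0°, cos θ = 1, and the surface receives the full intensity; at θ = 60°, cos θ = 0.5, the same rays are spread over twice the area, and the surface receives half the light.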

  • Q: How does the Lambert formula work in Shader?
    A: In the figure above, cos θ can be obtained from the dot product of the light source direction l and the surface normal n. In the Lambert model, the final term is exactly this dot product of the light direction and the surface normal. Combined with the following figure, this is easier to understand:

(figure: the three cases of the dot product n · l for different angles between the normal and the light direction)
(The picture is taken from Zhihu @俊明)

According to the figure above, there are three cases in the upper right corner, depending on the angle between the normal and the light direction. Combined with the Lambert formula, the dot product ultimately acts as a weight that controls the output color. When the dot product is less than 0, the weight is clamped to 0 (because it must stay in the interval [0, 1]), the diffuse color becomes 0, and that part of the surface appears dark.
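
Written as code, that weighting could look like the following (a minimal Cg sketch of the Lambert term; the function name and parameters are invented for this illustration and are not used in the shaders below):

// The dot product acts as a weight on the diffuse color:
//   n · l > 0 : the surface faces the light, weight in (0, 1]
//   n · l = 0 : the light just grazes the surface, weight 0
//   n · l < 0 : the surface faces away from the light, clamped to 0 -> the dark side
fixed3 LambertDiffuse(fixed3 lightColor, fixed3 diffuseColor, fixed3 n, fixed3 l)
{
    fixed weight = saturate(dot(n, l));
    return lightColor * diffuseColor * weight;
}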



Now let's start with the code.

  • Per-vertex lighting

1. Create an empty scene (only camera and directional light), delete the skybox in the scene.

2. Create a new material and name it Mat_Diffuse.

3. Create a new Unity Shader, name it Sha_DIffuse, and assign the Shader to Mat_Diffuse.

4. Create a new Capsule in the scene and assign Mat_Diffuse to it.


5. Save and start editing the Shader code:

① Delete the default code in the Shader created in step 3.

② To control the color of the Shader, we need to add a diffuse color property in Properties:

Properties{
	_Diffuse ("Diffuse", Color) = (1,1,1,1)
}

(This is similar to writing public Color _Diffuse; in a C# script.)


③ Define a Pass block in the SubShader, and specify the lighting mode of the Pass at its beginning:

SubShader{
	Pass{
		Tags { "LightMode" = "ForwardBase" }
	}
}

Only when LightMode is defined correctly can we get the built-in lighting variables later.
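
For reference, these are the two per-light built-in variables this article relies on once the pass is tagged ForwardBase (a short summary written as comments, not an exhaustive list):

// _LightColor0         (fixed4) : color * intensity of the main directional light
// _WorldSpaceLightPos0 (float4) : for a directional light, xyz is the direction toward the light and w = 0;
//                                 for point and spot lights, xyz is the world-space position and w = 1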



④ Declare the vertex and fragment shader functions and include Lighting.cginc (these lines go inside the CGPROGRAM / ENDCG block):

#pragma vertex vert
#pragma fragment frag
#include "Lighting.cginc"

⑤ Because the _Diffuse property is declared in Properties, we need to define a variable with a matching name and type in the CG code so the Shader can use it:

fixed4 _Diffuse;

(This is roughly analogous to the initialization _Diffuse = new fixed4(); in C#.)

Since the value range of _Diffuse is [0, 1], the fixed type is used. It is worth mentioning that the fixed type is actually very common in Unity Shaders.

Type    Precision (Cg/HLSL)
float   Highest-precision floating point. Usually stored in 32 bits.
half    Medium-precision floating point. Usually stored in 16 bits. Range roughly -60,000 to +60,000.
fixed   Lowest-precision floating point. Usually stored in 11 bits. Range roughly -2.0 to +2.0.

Tip: on PC there is not much difference between the three, because they are all treated as float. But on mobile platforms the priority is fixed > half > float, and float should be used as little as possible.
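
As a rough rule of thumb (a sketch of a common convention rather than a hard rule), the choice often looks like this:

fixed4 albedo;       // colors, normals, and other values that stay roughly within [-2, 2]
half3  viewDirWS;    // directions and intermediate lighting results that need a bit more range
float4 positionWS;   // world-space positions, and anything that needs full precision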


⑥ Define the input and output structures for the vertex and fragment shaders (the vertex shader's output structure is the fragment shader's input):

struct a2v {
    float4 vertex : POSITION;   // object-space vertex position
    float3 normal : NORMAL;     // object-space vertex normal (float3 is enough for a direction)
};
struct v2f {
    float4 pos : SV_POSITION;   // clip-space position
    fixed3 color : COLOR;       // per-vertex lit color, interpolated to the fragment shader
};

(SV stands for System Value; these system-value semantics were introduced in DirectX 10. SV_POSITION marks the clip-space vertex position that is handed to the rasterizer and ultimately mapped to the screen. If you develop for the PS4 platform, you must use SV_POSITION, otherwise the tessellation shaders will not work.)
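
For quick reference, these are the semantics that appear in this article and how they are used here (written as comments; the meanings are the standard Cg/HLSL ones):

// a2v (vertex input, filled in by Unity from the mesh data):
//   POSITION    - object-space vertex position
//   NORMAL      - object-space vertex normal
// v2f (vertex output / fragment input):
//   SV_POSITION - clip-space position consumed by the rasterizer
//   COLOR       - interpolated per-vertex color (used by the per-vertex version)
//   TEXCOORD0   - a general-purpose interpolator, used later for the world-space normal (per-pixel version)
// Fragment shader output:
//   SV_Target   - the color written to the render target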

⑦ Fill in the vertex shader:

v2f vert(a2v v) {
    v2f o;
    // Transform the vertex position from object space to clip space
    o.pos = UnityObjectToClipPos(v.vertex);
    // Get the ambient light term
    fixed3 ambient = UNITY_LIGHTMODEL_AMBIENT.xyz;

    // Transform the normal from object space to world space
    fixed3 worldNormal = normalize(mul(v.normal, (float3x3)unity_WorldToObject));
    // Get the light direction in world space (note: it points from the object toward the light)
    fixed3 worldLight = normalize(_WorldSpaceLightPos0.xyz);
    // The diffuse (Lambert) formula:
    fixed3 diffuse = _LightColor0.rgb * _Diffuse.rgb * saturate(dot(worldNormal, worldLight));
    o.color = ambient + diffuse;

    return o;
}

At this point we have the diffuse color _Diffuse and the vertex normal v.normal.
The intensity and color of the light come from _LightColor0, and the light direction from _WorldSpaceLightPos0. Using _WorldSpaceLightPos0 directly only works here because the scene contains a single directional light; in a more complex lighting environment it may not give the correct result.

So far all four values of the diffuse formula have been obtained. When computing the dot product, both vectors must be in the same coordinate space; here we take the dot product of the surface normal and the light direction in world space.
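
One more note on the normal transform: under non-uniform scaling, normals must be transformed by the inverse transpose of the object-to-world matrix to stay perpendicular to the surface. Putting the vector on the left side of mul is equivalent to multiplying by the transpose of the matrix, so mul(v.normal, (float3x3)unity_WorldToObject) gives exactly that inverse transpose. Two equivalent ways to write it (a sketch; UnityObjectToWorldNormal is Unity's built-in helper from UnityCG.cginc, so check it against your Unity version):

// 1. The form used in this article: the normal on the left of the world-to-object matrix
fixed3 nWS_a = normalize(mul(v.normal, (float3x3)unity_WorldToObject));
// 2. The built-in helper (defined in UnityCG.cginc) that performs the same transform
fixed3 nWS_b = UnityObjectToWorldNormal(v.normal);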

⑧ Implement the fragment shader:

Shader code:

fixed4 frag(v2f i) : SV_Target{
	return fixed4(i.color, 1.0);
}

⑨ Final result:

(screenshot: the Capsule rendered with per-vertex diffuse lighting)

The complete per-vertex shader:


Shader "LeonShader/shader_6_4_Diffuse"
{
    
    
    Properties{
    
    
        _Diffuse("Diffuse",Color) = (1,1,1,1)
    }
    SubShader{
    
    
        Pass{
    
    
            Tags {
    
     "LightMode" = "ForwardBase"}

            CGPROGRAM

            #pragma vertex vert
            #pragma fragment frag
            #include "Lighting.cginc"

            fixed4 _Diffuse;

            struct a2v {
    
    
                float4 vertex : POSITION;
                float4 normal : NORMAL;
            };
            struct v2f {
    
    
                float4 pos : SV_POSITION;
                fixed3 color : COLOR;
            };

            v2f vert(a2v v) {
    
    
                v2f o;
                //将定点左边从本地空间转变投影空间
                o.pos = UnityObjectToClipPos(v.vertex);
                //得到环境信息
                fixed3 ambient = UNITY_LIGHTMODEL_AMBIENT.xyz;

                //法线信息由物体空间转变为世界空间
                fixed3 worldNormal = normalize(mul(v.normal, (float3x3)unity_WorldToObject));
                //得到光源在世界坐标中的向量(注意这里是物体指向光源)
                fixed3 worldLight = normalize(_WorldSpaceLightPos0.xyz);
                //漫反射计算公式:
                fixed3 diffuse = _LightColor0.rgb * _Diffuse.rgb * saturate(dot(worldNormal, worldLight));
                o.color = ambient + diffuse;

                return o;
            }

            fixed4 frag(v2f i) : SV_Target{
    
    
                return fixed4(i.color , 1.0);
            }

            ENDCG
        }
    }
    FallBack "Diffuse"
}

Per-pixel lighting

Per-pixel lighting moves the diffuse calculation from the vertex shader to the fragment shader: the vertex shader only transforms the position and passes the world-space normal along through TEXCOORD0, while the fragment shader normalizes the interpolated normal and evaluates the Lambert term for every pixel, which gives a smoother result.

Shader code:


Shader "LeonShader/shader_6_4_Diffuse_Pixel"
{
    
    
    Properties{
    
    
        _Diffuse("Diffuse",Color) = (1,1,1,1)
    }
        SubShader{
    
    
            Pass{
    
    
                Tags {
    
     "LightMode" = "ForwardBase"}
                
                CGPROGRAM

                #pragma vertex vert
                #pragma fragment frag
                #include "Lighting.cginc"

                fixed4 _Diffuse;

                struct a2v {
    
    
                    float4 vertex : POSITION;
                    float4 normal : NORMAL;
                };
                struct v2f {
    
    
                    float4 pos : SV_POSITION;
                    float3 worldNormal : TEXCOORD0;
                };

                v2f vert(a2v v) {
    
    
                    v2f o;
                    //将定点左边从本地空间转变投影空间
                    o.pos = UnityObjectToClipPos(v.vertex);
                    //得到环境信息
                    fixed3 ambient = UNITY_LIGHTMODEL_AMBIENT.xyz;
                    //法线信息由物体空间转变为世界空间
                    o.worldNormal = mul(v.normal, (float3x3)unity_WorldToObject);

                    return o;
                }

                fixed4 frag(v2f i) : SV_Target{
    
    
                    fixed3 ambient = UNITY_LIGHTMODEL_AMBIENT.xyz;
                    fixed3 worldNormal = normalize(i.worldNormal);
                    fixed3 worldLightDir = normalize(_WorldSpaceLightPos0.xyz);
                    fixed3 diffuse = _LightColor0.rgb * _Diffuse.rgb * saturate(dot(worldNormal, worldLightDir));

                    fixed3 color = ambient + diffuse;

                    return fixed4(color , 1.0);
                }

                ENDCG
            }
    }
        FallBack "Diffuse"
}




  • Thoughts

Looking at the result image at the end of the article (the per-pixel version is not included in the comparison, because its result is already very smooth), you can clearly see obvious jagged artifacts along the shadow boundary of the per-vertex version, which is of course not what we want. So I made a sphere with 64 segments of latitude and longitude in Blender to compare with Unity's default Sphere.

As you can see from the picture below, the main cause of the jaggedness is actually that the mesh is not subdivided enough. In addition, the Sphere I imported (upper left corner) has smooth shading applied in Blender, which looks very natural.

(screenshot: Unity's default Sphere compared with a 64-segment sphere imported from Blender with smooth shading)

Original post: blog.csdn.net/weixin_46840974/article/details/123972167