"Unity Shader Essentials" Study Notes: Chapter 7, Basic Texture

These notes are for my own study, so they record only the points I consider important or am not yet familiar with.
Book author's blog: http://blog.csdn.net/candycat1992/article/

Chapter 7 Basic Texture

7.1 Single texture

When artists create a model, they often unwrap it in modeling software and store the texture-mapping coordinates on each vertex. Texture-mapping coordinates define the 2D position in the texture that corresponds to each vertex. These coordinates are usually represented by a two-dimensional variable (u, v), where u is the horizontal coordinate and v is the vertical coordinate, so texture-mapping coordinates are also called UV coordinates. When I was learning modeling, I often did this UV-unwrapping work.

7.1.1 Practice

We usually use a texture to replace an object's diffuse color.



Shader "Unity Shaders Book/Chapter 7/Single Texture"
{
    Properties
    {
        _Color("Color Tint",Color)=(1,1,1,1)
        //2D is the property type for textures. The default value is a string followed by curly braces; "white" is the name of a built-in texture, i.e. an all-white texture.
        _MainTex("Main Tex",2D)="white"{}
        _Specular("Specular",Color) = (1,1,1,1)
        _Gloss("Gloss",Range(8.0,256)) = 20
    }
        SubShader
        {


            Pass
            {
                Tags{"LightMode" = "ForwardBase"}

                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag


                #include "Lighting.cginc"

	            fixed4 _Color;
	            fixed4 _Specular;
	            //Declaring a variable named <TextureName>_ST gives us the texture's tiling and offset values.
	            //_MainTex_ST.xy stores the scale and _MainTex_ST.zw stores the offset.
	            float4 _MainTex_ST;
	            sampler2D _MainTex;
	            float _Gloss;
	
	            struct a2v
	            {
	                float4 vertex : POSITION;
	                float3 normal : NORMAL;
	                //Stores the model's first set of texture coordinates
	                float4 texcoord : TEXCOORD0;
	            };
	
	            struct v2f
	            {
	                float4 pos : SV_POSITION;
	                float3 worldNormal : TEXCOORD0;
	                float3 worldPos : TEXCOORD1;
	                //Stores the texture coordinates
	                float2 uv : TEXCOORD2;
	            };
	
	            
	
	            v2f vert (a2v v)
	            {
	                v2f o;
	                o.pos = UnityObjectToClipPos(v.vertex);
	                o.worldNormal = UnityObjectToWorldNormal(v.normal);
	                o.worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
	                // Transform the vertex texture coordinates with the texture's _MainTex_ST property: scale first, then translate.
	                o.uv = v.texcoord.xy * _MainTex_ST.xy + _MainTex_ST.zw;
	                //Or use the built-in macro directly: o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
	                //The first argument is the texture coordinate, the second is the texture name.
	                return o;
	            }
	
	            fixed4 frag(v2f i) : SV_Target
	            {
	                fixed3 worldNormal = normalize(i.worldNormal);
	                fixed3 worldLightDir = normalize(UnityWorldSpaceLightDir(i.worldPos));
	
	                //Sample the texture with CG's tex2D function: the first argument is the texture being sampled, the second is a float2 texture coordinate; it returns the computed texel value.
	                //The result multiplied by the color _Color is used as the material's albedo.
	                fixed3 albedo = tex2D(_MainTex, i.uv).rgb * _Color.rgb;
	
					//Multiplying the material's albedo by the ambient light gives the ambient term; the lines below follow the same pattern.
	                fixed3 ambient = UNITY_LIGHTMODEL_AMBIENT.xyz * albedo;
	
	                fixed3 diffuse = _LightColor0.rgb * albedo * max(0, dot(worldNormal, worldLightDir));
	
	                fixed3 viewDir = normalize(UnityWorldSpaceViewDir(i.worldPos));
	                fixed3 halfDir = normalize(worldLightDir + viewDir);
	                fixed3 specular = _LightColor0.rgb * _Specular.rgb * pow(max(0, dot(worldNormal, halfDir)), _Gloss);
	
	                return fixed4(ambient + diffuse + specular, 1.0);
	            }
	            ENDCG
        }
    }
            Fallback "Specular"
}

7.1.2 Texture properties

After importing a texture into Unity, we can adjust its attributes in its inspector panel, as shown in the figure:
[figure omitted]
The first attribute in the texture panel is Texture Type. We choose the appropriate type for an imported texture because only then can Unity know our intent, pass the correct texture to the Unity Shader, and, in some cases, optimize the texture.
Wrap Mode determines how texture coordinates are tiled when they fall outside the [0, 1] range. Wrap Mode has two modes:
one is Repeat: in this mode, if a texture coordinate exceeds 1, its integer part is discarded and the fractional part is used directly for sampling, so the texture repeats;
the other is Clamp: in this mode, coordinates greater than 1 are clamped to 1, and coordinates less than 0 are clamped to 0.
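The two wrap rules can be sketched for a single coordinate in Python (hypothetical helper names, not Unity API; Repeat keeps the fractional part, Clamp pins to the nearest edge):

```python
import math

def wrap_repeat(u):
    # Repeat: discard the integer part and sample with the fraction
    return u - math.floor(u)

def wrap_clamp(u):
    # Clamp: coordinates outside [0, 1] stick to the nearest edge
    return min(max(u, 0.0), 1.0)

print(wrap_repeat(1.25))   # the texture tiles: samples at 0.25
print(wrap_clamp(1.25))    # the edge texel stretches: samples at 1.0
print(wrap_repeat(-0.25))  # negative coordinates also wrap around
```

Note that Repeat handles negative coordinates too: -0.25 wraps to 0.75, which is why a scrolling texture tiles seamlessly in both directions.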
The following figure shows the effect of tiling a texture in the two modes:
[figure omitted]
The next attribute is Filter Mode, which determines which filtering mode is used when the texture is stretched by a transform.
Filter Mode supports 3 modes: Point, Bilinear, and Trilinear. Their filtering quality improves in that order, but so does their performance cost. Texture filtering affects the image quality when a texture is magnified or minified. It is like enlarging a picture until it becomes a mosaic of blocks: the better the filtering, the less blocky the result looks.
[figure omitted]
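As a rough 1D sketch (not Unity's actual sampler), Point picks the single nearest texel while Bilinear blends neighboring texels, which is why Bilinear looks smoother when magnified:

```python
def point_sample(tex, u):
    # Point filtering: return the single nearest texel
    n = len(tex)
    return tex[min(int(u * n), n - 1)]

def linear_sample(tex, u):
    # Linear filtering (1D analogue of Bilinear): blend the two nearest texels
    n = len(tex)
    x = min(max(u * n - 0.5, 0.0), n - 1.0)  # clamp to valid texel centers
    i0 = int(x)
    i1 = min(i0 + 1, n - 1)
    t = x - i0
    return tex[i0] * (1.0 - t) + tex[i1] * t

tex = [0.0, 1.0]                # a 2-texel black-to-white texture
print(point_sample(tex, 0.5))   # hard jump to the nearer texel
print(linear_sample(tex, 0.5))  # smooth interpolation between texels
```

Trilinear goes one step further and additionally blends between two mipmap levels, which is not modeled here.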

7.2 Bump mapping

7.2.1 Height texture

The first technique uses a height map to implement bump mapping. A height map stores intensity values that represent the local elevation of the model surface: the lighter the color, the more that position bulges outward, and the darker the color, the more it recesses inward (this matches art intuition: raised parts of a surface catch the light, so the effect reads clearly).
The advantage of this method is that it is very intuitive: we can see the bumpiness of a model surface directly from the height map. The disadvantage is that the computation is more expensive: the surface normal cannot be obtained directly in real-time rendering but must be calculated from the grayscale values of the pixels, which costs more performance. The following is a height map:
[figure omitted]

7.2.2 Normal texture

A normal texture, by contrast, stores the surface normal direction. Since the components of a normal range over [-1, 1] while the components of a pixel range over [0, 1], we need a mapping. The usual mapping is:

    pixel = (normal + 1) / 2
The normals of the model's vertices are defined in model space, so a straightforward idea is to store the perturbed model-space surface normals in a texture. This kind of texture is called an object-space normal map.
In actual production, another coordinate space is often used: the normal is stored in the tangent space of each model vertex. Every vertex of the model has its own tangent space. The origin of this tangent space is the vertex itself; the z-axis is the vertex's normal direction (n); the x-axis is the vertex's tangent direction (t); and the y-axis is obtained from the cross product of the normal and the tangent, and is called the bitangent (b) or binormal, as shown in the figure:
[figure omitted]
This kind of texture is called a tangent-space normal map. The following figure shows normal textures in model space and in tangent space respectively:
[figure omitted]
A tangent-space normal texture looks almost entirely light blue. This is because the coordinate space of each normal direction is different: each is expressed in the tangent space of its own point on the surface. This kind of normal texture actually stores, for each point, the normal's perturbation direction in that point's own tangent space.
In modern computer graphics, RGB is the most common color encoding, and a normal map stores the normal vector as a color on the map: the R channel stores the X value, the G channel the Y value, and the B channel the Z value.
In other words, if the normal direction at a point is unchanged, then in its tangent space the new normal is the z-axis direction, i.e. (0, 0, 1), which after mapping is stored in the texture as RGB(0.5, 0.5, 1), a light blue (because the RGB interval is [0, 1] while the normal interval is [-1, 1]).
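The mapping and its inverse (the inverse is what UnpackNormal effectively undoes for an uncompressed normal map) can be sketched in Python with hypothetical helper names:

```python
def pack_normal(normal):
    # [-1, 1] -> [0, 1]: pixel = (normal + 1) / 2
    return tuple((c + 1.0) * 0.5 for c in normal)

def unpack_normal(pixel):
    # [0, 1] -> [-1, 1]: normal = pixel * 2 - 1
    return tuple(c * 2.0 - 1.0 for c in pixel)

# An unperturbed tangent-space normal stores as the familiar light blue:
print(pack_normal((0.0, 0.0, 1.0)))    # (0.5, 0.5, 1.0)
print(unpack_normal((0.5, 0.5, 1.0)))  # (0.0, 0.0, 1.0)
```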

For readers new to this, the meaning of a normal texture may still be hard to grasp. In my own words: a normal texture is not a real surface texture; it stores, for each point, a normal direction in that point's own tangent space. This normal is not the true normal of the point on the model but a normal direction used to compute a simulated bumpy lighting effect. When I was learning modeling, it was always emphasized that a model should not have too many polygons, or it will hurt game performance. So how do we still achieve realistic lighting in a game? By using textures to simulate a bumpy surface.
Take a brick wall, for example: its model is generally a very flat plane. Suppose we want to render the bump of each brick; consider a single brick. Its left edge bulges to the left, that is, the surface there tilts to the left. So at the corresponding position in the normal texture, the stored normal direction is offset to the left by some angle from the normal perpendicular to the wall. When lighting is computed using this normal information, the brick appears to bulge.

Using tangent space has more advantages:

  1. Higher degree of freedom. A model-space normal texture records absolute normal information and can only be used on the model it was created for; applied to another model, the result is completely wrong. A tangent-space normal texture records relative normal information, which means that even if the texture is applied to a completely different mesh, a reasonable result is obtained (this is how reusable materials in game engines work).
  2. UV animation is possible. For example, we can scroll a texture's UV coordinates to create a moving-bumps effect, but using a model-space normal texture would give completely wrong results, for the same reason as above. This kind of UV animation is often used on objects such as water or volcanic lava.
  3. Normal textures can be reused. For example, for a brick-shaped box, a single normal texture can serve all 6 faces, again for the same reason.
  4. They can be compressed. Since the z component of a normal in a tangent-space normal texture is always positive, we can store only the xy components and derive z. A normal in a model-space normal texture can point in any direction, so all 3 components must be stored and cannot be compressed this way.

7.2.3 Practice

Since the normals stored in a normal texture are directions in tangent space, we usually have two options:
one option is to perform the lighting calculation in tangent space; in that case we need to transform the light direction and the view direction into tangent space;
the other option is to perform the lighting calculation in world space; in that case we need to transform the sampled normal direction into world space, then compute with the light and view directions in world space.
In terms of efficiency, the first method is usually better than the second, but in terms of generality, the second is better than the first. This section implements both methods in turn.

1. Calculate in tangent space

The basic idea: in the fragment shader, obtain the tangent-space normal via texture sampling, then compute with the tangent-space view direction and light direction to get the final lighting result.
For this, we first need to transform the view direction and light direction from model space to tangent space in the vertex shader; that is, we need the transformation matrix from model space to tangent space. The inverse of this matrix, the transformation from tangent space to model space, is very easy to obtain: in the vertex shader we arrange the tangent (x-axis), the bitangent (y-axis), and the normal (z-axis) as columns to get it (see Section 4.6.2 for the mathematical details). Since these three axes are orthogonal unit vectors, the model-to-tangent matrix we want is simply its transpose, i.e. the same three vectors arranged as rows.
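That linear algebra can be sketched in Python (a hand-picked orthonormal basis, not data from a real mesh): arranging the three axes as rows gives the model-to-tangent rotation, and because the basis is orthonormal, its inverse is just the transpose.

```python
def mat_vec(rows, v):
    # Multiply a 3x3 matrix, stored as a list of rows, by a vector
    return tuple(sum(row[i] * v[i] for i in range(3)) for row in rows)

def transpose(rows):
    return [tuple(rows[r][c] for r in range(3)) for c in range(3)]

# Tangent-space axes expressed in model space (orthonormal by construction)
tangent, binormal, normal = (1.0, 0.0, 0.0), (0.0, 0.0, -1.0), (0.0, 1.0, 0.0)

# Rows t/b/n: model space -> tangent space (what TANGENT_SPACE_ROTATION builds)
rotation = [tangent, binormal, normal]

# The vertex normal, a model-space direction, lands on tangent-space +z:
print(mat_vec(rotation, normal))                       # (0.0, 0.0, 1.0)
# The transpose (columns t/b/n) maps tangent-space +z straight back:
print(mat_vec(transpose(rotation), (0.0, 0.0, 1.0)))   # (0.0, 1.0, 0.0)
```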





Shader "Unity Shaders Book/Chapter 7/Normal Map In Tangent Space"
{
    Properties
    {
        _Color("Color Tint",Color)=(1,1,1,1)
        //2D is the property type for textures. The default value is a string followed by curly braces; "white" is the name of a built-in texture, i.e. an all-white texture.
        _MainTex("Main Tex",2D)="white"{}
        //For the normal texture _BumpMap, "bump" is used as its default. bump is Unity's built-in normal texture; when no normal texture is provided, bump corresponds to the model's own vertex normals.
        _BumpMap("Normal Map",2D)="bump"{}
        //_BumpScale controls the degree of bumpiness; when it is 0, the normal texture has no effect on lighting.
        _BumpScale("Bump Scale",Float)=1.0
        _Specular("Specular",Color) = (1,1,1,1)
        _Gloss("Gloss",Range(8.0,256)) = 20
    }
    SubShader
    {


        Pass
        {
            Tags{"LightMode" = "ForwardBase"}

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag


            #include "Lighting.cginc"

            fixed4 _Color;
            fixed4 _Specular;
            //Declaring a variable named <TextureName>_ST gives us the texture's tiling and offset values.
            //_MainTex_ST.xy stores the scale and _MainTex_ST.zw stores the offset.
            float4 _MainTex_ST;
            sampler2D _MainTex;
            sampler2D _BumpMap;
            float4 _BumpMap_ST;
            float _BumpScale;
            float _Gloss;

            struct a2v
            {
                float4 vertex : POSITION;
                float3 normal : NORMAL;
                //Tangent space is a coordinate space built from the vertex normal and tangent, so we need the vertex's tangent data.
                //Unlike the normal direction, tangent is a float4 rather than a float3,
				//because tangent.w is needed to decide the direction of the bitangent axis in tangent space.
                float4 tangent : TANGENT;
                float4 texcoord : TEXCOORD0;
            };

            struct v2f
            {
                float4 pos : SV_POSITION;
                //Two textures are used, so two sets of texture coordinates must be stored; uv is therefore declared as float4.
				//The xy components store _MainTex's texture coordinates and the zw components store _BumpMap's.
				float4 uv : TEXCOORD0;
				float3 lightDir : TEXCOORD1;
				float3 viewDir : TEXCOORD2;
            };

            

            v2f vert (a2v v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);

                // Transform the vertex texture coordinates with the textures' _ST properties: scale first, then translate.
                o.uv.xy = v.texcoord.xy * _MainTex_ST.xy + _MainTex_ST.zw;
                o.uv.zw = v.texcoord.xy * _BumpMap_ST.xy + _BumpMap_ST.zw;
                //Or use the built-in macro directly: o.uv.xy = TRANSFORM_TEX(v.texcoord, _MainTex);
                //The first argument is the texture coordinate, the second is the texture name.
                
                //Compute the bitangent
		//		float3 binormal = cross( normalize(v.normal), normalize(v.tangent.xyz) ) * v.tangent.w;
				//Arrange the model-space tangent, bitangent, and normal directions as rows to get the model-to-tangent transformation matrix, rotation
		//		float3x3 rotation = float3x3(v.tangent.xyz, binormal, v.normal);
				//Or use the built-in macro to compute rotation directly
				TANGENT_SPACE_ROTATION; 
				
				//Get the light direction in tangent space
				o.lightDir = mul(rotation, ObjSpaceLightDir(v.vertex)).xyz;
				//Get the view direction in tangent space
				o.viewDir = mul(rotation, ObjSpaceViewDir(v.vertex)).xyz;

                
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {

                fixed3 tangentLightDir = normalize(i.lightDir);
				fixed3 tangentViewDir = normalize(i.viewDir);

                //Sample the normal texture with tex2D: the first argument is the texture being sampled, the second is the texture coordinate; it returns the mapped pixel value of the normal
                fixed4 packedNormal = tex2D(_BumpMap, i.uv.zw);
                fixed3 tangentNormal;

				// If the normal texture's type has not been set to "Normal map" in Unity, the mapping must be undone manually
//				tangentNormal.xy = (packedNormal.xy * 2 - 1) * _BumpScale;
//				tangentNormal.z = sqrt(1.0 - saturate(dot(tangentNormal.xy, tangentNormal.xy)));
				
				// Or mark the texture as "Normal map" and use the built-in function
				//UnpackNormal returns the correct normal direction
				tangentNormal = UnpackNormal(packedNormal);
				//Use _BumpScale to control the degree of bumpiness
				tangentNormal.xy *= _BumpScale;
				//Since normals are unit vectors, tangentNormal.z can be derived from tangentNormal.xy
				tangentNormal.z = sqrt(1.0 - saturate(dot(tangentNormal.xy, tangentNormal.xy)));
				
				fixed3 albedo = tex2D(_MainTex, i.uv.xy).rgb * _Color.rgb;
				
				fixed3 ambient = UNITY_LIGHTMODEL_AMBIENT.xyz * albedo;
				
				//All of the calculations below are done in tangent space
				fixed3 diffuse = _LightColor0.rgb * albedo * max(0, dot(tangentNormal, tangentLightDir));

				fixed3 halfDir = normalize(tangentLightDir + tangentViewDir);
				fixed3 specular = _LightColor0.rgb * _Specular.rgb * pow(max(0, dot(tangentNormal, halfDir)), _Gloss);

	
	                return fixed4(ambient + diffuse + specular, 1.0);
            }
            ENDCG
    	}
    }
            Fallback "Specular"
}

The effect is as follows:
[figure omitted]

2. Calculate in world space

The basic idea of this method is to compute the tangent-to-world transformation matrix in the vertex shader and pass it to the fragment shader; the fragment shader then only needs to transform the normal direction sampled from the normal texture from tangent space to world space.
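The row-packing trick can be sketched in Python (toy axis values, hypothetical helper): the shader stores the matrix's rows in TtoW0/1/2.xyz, so transforming the normal is one dot product per row, exactly as the fragment shader does.

```python
def dot3(a, b):
    return sum(x * y for x, y in zip(a, b))

# World-space tangent, binormal and normal axes (toy orthonormal values)
world_tangent, world_binormal, world_normal = (1.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 1.0, 0.0)

# Pack the columns t/b/n into three rows, as TtoW0/1/2.xyz in the v2f struct
t2w_rows = [
    (world_tangent[0], world_binormal[0], world_normal[0]),
    (world_tangent[1], world_binormal[1], world_normal[1]),
    (world_tangent[2], world_binormal[2], world_normal[2]),
]

# Transform a tangent-space normal to world space: one dot product per row
bump = (0.0, 0.0, 1.0)  # the unperturbed tangent-space normal
world_bump = tuple(dot3(row, bump) for row in t2w_rows)
print(world_bump)  # lands exactly on the world-space normal axis
```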



Shader "Unity Shaders Book/Chapter 7/Normal Map In WorldSpace"
{
    Properties
    {
        _Color("Color Tint",Color)=(1,1,1,1)
        //2D is the property type for textures. The default value is a string followed by curly braces; "white" is the name of a built-in texture, i.e. an all-white texture.
        _MainTex("Main Tex",2D)="white"{}
        //For the normal texture _BumpMap, "bump" is used as its default. bump is Unity's built-in normal texture; when no normal texture is provided, bump corresponds to the model's own vertex normals.
        _BumpMap("Normal Map",2D)="bump"{}
        //_BumpScale controls the degree of bumpiness; when it is 0, the normal texture has no effect on lighting.
        _BumpScale("Bump Scale",Float)=1.0
        _Specular("Specular",Color) = (1,1,1,1)
        _Gloss("Gloss",Range(8.0,256)) = 20
    }
    SubShader
    {


        Pass
        {
            Tags{"LightMode" = "ForwardBase"}

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag


            #include "Lighting.cginc"

            fixed4 _Color;
            fixed4 _Specular;
            //Declaring a variable named <TextureName>_ST gives us the texture's tiling and offset values.
            //_MainTex_ST.xy stores the scale and _MainTex_ST.zw stores the offset.
            float4 _MainTex_ST;
            sampler2D _MainTex;
            sampler2D _BumpMap;
            float4 _BumpMap_ST;
            float _BumpScale;
            float _Gloss;

            struct a2v
            {
                float4 vertex : POSITION;
                float3 normal : NORMAL;
                //Tangent space is a coordinate space built from the vertex normal and tangent, so we need the vertex's tangent data.
                //Unlike the normal direction, tangent is a float4 rather than a float3,
				//because tangent.w is needed to decide the direction of the bitangent axis in tangent space.
                float4 tangent : TANGENT;
                float4 texcoord : TEXCOORD0;
            };

            struct v2f
            {
                float4 pos : SV_POSITION;
                //Two textures are used, so two sets of texture coordinates must be stored; uv is therefore declared as float4.
				//The xy components store _MainTex's texture coordinates and the zw components store _BumpMap's.
				float4 uv : TEXCOORD0;
				//Tangent-to-world transformation matrix
				//An interpolation register can hold at most a float4, so the 3x4 matrix needs three float4 variables.
				float4 TtoW0 : TEXCOORD1;
				float4 TtoW1 : TEXCOORD2;
				float4 TtoW2 : TEXCOORD3; 
            };

            

            v2f vert (a2v v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);

                // Transform the vertex texture coordinates with the textures' _ST properties: scale first, then translate.
                o.uv.xy = v.texcoord.xy * _MainTex_ST.xy + _MainTex_ST.zw;
                o.uv.zw = v.texcoord.xy * _BumpMap_ST.xy + _BumpMap_ST.zw;

				//Compute the world-space vertex tangent, bitangent, and normal vectors (the x, y, z axes of tangent space)
				//worldPos must be float3, not fixed3: fixed precision cannot hold world positions
 				float3 worldPos = mul(unity_ObjectToWorld,v.vertex).xyz;
 				fixed3 worldNormal = UnityObjectToWorldNormal(v.normal);
 				fixed3 worldTangent = UnityObjectToWorldDir(v.tangent.xyz);
 				fixed3 worldBinormal = cross(worldNormal,worldTangent)*v.tangent.w;

				//Arrange them as columns to get the tangent-to-world transformation matrix; the world position is tucked into the w components
				o.TtoW0 = float4(worldTangent.x, worldBinormal.x, worldNormal.x, worldPos.x);
				o.TtoW1 = float4(worldTangent.y, worldBinormal.y, worldNormal.y, worldPos.y);
				o.TtoW2 = float4(worldTangent.z, worldBinormal.z, worldNormal.z, worldPos.z);

                
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {

                //Get the world-space position
				float3 worldPos = float3(i.TtoW0.w,i.TtoW1.w,i.TtoW2.w);
				 
				//Compute the world-space light and view directions
				fixed3 lightDir = normalize(UnityWorldSpaceLightDir(worldPos));
				fixed3 viewDir = normalize(UnityWorldSpaceViewDir(worldPos));

                //Sample the normal texture with tex2D: the first argument is the texture being sampled, the second is the texture coordinate
                //UnpackNormal decodes the texture and returns the correct normal direction
                fixed3 bump = UnpackNormal(tex2D(_BumpMap, i.uv.zw));

				//Use _BumpScale to control the degree of bumpiness
				bump.xy *= _BumpScale;
				
				//Since normals are unit vectors, bump.z can be derived from bump.xy
				bump.z = sqrt(1.0 - saturate(dot(bump.xy, bump.xy)));
				//Transform the normal from tangent space to world space: one dot product per matrix row.
				bump = normalize(half3(dot(i.TtoW0.xyz,bump),dot(i.TtoW1.xyz, bump), dot(i.TtoW2.xyz, bump))); 
				
				fixed3 albedo = tex2D(_MainTex, i.uv.xy).rgb * _Color.rgb;
				
				fixed3 ambient = UNITY_LIGHTMODEL_AMBIENT.xyz * albedo;
				
				//All of the calculations below are done in world space
				fixed3 diffuse = _LightColor0.rgb * albedo * max(0, dot(bump, lightDir));

				fixed3 halfDir = normalize(bump + viewDir);
				fixed3 specular = _LightColor0.rgb * _Specular.rgb * pow(max(0, dot(bump, halfDir)), _Gloss);

	
	                return fixed4(ambient + diffuse + specular, 1.0);
            }
            ENDCG
    }
    }
            Fallback "Specular"
}

The effect is the same as above.

7.3 Gradient texture

In the previous diffuse lighting calculations, we took the dot product of the surface normal and the light direction and multiplied it by the material's albedo to get the surface's diffuse lighting. But sometimes we need more flexible control over the lighting result.
In a 1998 paper, Gooch et al. proposed a shading technique based on cool-to-warm tones to achieve an illustration-style rendering. With this technique, the object's silhouette stays more pronounced than with the traditional diffuse lighting used before, and it can provide a variety of color changes.
As shown in the figure below, different gradient textures are used to control diffuse lighting. The gradient texture used in each picture is given in the lower left corner.
[figure omitted]

Shader "Unity Shaders Book/Chapter 7/Ramp Texture"
{
    Properties
    {
        _RampTex ("Ramp Tex", 2D) = "white" {}
        _Color ("Color Tint",Color) = (1,1,1,1)
        _Specular ("Specular",Color) = (1,1,1,1)
        _Gloss("Gloss",Range(8.0,256)) = 20
        
    }
    SubShader
    {

        Pass
        {
            Tags { "LightMode"="ForwardBase" }
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "Lighting.cginc"
            
            sampler2D _RampTex;
            float4 _RampTex_ST;
            fixed4 _Color;
            fixed4 _Specular;
            float _Gloss;
            

            struct a2v
            {
                float4 vertex : POSITION;
                float3 normal : NORMAL;
                float4 texcoord : TEXCOORD0;
                
            };

            struct v2f
            {
                float3 worldPos : TEXCOORD0;
                float4 pos : SV_POSITION;
                float3 worldNormal : TEXCOORD1;
                float2 uv : TEXCOORD2;
            };



            v2f vert (a2v v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                
                o.worldNormal = UnityObjectToWorldNormal(v.normal);
                o.worldPos = mul(unity_ObjectToWorld,v.vertex).xyz;
                
                //Use the built-in TRANSFORM_TEX macro to compute the tiled and offset texture coordinates.
                o.uv = TRANSFORM_TEX(v.texcoord, _RampTex);

                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
            	fixed3 worldNormal = normalize(i.worldNormal);
                fixed3 worldLightDir = normalize(UnityWorldSpaceLightDir(i.worldPos));
                
                fixed3 ambient = UNITY_LIGHTMODEL_AMBIENT.xyz;
                
                fixed halfLambert = 0.5 * dot(worldNormal,worldLightDir) + 0.5;
                //_RampTex is effectively a one-dimensional texture (its color does not change along the vertical axis), so halfLambert is used for both the u and v coordinates
                fixed3 diffuseColor = tex2D(_RampTex,fixed2(halfLambert,halfLambert)).rgb * _Color.rgb;
                
                fixed3 diffuse = _LightColor0.rgb * diffuseColor;
                
                fixed3 viewDir = normalize(UnityWorldSpaceViewDir(i.worldPos));
                fixed3 halfDir = normalize(worldLightDir + viewDir);
                fixed3 specular = _LightColor0.rgb * _Specular.rgb * pow(max(0,dot(worldNormal,halfDir)),_Gloss);
                
                return fixed4(ambient + diffuse +specular,1.0);
            }
            ENDCG
        }
    }
    	Fallback "Specular"
}

You may not yet see how the gradient effect is achieved here. In fact, the key to the gradient texture is the texture-sampling step:

 fixed3 diffuseColor = tex2D(_RampTex,fixed2(halfLambert,halfLambert)).rgb * _Color.rgb;

In the code above, the half-Lambert model is used to map dot(N, L) into the [0, 1] range, giving halfLambert. halfLambert is then used as both components of the uv coordinate (the v coordinate is actually meaningless, since the color does not change along the vertical axis). The stronger the light, the larger halfLambert is and the closer the sampled horizontal coordinate is to 1, i.e. to the lit end of the ramp texture below, and vice versa. This produces the gradient lighting effect.
[figure omitted]
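The sampling logic above can be sketched in Python (a hypothetical 4-texel ramp standing in for _RampTex, nearest-texel lookup for simplicity):

```python
def half_lambert(n_dot_l):
    # Map dot(N, L) from [-1, 1] into [0, 1]
    return 0.5 * n_dot_l + 0.5

def sample_ramp(ramp, u):
    # Treat the ramp as a 1D texture with nearest-texel lookup
    return ramp[min(int(u * len(ramp)), len(ramp) - 1)]

# A made-up cool-to-warm ramp: dark tones on the left, warm tones on the right
ramp = ["dark blue", "cool gray", "warm tan", "bright yellow"]

print(sample_ramp(ramp, half_lambert(-1.0)))  # facing away from the light
print(sample_ramp(ramp, half_lambert(0.0)))   # grazing angle
print(sample_ramp(ramp, half_lambert(1.0)))   # facing the light
```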

7.4 Mask texture

What is a mask? Simply put, a mask lets us protect certain areas from certain modifications.
For example, in the previous implementations we applied specular reflection uniformly to the entire model surface; that is, every pixel used the same specular intensity and specular exponent. But sometimes we want some areas of the surface to reflect more strongly and others more weakly. To get a more refined effect, we can use a mask texture to control the lighting.
Another common application is blending multiple images when building terrain materials, such as grass, stone, and bare-earth textures; a mask texture controls how these textures are mixed.
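The terrain-splatting idea can be sketched per pixel in Python (made-up intensities and weights; a real engine samples these from textures):

```python
def blend(layers, weights):
    # Weighted per-pixel blend: each mask channel controls one layer's share
    total = sum(weights)
    return sum(v * w for v, w in zip(layers, weights)) / total

# One pixel's intensity from three layer textures: grass, rock, dirt
layers = (0.2, 0.6, 0.9)

print(blend(layers, (1.0, 0.0, 0.0)))  # mask says pure grass
print(blend(layers, (0.0, 0.5, 0.5)))  # half rock, half dirt
```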

7.4.1 Practice

Shader "Unlit/Chapter7-MaskTexture"
{
    Properties
    {
        _Color("Color Tint",Color) = (1,1,1,1)
        _MainTex("Main Tex", 2D) = "white"{}
        _BumpMap("Normal",2D) = "bump"{}
        _BumpScale("Bump Scale",Float) = 1.0
        //_SpecularMask is the specular mask texture
        _SpecularMask("Specular Mask",2D) = "white"{}
        //_SpecularScale is a coefficient controlling the mask's influence
        _SpecularScale("Specular Scale",Float) = 1.0
        _Specular("Specular",Color) = (1,1,1,1)
        _Gloss("Gloss",Range(8.0,256)) = 20
    }
    SubShader
    {
        
        

        Pass
        {
            Tags { "LightMode"="ForwardBase" }
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag


            #include "Lighting.cginc"

            fixed4 _Color;
            sampler2D _MainTex;
            float4 _MainTex_ST;
            sampler2D _BumpMap;
            float _BumpScale;
            sampler2D _SpecularMask;
            float _SpecularScale;
            fixed4 _Specular;
            float _Gloss;


            struct a2v
            {
                float4 vertex : POSITION;
                float3 normal : NORMAL;
                float4 tangent : TANGENT;
                float4 texcoord : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 pos : SV_POSITION;
                float3 lightDir : TEXCOORD1;
                float3 viewDir : TEXCOORD2;
            };



            v2f vert (a2v v)
            {
                v2f o;
                
                o.pos=UnityObjectToClipPos(v.vertex);
                
                o.uv.xy = v.texcoord.xy * _MainTex_ST.xy + _MainTex_ST.zw;

                TANGENT_SPACE_ROTATION;
                o.lightDir = mul(rotation, ObjSpaceLightDir(v.vertex)).xyz;
                o.viewDir = mul(rotation, ObjSpaceViewDir(v.vertex)).xyz;

                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                fixed3 tangentLightDir = normalize(i.lightDir);
                fixed3 tangentViewDir = normalize(i.viewDir);

                fixed3 tangentNormal = UnpackNormal(tex2D(_BumpMap, i.uv));
                tangentNormal.xy *= _BumpScale;
                tangentNormal.z = sqrt(1.0 - saturate(dot(tangentNormal.xy, tangentNormal.xy)));

                fixed3 albedo = tex2D(_MainTex, i.uv).rgb * _Color.rgb;

                fixed3 ambient = UNITY_LIGHTMODEL_AMBIENT.xyz * albedo;

                fixed3 diffuse = _LightColor0.rgb * albedo * max(0, dot(tangentNormal, tangentLightDir));

                fixed3 halfDir = normalize(tangentLightDir + tangentViewDir);

                //Sample the mask texture
                //The r, g, and b components of every texel in this example's mask texture are identical, representing the specular intensity at that point,
                //so we use the r component here to compute the mask value.
                fixed specularMask = tex2D(_SpecularMask, i.uv).r * _SpecularScale;
                //Multiply the mask value into the specular term computed as before
                fixed3 specular = _LightColor0.rgb * _Specular.rgb * pow(max(0, dot(tangentNormal, halfDir)), _Gloss) * specularMask;

                return fixed4(ambient + diffuse + specular,1.0);
            }
            ENDCG
        }
    }
            Fallback "Specular"
}

The effect is as follows:
[figure omitted]

Origin blog.csdn.net/weixin_46124783/article/details/114712417