Getting Started with Unity Shader Part 1

This article is a short summary I wrote after working through a set of Shader tutorials by Manniu ( http://www.unitytrain.cn/course/96 ). Personally, I feel this series is not aimed at advanced Shader programming; it is more about teaching you how to fish than handing you one. Below I cover the material in three parts: a brief introduction to Shaders, the basics of graphics, and an introduction to Cg. It should serve both as an introduction to Shader-related topics and as my own summary.

1: Brief description of Shader

    a. Let's start with the difference between the GPU and the CPU. Simply put, the GPU is mainly responsible for display-related data processing, while the CPU is mainly responsible for running the operating system and applications. Why not just hand display-related data to the CPU? The explanation is attached below:

[Image: why display-related work goes to the GPU rather than the CPU]

     b. Shader classification. Shader, translated into Chinese as "着色器", refers to programs of the programmable graphics pipeline. Shaders are mainly divided into two kinds: vertex Shaders and fragment Shaders. One new concept here is the "graphics pipeline"; simply put, it is the sequence of processing stages a computer goes through to display graphics.

[Image: the graphics pipeline]

      c. Mainstream Shader programming languages. The mainstream Shader languages are HLSL, GLSL, and Cg. Briefly, the differences are: HLSL (High Level Shader Language) is Microsoft's DirectX-based language and runs only on the Windows platform. GLSL (OpenGL Shading Language) is the language used for shader programming in OpenGL (OpenGL is a professional graphics API that defines a cross-language, cross-platform interface specification), so GLSL is cross-platform. At this point a troublesome problem appears: the underlying graphics API constrains the shading language above it, so if we ever want to switch graphics libraries, we have to rewrite all of our Shader files. Cg emerged to solve this: it adds a further layer of encapsulation over HLSL and GLSL, shielding the shader code from its dependence on the underlying graphics library.

      d. Unity Shader. ShaderLab is Unity's wrapper around Shader syntax. It supports three kinds of Shader: Surface Shaders, Vertex and Fragment Shaders, and Fixed Function Shaders. A Fixed Function Shader is a relatively "conservative" Shader (with the best compatibility). Vertex and Fragment Shaders can be written in HLSL, GLSL, or Cg. A Surface Shader is a syntactic wrapper over Vertex and Fragment Shaders and is ultimately compiled down into them. (For more detail, please refer to the official documentation: http://docs.unity3d.com/Manual/index.html )

2: Basics of Graphics

     Personally, I find this section quite valuable for people who have never been exposed to graphics: it cleared up my previous blind spots about coordinate conversion and the rendering process in Unity. The content is split into two subsections below.

    a. 3D math foundations. 3D math is really just matrix-related operations, which should be no problem for anyone who has studied linear algebra. I will briefly introduce it here.

    1. Coordinate systems and vectors. 3D coordinate systems come in left-handed and right-handed variants. You can refer to ( http://www.cnblogs.com/mythou/p/3327046.html ). The schematic diagram is as follows:

[Image: left-handed vs. right-handed coordinate systems]

    2. I won't go into detail about vectors; the most important operations are the dot product and the cross product. Here is a reference article ( http://blog.csdn.net/augusdi/article/details/20037851 ).
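To make the two operations concrete, here is a minimal Python sketch (plain lists, no libraries; the function names are illustrative, not from any API). The dot product gives a scalar related to the angle between two vectors, and the cross product gives a vector perpendicular to both:

```python
import math

def dot(a, b):
    # Dot product: |a||b|cos(theta); its sign tells whether the vectors point the same way
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    # Cross product: a vector perpendicular to both a and b (right-hand rule)
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def angle_between(a, b):
    # Recover the angle (in degrees) from the dot product
    la = math.sqrt(dot(a, a))
    lb = math.sqrt(dot(b, b))
    return math.degrees(math.acos(dot(a, b) / (la * lb)))

print(dot([1, 0, 0], [0, 1, 0]))            # 0 -> perpendicular
print(cross([1, 0, 0], [0, 1, 0]))          # [0, 0, 1]
print(angle_between([1, 0, 0], [1, 1, 0]))  # ~45 degrees
```

Both operations come back in the lighting section: the cross product builds normals, the dot product measures angles.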

    3. Matrices. In 3D math, a matrix usually represents a transformation; matrices are the mathematical foundation of coordinate-system conversion. Everyone has probably heard of Unity's "MVP matrix"; it is simply a chain of matrix operations that converts between coordinate systems. Unity works with four coordinate systems: the model (object) coordinate system, the world coordinate system, the camera coordinate system, and the screen coordinate system. Displaying a 3D image is the process of walking through them: the _ObjectToWorld matrix (the M matrix) converts the model's own coordinates into the world coordinate system, the _WorldToCamera matrix (the V matrix) converts from the world coordinate system into the camera coordinate system, and finally the _Projection matrix (the P matrix) converts from the camera coordinate system to the screen coordinate system, producing the 3D image on screen. Attached is an article from Baidu Wenku ( http://wenku.baidu.com/link?url=A3AGV805UK5rcsEjkaL1h6QjnxsktvCscyNJqaHvfe2cIhwXMam6ZzH4Gxbu_XB7Jd7ripxjd0eR51Q6cPt9xPxTiX3MeHtFaWkwexBlZti ).
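The M -> V -> P chain can be sketched in a few lines of Python. This is a toy model, not Unity code: both "matrices" here are simple translations, and the projection is left as the identity purely to keep the sketch short (a real P matrix is a perspective projection):

```python
def mat_mul(A, B):
    # 4x4 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transform(M, v):
    # Apply a 4x4 matrix to a homogeneous point [x, y, z, 1]
    return [sum(M[i][k] * v[k] for k in range(4)) for i in range(4)]

def translation(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

# M: the object sits at (5, 0, 0) in the world
M = translation(5, 0, 0)
# V: the camera sits at (0, 0, -10); world -> camera is the inverse camera transform
V = translation(0, 0, 10)
# P: identity stand-in for the projection matrix
P = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

# Note the order: the model-space point is hit by M first, then V, then P
MVP = mat_mul(P, mat_mul(V, M))
print(transform(MVP, [0, 0, 0, 1]))  # model origin -> [5, 0, 10, 1] in camera space
```

The key takeaway is the multiplication order: MVP = P * V * M, so the matrix nearest the vertex is applied first.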

Several important matrix operations include: the determinant, the transpose, the four arithmetic operations, and matrix inversion. We won't cover those here. Below is a brief introduction to several common matrix transformations.

Rotation matrices about the coordinate axes:

[Image: rotation matrices about the coordinate axes]

Scaling matrix:

[Image: scaling matrix]

Projection matrix:

[Image: projection matrix]

Translation matrix:

[Image: translation matrix]

The above are several commonly used matrices. For more, you'll have to rely on Baidu and Google.
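As a sanity check on the transforms above, here is a small Python sketch that builds a rotation matrix about the Y axis (using the same row layout as the Cg demo later in this article), a scaling matrix, and a translation matrix, and applies each to a point on the X axis:

```python
import math

def rot_y(angle):
    # Rotation about the Y axis; same layout as the demo shader's RM matrix
    c, s = math.cos(angle), math.sin(angle)
    return [[ c, 0, s, 0],
            [ 0, 1, 0, 0],
            [-s, 0, c, 0],
            [ 0, 0, 0, 1]]

def scale(sx, sy, sz):
    return [[sx, 0, 0, 0], [0, sy, 0, 0], [0, 0, sz, 0], [0, 0, 0, 1]]

def translate(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

def apply(M, v):
    # Matrix * homogeneous point, rounded to kill floating-point dust
    return [round(sum(M[i][k] * v[k] for k in range(4)), 6) for i in range(4)]

p = [1, 0, 0, 1]                     # a point on the X axis
print(apply(rot_y(math.pi / 2), p))  # quarter turn about Y -> [0.0, 0.0, -1.0, 1.0]
print(apply(scale(2, 2, 2), p))      # uniform scale -> [2, 0, 0, 1]
print(apply(translate(0, 3, 0), p))  # move up by 3 -> [1, 3, 0, 1]
```

Note that the translation lives in the fourth column, which is exactly why homogeneous 4-component points (w = 1) are used: a 3x3 matrix alone cannot express translation.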

     b. The following introduces several simple graphics applications: backface culling, diffuse reflection, and specular highlights.

     1. Backface culling. First we need the concept of a normal: a vector perpendicular to a given surface. The line of sight E is the vector from the object to the camera. If the angle between the normal N and the line of sight E is less than 90 degrees, the viewer is roughly in front of the surface; conversely, if it is greater than 90 degrees, the viewer is behind the surface and cannot see it, so the surface needs to be culled. (The normal can be obtained with the vector cross product, and the angle can be computed with the vector dot product.) The following two pictures briefly illustrate this:

[Images: front-facing vs. back-facing surfaces relative to the camera]
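The culling rule can be sketched in a few lines of Python: build the triangle's normal with a cross product of two edges, then test the sign of its dot product with the vector toward the camera. (The function names are illustrative, not Unity API.)

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def is_front_facing(v0, v1, v2, eye):
    # Normal from the cross product of two triangle edges
    n = cross([v1[i] - v0[i] for i in range(3)],
              [v2[i] - v0[i] for i in range(3)])
    # View vector E from the surface toward the camera
    e = [eye[i] - v0[i] for i in range(3)]
    # dot > 0 -> angle between N and E is under 90 degrees -> visible; otherwise cull
    return dot(n, e) > 0

# Counter-clockwise triangle in the XY plane; its normal points along +Z
tri = ([0, 0, 0], [1, 0, 0], [0, 1, 0])
print(is_front_facing(*tri, eye=[0, 0, 5]))    # True : camera in front of the face
print(is_front_facing(*tri, eye=[0, 0, -5]))   # False: camera behind -> culled
```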

     2. Diffuse reflection (Diffuse: light cast onto the surface of a geometric body is reflected in all directions). It can be understood simply as the effect of lighting on a surface's color (Unity's default Shader is in fact the combined effect of diffuse reflection plus ambient light). So how do we compute how strongly the light affects an object's color? Again we use the normal: the dot product of the normal and the light vector (both must be normalized first) serves as the factor scaling the color in that region; multiplying this factor by the light source's color gives the lit color. A simple diagram:

[Image: diffuse lighting calculation]
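The dot-product factor described above is the classic Lambert term. A minimal Python sketch (illustrative names, not Unity API; the clamp to zero handles surfaces facing away from the light):

```python
import math

def normalize(v):
    l = math.sqrt(sum(x * x for x in v))
    return [x / l for x in v]

def diffuse(normal, light_dir, light_color):
    # Lambert term: max(0, N . L) scales the light color
    n = normalize(normal)
    l = normalize(light_dir)
    ndotl = max(0.0, sum(a * b for a, b in zip(n, l)))
    return [c * ndotl for c in light_color]

white = [1.0, 1.0, 1.0]
print(diffuse([0, 1, 0], [0, 1, 0], white))   # light head-on -> full brightness
print(diffuse([0, 1, 0], [1, 1, 0], white))   # light at 45 degrees -> ~0.707
print(diffuse([0, 1, 0], [0, -1, 0], white))  # light from behind -> black
```

Normalizing both vectors first matters: otherwise the dot product would scale with the vectors' lengths instead of only the angle between them.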

    3. Specular highlights (Specular: when light from a source hits an object and reflects into the viewer's eye, the brightest spot on the object is the highlight). From the definition, a highlight is formed by the interaction of the reflected light with the viewing direction, and we compute it from exactly that principle: derive the reflected ray from the incident ray, take the dot product of the reflection vector and the view vector as the influence factor, and from that compute the highlight intensity. A simple diagram:

[Image: specular highlight calculation]
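The reflect-then-dot recipe above is the classic Phong specular model. A Python sketch of it follows; note that the shininess exponent is my addition (a standard Phong ingredient that tightens the highlight), and all names are illustrative:

```python
import math

def normalize(v):
    l = math.sqrt(sum(x * x for x in v))
    return [x / l for x in v]

def reflect(incident, normal):
    # R = I - 2 (N . I) N, with I pointing from the light toward the surface
    d = sum(a * b for a, b in zip(incident, normal))
    return [i - 2 * d * n for i, n in zip(incident, normal)]

def specular(light_dir, normal, view_dir, shininess=32):
    # Phong highlight: max(0, R . V) ^ shininess
    n = normalize(normal)
    r = reflect([-x for x in normalize(light_dir)], n)  # incident ray = -L
    rdotv = max(0.0, sum(a * b for a, b in zip(r, normalize(view_dir))))
    return rdotv ** shininess

# Light straight along the normal and the viewer on the mirror direction:
# the reflection points right at the eye, so the highlight is at full strength
print(specular([0, 1, 0], [0, 1, 0], [0, 1, 0]))  # 1.0
print(specular([0, 1, 0], [0, 1, 0], [1, 1, 0]))  # viewer 45 degrees off -> tiny value
```

The exponent is why highlights look like small bright dots: raising R . V to a high power makes the intensity fall off sharply as the view direction leaves the mirror direction.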

The above are three fairly common lighting-related topics; for more material you'll have to turn to Baidu…

     3: The last part briefly introduces the syntax basics of Unity Shader together with a Demo; for anything more specific, refer to the official Unity documentation.

     a. ShaderLab syntax basics. Unity supports all three kinds of Shader mentioned above; what is introduced here is a Vertex and Fragment Shader written with Cg syntax. Let's start by pasting Unity's default image-effect Shader:

 

// Shader files are organized into a tree in the shader selection panel by this path
Shader "Hidden/NewImageEffectShader"
{
    // Declares the variables (properties) the program needs
    Properties
    {
        // _MainTex : variable name; "Texture" : the label shown in the Inspector panel;
        // 2D : the variable type; "white" : the default value
        _MainTex ("Texture", 2D) = "white" {}
    }
    // A Shader program must contain at least one SubShader; at render time the system
    // tries them in order until it finds one that matches the hardware, otherwise it
    // falls back to the Shader specified at the end
    SubShader
    {
        // Cull Off : disable face culling; ZWrite Off : don't write pixel depth to the
        // depth buffer; ZTest Always : always pass the depth test
        Cull Off ZWrite Off ZTest Always
        // A render pass; fixed syntax
        Pass
        {
            // Start of the Cg code block
            CGPROGRAM
            // Specify the vertex Shader entry point
            #pragma vertex vert
            // Specify the fragment program entry point
            #pragma fragment frag
            // Include some of Unity's built-in definitions
            #include "UnityCG.cginc"
            // Custom struct
            struct appdata
            {
                // float4 : a 4-component vector; POSITION : a semantic that tells the
                // rendering engine what this variable represents
                float4 vertex : POSITION;
                // TEXCOORD0 : texture-coordinate semantic
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };
            // Vertex Shader entry point; the engine fills the appdata parameter according
            // to its semantics, here the vertex position and texture coordinates
            v2f vert (appdata v)
            {
                v2f o;
                // The incoming vertex position is in model-space coordinates and must be
                // transformed by the MVP matrix into clip space for the screen
                o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
                o.uv = v.uv;
                // Return the result to the rendering engine, which processes each field
                // according to its semantic
                return o;
            }
            // Variables defined in Properties must be declared again here before use
            sampler2D _MainTex;

            // Fragment Shader entry point
            fixed4 frag (v2f i) : SV_Target
            {
                fixed4 col = tex2D(_MainTex, i.uv);
                // just invert the colors
                col = 1 - col;
                return col;
            }
            ENDCG
        }
    }
    // Used when none of the SubShaders above matches the hardware environment
    Fallback "Mobile/VertexLit"
}

The above is a brief walkthrough of using Cg syntax in a Unity Vertex and Fragment Shader. Next, a Demo.

     b. Shader Demo. Here is a simple Demo. The entire scene is a single Plane with no textures at all; the effect is achieved purely by using the Shader to change the vertex positions and colors. Below is a screenshot of the Demo:

[Image: Demo screenshot]

Here is the Shader's source code:

Shader "Cus/Demo_3"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        // No culling or depth
        Cull Off ZWrite Off ZTest Always

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "UnityCG.cginc"
            // Custom struct carrying position and color semantics
            struct v2f
            {
                float4 pos : POSITION;
                float4 col : COLOR;
            };
            // Vertex Shader entry point; the color is computed here as well
            v2f vert (appdata_base v)
            {
                v2f o;
                // Compute the rotation angle; _SinTime.w (a Unity built-in variable)
                // makes the angle oscillate periodically
                float angle = length(v.vertex) * _SinTime.w;
                // Rotation matrix about the Y axis
                float4x4 RM = {
                    float4(cos(angle), 0, sin(angle), 0),
                    float4(0, 1, 0, 0),
                    float4(-1 * sin(angle), 0, cos(angle), 0),
                    float4(0, 0, 0, 1)
                };
                // Apply RM to displace the vertex position
                float4 pos = mul(RM, v.vertex);
                // Transform the vertex through the MVP matrix into clip space
                o.pos = mul(UNITY_MATRIX_MVP, pos);

                // The color is driven by the vertex's distance from the center
                angle = abs(sin(length(v.vertex)));
                o.col = float4(angle, 1, 0, 1);
                return o;
            }
            // The fragment program simply returns the color computed in the vertex Shader
            float4 frag (v2f v) : COLOR
            {
                return v.col;
            }
            ENDCG
        }
    }
}

OK, that's it for this post. Attached below is a Demo written in C# that simulates the 3D rendering process, together with the Demo of the Shader above (link: http://pan.baidu.com/s/1c0Yk3KG password: 1j1k).


Origin blog.csdn.net/qq_25189313/article/details/78075527