[Project Record] BRDF Implementation in Unity

Background of the project

  This project was done near the end of last year's (2022) summer vacation, right before school started, which was also when I officially began learning graphics and found out what a BRDF is. Since the project was on a tight schedule, from the perspective of simply meeting the requirements, writing the shaders and implementing the BRDF models was fairly straightforward. But with a shaky grasp of the fundamentals, some problems did show up: for example, the Fresnel term initially used the wrong angle, and because the Fresnel effect is not obvious in this project, it went unnoticed for a very long time...

  The original project had quite a few development requirements unrelated to BRDF. This article mainly records the implementation of the BRDF part, which is mostly basic content. If time allows, there should be a follow-up article dedicated to the principles of PBR materials, which is also what I went on to study later last year.

A brief introduction to BRDF

(Figure: comparison of four BRDF models implemented in the project: Blinn-Phong, Cook-Torrance, GGX, Ward)

  The rendered color of an object is determined by its own material properties and by the lighting. The rendering equation describes how to shade a point: for a point $p$ viewed from direction $\omega_o$, it is defined as:
$$L_o(p, \omega_o) = L_e(p, \omega_o) + \int_{\Omega} f_r(p, \omega_i, \omega_o)\, L_i(p, \omega_i)\, (n \cdot \omega_i)\, \mathrm{d}\omega_i$$

The color of the point is thus split into two parts: self-emission $L_e$, and the light $L_i$ from all other directions that gets redistributed into direction $\omega_o$. The $f_r$ in the formula is a BxDF, a function describing how much of the energy point $p$ receives from direction $\omega_i$ is redistributed into direction $\omega_o$; the family includes:

  • BRDF, the bidirectional reflectance distribution function; the name is abstract but quite descriptive, and it is the topic of this article
  • BTDF, the bidirectional transmittance distribution function
  • BSDF, the bidirectional scattering distribution function, i.e. BRDF + BTDF

  Ignoring self-emission and indirect lighting, the light from other directions is just the light coming from the light sources (simple light models such as directional lights and point lights). The final result is therefore the sum of the lighting contribution of every light source that affects the point (usually an ambient term is added to make up for the missing indirect lighting, so that unlit areas are not completely black), as sketched below.
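  Concretely, the shading boils down to a loop like the following minimal sketch. Everything here (the uniform names, ShadePoint, the plain Lambert stand-in for the BRDF) is an illustrative placeholder, not Unity API or the project's actual code:

	// Hypothetical per-light data, just for this sketch.
	#define MAX_LIGHTS 4
	float3 _AmbientColor;
	int    _LightCount;
	float3 _LightDirs[MAX_LIGHTS];    // unit vectors pointing toward each light
	float3 _LightColors[MAX_LIGHTS];

	float3 ShadePoint(float3 N, float3 V, float3 albedo)
	{
	    // ambient term stands in for the missing indirect lighting
	    float3 result = _AmbientColor;
	    for (int k = 0; k < _LightCount; ++k)
	    {
	        float3 L = _LightDirs[k];
	        // f_r would be one of the BRDF models below; plain Lambert as a placeholder
	        float3 fr = albedo / 3.14159265;
	        result += fr * _LightColors[k] * max(0, dot(N, L));   // f_r * L_i * (n . w_i)
	    }
	    return result;
	}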

BRDF implementation

  Create an Unlit shader in Unity; the basic code framework is already there, so we only need to add the required parameters and implement the corresponding BRDF model in the pixel shader according to the formula (be careful not to implement it in the vertex shader: besides the performance difference, which depends on the vertex count of the scene, linearly interpolating a lighting result computed per vertex is simply not accurate). The original project implemented five BRDF models: Blinn-Phong, Neumann-Phong, Cook-Torrance, GGX, and Ward. Here we only cover the implementation of the GGX model, which has seen the widest adoption in recent years. GGX is split into a diffuse term and a specular term, with the following formula:
$$f_r = f_{diffuse} + f_{specular} = \frac{k_{diff}}{\pi} + k_{spec}\,\frac{F(\mathbf{i},\mathbf{h})\,G(\mathbf{i},\mathbf{o},\mathbf{h})\,D(\mathbf{h})}{4\,\left|\mathbf{i}\cdot\mathbf{n}\right|\,\left|\mathbf{o}\cdot\mathbf{n}\right|}$$

where $\mathbf{n}$ is the normal, $\mathbf{i}$ is the incident direction (when only direct lighting is considered, it can simply be understood as the light direction $\mathbf{L}$), and $\mathbf{o}$ is the outgoing direction (likewise, the view direction $\mathbf{V}$); all of them are unit vectors pointing outward from the surface. $\mathbf{h}$ is the half vector of $\mathbf{i}$ and $\mathbf{o}$, i.e. the normalized $(\mathbf{i}+\mathbf{o})$.

   $F$ is the Fresnel term, and its Schlick approximation is usually used. Here $\mathbf{m}$ can be regarded as the same thing as $\mathbf{h}$; the specific reason will be discussed in the follow-up article on PBR materials:
$$F_{Schlick}(\mathbf{i},\mathbf{m}) = F_0 + \left(1 - F_0\right)\left(1 - (\mathbf{i}\cdot\mathbf{m})\right)^5$$
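As a standalone sketch, the Schlick approximation translates directly into a couple of lines of HLSL (this mirrors the F computed inline in the full shader further below; the function name is just for illustration):

	// Schlick approximation of the Fresnel term; F0 is the reflectance at normal incidence.
	float FresnelSchlick(float F0, float IdotM)
	{
	    return F0 + (1.0 - F0) * pow(1.0 - IdotM, 5.0);
	}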

The normal distribution function $D$ and the shadowing-masking function $G$ are defined as follows:
$$D(\mathbf{m}) = \frac{\alpha^2}{\pi\,\cos^4\theta\left(\alpha^2 + \tan^2\theta\right)^2},\quad \cos\theta = (\mathbf{n}\cdot\mathbf{m})$$

$$G(\mathbf{i},\mathbf{o},\mathbf{m}) = \frac{4}{\left(1+\sqrt{1+\alpha^2\tan^2\theta_1}\right)\left(1+\sqrt{1+\alpha^2\tan^2\theta_2}\right)},\quad \cos\theta_1 = (\mathbf{i}\cdot\mathbf{n}),\ \cos\theta_2 = (\mathbf{o}\cdot\mathbf{n})$$
Here we can also see that there are four parameters, $k_{diff}$, $k_{spec}$, $F_0$ and $\alpha$, that control the appearance of the material: $k_{diff}$ and $k_{spec}$ control the intensity of the diffuse and specular reflection, $F_0$ controls the Fresnel effect (with a clear distinction between metals and non-metals), and $\alpha$ describes the roughness of the surface.
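For the implementation it is convenient to express everything through dot products. Since $\cos\theta = (\mathbf{n}\cdot\mathbf{m})$, a quick algebraic rewrite, using $\cos^4\theta\left(\alpha^2+\tan^2\theta\right)^2 = \left(\cos^2\theta\,(\alpha^2-1)+1\right)^2$ and $\tan^2\theta = 1/\cos^2\theta - 1$, gives the forms actually used in the shader below:
$$D(\mathbf{h}) = \frac{\alpha^2}{\pi\left((\mathbf{n}\cdot\mathbf{h})^2(\alpha^2-1)+1\right)^2},\qquad \tan^2\theta_1 = \frac{1}{(\mathbf{n}\cdot\mathbf{i})^2} - 1,\quad \tan^2\theta_2 = \frac{1}{(\mathbf{n}\cdot\mathbf{o})^2} - 1$$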

  The implementation code is attached below with redundant content removed, so it should be fairly intuitive and easy to follow. Note that only the illumination of the main light source is considered here and $F_0$ is a single channel; if you need to cast and receive shadows, additional code is required (another project from a while ago also implemented the GGX model in the URP pipeline, and the corresponding implementation will be attached in that article):

Shader "BRDF/GGX"
{
    
    
    Properties
    {
    
    
        _Kdiff ("Kdiff", float) = 0.1
		_Kspec ("Kspec", float) = 0.4
		_Alpha ("Alpha", float) = 0.05
		_F0 ("F0", float) = 0.8
    }
    SubShader
    {
    
    
        Tags {
    
     "RenderType"="Opaque" }
        LOD 100

        Pass
        {
    
    
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "Lighting.cginc"

			#define PI 3.1415926535

            struct appdata
            {
    
    
                float4 vertex : POSITION;
                float3 normal : NORMAL;
            };

            struct v2f
            {
    
    
                float4 vertex : SV_POSITION;
				float3 posW : TEXCOORD0;
				float3 normW : TEXCOORD1;
            };

            float _Kdiff;
			float _Kspec;
			float _Alpha;
			float _F0;

            v2f vert (appdata v)
            {
    
    
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
				o.posW = mul(unity_ObjectToWorld, v.vertex);
				o.normW = mul(v.normal, (float3x3)unity_WorldToObject);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
    
    
				float3 V = normalize(_WorldSpaceCameraPos - i.posW);
				float3 L = normalize(_WorldSpaceLightPos0.xyz);
				float3 H = normalize(L + V);
				float3 N = normalize(i.normW);

				float NdotH = dot(N, H);
				float NdotL = dot(N, L);
				float NdotV = dot(N, V);
				float HdotV = dot(H, V);

				float alpha2 = _Alpha * _Alpha;
				float tanNL2 = 1 / (NdotL * NdotL + 1e-8) - 1;
				float tanNV2 = 1 / (NdotV * NdotV + 1e-8) - 1;

				float F = _F0 + (1 - _F0) * pow(1 - HdotV, 5);
				float D = alpha2 / (PI * pow(NdotH * NdotH * (alpha2 - 1) + 1, 2));
				float G = 4 / ((1 + sqrt(1 + alpha2 * tanNL2)) * (1 + sqrt(1 + alpha2 * tanNV2)));
				float fr = _Kdiff / PI +
						   _Kspec * F * D * G / (4 * NdotL * NdotV);

                return fixed4(max(0, fr) * _LightColor0.rgb * max(0, NdotL), 1.0f);
            }
            ENDCG
        }
    }
}

Color Space

  If you want relatively correct and realistic results, Unity's Color Space (Project Settings -> Player -> Other Settings) should be set to Linear; this is related to the nonlinear brightness response of displays.

  If it is set to Gamma, Unity neither removes the gamma encoding of sRGB textures nor applies gamma correction to the final output, which introduces some deviation in the result but reduces overhead. That is why Unity used to pick Gamma space by default (it seems newer versions default to Linear space?). After all, a lot of the time the graphics look fine either way, and when the pipeline gets heavy, what it ultimately comes down to is performance.
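  For reference, the conversions involved look roughly like this. This is the simplified gamma-2.2 approximation (the exact sRGB transfer function is piecewise); in Linear space Unity performs these conversions automatically, so the helpers below are purely illustrative:

	// Approximate sRGB <-> linear conversions (gamma 2.2), for illustration only.
	float3 GammaToLinear(float3 srgbColor)   { return pow(srgbColor, 2.2); }
	float3 LinearToGamma(float3 linearColor) { return pow(linearColor, 1.0 / 2.2); }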

HDR and Tone mapping

  Because the project needs to examine highlight behavior, HDR is turned on and tone mapping is added to preserve more highlight detail.

  The counterpart of HDR (High Dynamic Range) is LDR (Low Dynamic Range). Simply put, LDR clamps color values greater than 1 to 1, while HDR colors can exceed 1 (IBL, for example, relies on HDR). However, the display can still only show values from 0 to 1, so in order to show the part above 1, tone mapping is needed to map high brightness down into the displayable range while retaining more detail.

  The implementation simply uses ACES tone mapping code found online:

	// ACES filmic tone mapping curve (Krzysztof Narkowicz's approximate fit)
	float3 ACES(float3 color)
	{
		const float a = 2.51f;
		const float b = 0.03f;
		const float c = 2.43f;
		const float d = 0.59f;
		const float e = 0.14f;

		// clamp to [0, 1], since the fit can slightly exceed 1 for very bright input
		return saturate((color * (a * color + b)) / (color * (c * color + d) + e));
	}

(Theoretically this belongs in the post-processing stage, and the exposure could be adjusted manually or dynamically based on the rendered image, but the project does not need that, so it is written directly in the BRDF shader.)
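In this project it is simply applied to the lighting result right before the fragment shader returns. A minimal sketch, assuming the HDR lighting result is in a variable named color and using a hypothetical _Exposure property for manual exposure adjustment:

	// At the end of frag(): tone-map the HDR result before output.
	// _Exposure is a hypothetical manual exposure multiplier (1.0 = unchanged).
	float3 mapped = ACES(color * _Exposure);
	return fixed4(mapped, 1.0);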


Source: blog.csdn.net/qq_43459138/article/details/129238890