The interactive water production process in Unity

Interactive water effects explained in detail

There are many shader techniques for making water, with a wide variety of production methods and tools, and the quality of the results varies a great deal. This article walks through a commonly used water production process, covering reflection, refraction, wave motion, and waterfront interaction effects.

The principle:

1. Superimpose normal texture animations with different directions and intensities to simulate flowing water.
2. Use the values unpacked from the normal texture to offset the UVs of the screen-capture texture (RenderTexture), simulating the refraction of water.
3. Use vertex displacement to simulate water waves.
4. Use the difference between the scene depth map and the water's own screen depth to obtain a mask where the water intersects other objects.
5. The ASE version uses the PBR Specular template, so Smoothness and Specular are used to simulate highlights and roughness. The code version uses a custom lighting model, computing the specular highlight with Blinn-Phong (a minimal sketch of this follows the list).
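The custom lighting code itself is not listed in this article. A minimal Blinn-Phong specular sketch in Cg/HLSL might look like the following, where i.worldPos, _SpecularColor and _Smoothness are assumed names for illustration rather than the original shader's properties, and worldBump is the blended world-space normal computed in step 1 below:

/*********** Blinn-Phong specular sketch (assumed names, not the original lighting code) ***********/
float3 lightDir = normalize(UnityWorldSpaceLightDir(i.worldPos)); // world-space light direction
float3 viewDir = normalize(UnityWorldSpaceViewDir(i.worldPos)); // world-space view direction
float3 halfDir = normalize(lightDir + viewDir); // half vector between light and view
float nh = max(0, dot(worldBump, halfDir)); // N·H term using the blended world-space normal
float3 specular = _LightColor0.rgb * _SpecularColor.rgb * pow(nh, _Smoothness * 256.0); // Blinn-Phong highlight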

Algorithm:

1. Superimposed normal texture animation
Here only two layers of normals are stacked; if you want to push the effect further, more layers can be stacked.
This node type is Texture Object, and the texture can be instanced.
The BlendNormals node mixes the two normal maps.
Formula: BlendNormal = normalize(float3(A.xy + B.xy, A.z * B.z)).
Inside the node, the normal vectors have already been transformed from tangent space into world space.
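For reference, this blend formula can be written as a small Cg/HLSL helper (an illustrative sketch, not taken from the original shader; Unity's built-in BlendNormals in UnityCG.cginc uses essentially the same math):

// Whiteout-style normal blend matching the formula above (illustrative helper)
float3 BlendNormalASE(float3 A, float3 B)
{
    return normalize(float3(A.xy + B.xy, A.z * B.z)); // add the XY offsets, multiply the Z components
}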

The code version also performs the tangent-space to world-space transformation, but it blends the normals by simply adding them and normalizing, which differs from ASE's BlendNormal algorithm; the result looks almost the same:

/************ Compute the normal texture UV offsets, unpack the normal texture to get the tangent-space normal, and finally transform the normal into world space ************/
float2 bumpUVOffset0 = i.uv.xy + _Time.y * _Speed0; // normal UV offset in one direction
float2 bumpUVOffset1 = i.uv.xy + _Time.y * _Speed1; // normal UV offset in the other direction
float3 bump0 = UnpackNormal(tex2D(_NormalMap, bumpUVOffset0)); // unpack the normal for one direction
float3 bump1 = UnpackNormal(tex2D(_NormalMap, bumpUVOffset1)); // unpack the normal for the other direction
float3 bump = normalize(bump0 + bump1); // add the two normals and normalize
bump.xy *= _BumpScale; // control the normal strength with a parameter
bump.z = sqrt(1 - dot(bump.xy, bump.xy)); // reconstruct the normal's Z value
float3 worldBump = normalize(float3(dot(i.TtoW0.xyz, bump), dot(i.TtoW1.xyz, bump), dot(i.TtoW2.xyz, bump))); // transform the normal from tangent space to world space

2. Refraction algorithm
Use the values unpacked from the normal texture to offset the UVs of the screen-capture texture (RenderTexture), simulating the refraction of water.
The blended vector fed in here is the normal obtained earlier, and the GetLocalVar screen value is the screen XY coordinate after the homogeneous division. The normal's X and Z components are used to offset the screen coordinates, the offset screen coordinates are then used to sample the screen-capture texture, and the final result is a screen texture distorted by the normal.
To get the refraction effect there is one more very important setting: the blend mode and the render queue. The render mode must be set to Opaque while the Render Queue is set to Transparent. These two settings seem to conflict, but in fact they do not: because this is a screen-capture effect and does not involve an actually semi-transparent water surface, the render mode is set to Opaque, while the Transparent queue ensures that all opaque objects have already been drawn to the screen when the water is rendered. A minimal sketch of these settings is shown below.
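For reference, in ShaderLab these settings would look roughly like the following (a minimal sketch; the exact tags of the original shader are not shown in the article):

// Opaque render type, but drawn in the Transparent queue so all opaque objects are already on screen
Tags { "Queue" = "Transparent" "RenderType" = "Opaque" }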
The corresponding code:

/*********************** Refraction algorithm ***********************/
float2 scrposOffset = worldBump.xz * _RefractionIndentity * i.scrPos.xy; // use the normal to influence the screen coordinates, producing a screen-coordinate offset
float2 finalScrpos = (i.scrPos.xy + scrposOffset) / i.scrPos.w; // apply the offset, then the homogeneous division, to get the final screen coordinates
float3 refrCol = tex2D(_RefrectionTex, finalScrpos).rgb; // sample the grab texture with the offset screen coordinates to get the simulated refraction

Remember to declare the GrabPass used for the screen capture; if the curly braces are left empty, the grabbed RenderTexture defaults to _GrabTexture.

GrabPass{"_RefrectionTex"} // screen-grab Pass

3. Waterfront interaction algorithm
This part of the algorithm has three main steps: first obtain the mask at the waterfront junction, then build a UV animation for the foam texture, and finally overlay the animated foam texture with the mask to get the final result.
(1) Waterfront interaction mask
As shown below, we can obtain a mask where the water intersects other models (if you are interested in the principle, see my detailed DepthFade article).
The code is shown below. The basic logic is to first linearize the depth map, obtaining camera-space depth; then use scrPos after the homogeneous division to get the water's depth in NDC. Since the two depths must be compared in the same coordinate system, the NDC depth is also converted to (linear) camera-space depth, and finally the two depths are subtracted to obtain the mask at the junction.

/***************************** Waterfront interaction detection algorithm ****************************/
float depthTexDepth = LinearEyeDepth(SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, UNITY_PROJ_COORD(i.scrPos))); // sample the depth map with the screen coordinates and decode it into linear (camera-space) depth
float screenDepth = i.scrPos.z / i.scrPos.w; // homogeneous division gives the screen depth value
screenDepth = (UNITY_NEAR_CLIP_VALUE >= 0) ? screenDepth : screenDepth * 0.5 + 0.5; // correct the screen depth value depending on the platform
screenDepth = LinearEyeDepth(screenDepth); // get the linear screen depth
float distanceDepth = abs(depthTexDepth - screenDepth) / _DepthDistance; // get the depth difference and control its scale with a parameter

Note: you must declare the depth texture before using it:

UNITY_DECLARE_DEPTH_TEXTURE(_CameraDepthTexture);
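The article only lists the code for step 1 (the mask). A minimal sketch of steps 2 and 3, the foam UV animation and its overlay with the mask, might look like the following; the names _FoamTex, _FoamSpeed and _FoamColor are assumptions for illustration, not the original shader's properties:

/*********** Foam UV animation and mask overlay (assumed sketch, not the original code) ***********/
float2 foamUV = i.uv.xy + _Time.y * _FoamSpeed; // scroll the foam UVs over time
float foam = tex2D(_FoamTex, foamUV).r; // sample the animated foam texture
float shoreMask = 1 - saturate(distanceDepth); // mask is strongest where the depth difference is small
float3 foamCol = _FoamColor.rgb * foam * shoreMask; // foam only shows up near the waterfront junction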

4. Simulating water waves with vertex displacement
The basic idea is to make the model's vertices cycle up and down along the Y axis, so that the surface as a whole looks like the graph of a sin function. So we need to introduce a time variable and the sin function, and use sin of the time to drive a cyclic vertex animation.
Code:

/************************************ Vertex offset animation for the water waves ************************************/
float vertOffset_y = (v.vertex.x + v.vertex.y + v.vertex.z) * _WaveFrency + _Time.y * _WaveSpeed; // compute the frequency and speed of the wave motion
vertOffset_y = sin(vertOffset_y) * _WavePower; // run the value above through sin to make it cycle
float3 vertOffset = float3(0, vertOffset_y, 0); // get the offset along the Y axis
o.pos = UnityObjectToClipPos(v.vertex + vertOffset); // transform the offset vertex into clip space

The interactive water effect shown at the beginning of this article is obtained by blending these effects together, but in fact many more effects can be achieved, for example underwater visibility and color layering, underwater caustics, randomly rising bubbles in the water, and so on. Feel free to leave a message in the comments area for further exchange.
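The final blending code is not shown in the article. A rough sketch of how the pieces computed above could be combined, assuming a _WaterColor property and reusing refrCol from the refraction step plus the specular and foamCol terms from the sketches above, might look like this:

/*********** Final color blend (assumed sketch, not the original code) ***********/
float3 waterCol = lerp(refrCol, _WaterColor.rgb, _WaterColor.a); // tint the refracted screen color with the water color
float3 finalCol = waterCol + specular + foamCol; // add the Blinn-Phong highlight and the shoreline foam
return float4(finalCol, 1.0); // opaque output; the see-through look comes from the grab pass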
