Achieving an object outline (stroke) effect in three.js using convolution

The normal extension method

There are already plenty of articles online that use the normal extension method to achieve an object outline effect, so it will not be described here.

However, this method has a drawback: when the angle between the normals of two adjacent faces is large, the outlines of the two faces cannot be joined seamlessly.

The convolution method

This article uses a different method, convolution, to achieve the object outline effect; convolution is more commonly used in machine learning.

The specific approach in three.js is as follows:

  1. Create a shader material, hide the objects that do not need an outline, and render the objects that do need an outline in white and everything else in black.
  2. Use another shader to perform a convolution on that result: the inside of the object stays white, the outside stays black, and the border becomes gray.
  3. Set the material to transparent so it does not cover the scene, superimpose the border on the original image, and optionally apply FXAA anti-aliasing.

That is all three steps take; it is quite simple. The implementation is described in detail below, but if you prefer you can go straight to the complete code:

Complete code:  https://gitee.com/tengge1/ShadowEditor/blob/master/ShadowEditor.Web/src/helper/SelectHelper.js

Detailed implementation process:

1. Render the scene normally with three.js. This step needs no further explanation.

2. Create a shader material and hide all objects that do not need an outline. Draw the objects that do need an outline in white and everything else in black.

After hiding the objects that do not need an outline, override the material of the entire scene:

renderScene.overrideMaterial = this.maskMaterial;

Shader Material:

const maskMaterial = new THREE.ShaderMaterial({
    vertexShader: MaskVertex,
    fragmentShader: MaskFragment,
    depthTest: false
});
MaskVertex:
void main() {
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
MaskFragment:
void main() {
    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0); // the selected objects are drawn pure white
}
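
The snippets above do not show how this mask pass is actually rendered into a texture. Below is a minimal sketch of one way to drive it; the names maskBuffer and selectedObjects, and the hide/restore logic, are assumptions for illustration and may differ from what SelectHelper.js does:

// Render target that will hold the black/white mask.
const maskBuffer = new THREE.WebGLRenderTarget(width, height);

function renderMaskPass(renderer, scene, camera, selectedObjects) {
    // Temporarily hide every mesh that should not get an outline.
    const hidden = [];
    scene.traverse(obj => {
        if (obj.isMesh && obj.visible && selectedObjects.indexOf(obj) === -1) {
            obj.visible = false;
            hidden.push(obj);
        }
    });

    // Draw the remaining (selected) objects in white on a black background.
    // Assumes the renderer's clear color is the default black.
    const oldBackground = scene.background;
    const oldOverride = scene.overrideMaterial;
    scene.background = null;
    scene.overrideMaterial = maskMaterial;

    renderer.setRenderTarget(maskBuffer);
    renderer.render(scene, camera);
    renderer.setRenderTarget(null);

    // Restore the scene.
    scene.background = oldBackground;
    scene.overrideMaterial = oldOverride;
    hidden.forEach(obj => { obj.visible = true; });
}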

The result of this pass: the objects that need an outline are drawn in solid white, and everything else is black.

3. Create a shader material that performs the convolution: for each pixel it samples four surrounding pixels of the mask. Inside the outlined object the result is white, outside it is black, and at the object's edge it becomes gray. That gray region is exactly the border we need.

const edgeMaterial = new THREE.ShaderMaterial({
    vertexShader: EdgeVertex,
    fragmentShader: EdgeFragment,
    uniforms: {
        maskTexture: {
            value: this.maskBuffer.texture // black/white mask rendered in step 2
        },
        texSize: {
            value: new THREE.Vector2(width, height)
        },
        color: {
            value: selectedColor // border color
        },
        thickness: {
            value: 4 // border thickness
        }
    },
    transparent: true, // a material option, not a uniform
    depthTest: false
});
In the convolution, texSize is the width and height of the canvas; to make the border smoother it can be set to twice the original canvas size. color is the border color and thickness is the border thickness.
Note that the material's transparent property must be set to true.
EdgeVertex:
varying vec2 vUv;

void main() {
    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
EdgeFragment:
uniform sampler2D maskTexture;
uniform vec2 texSize;
uniform vec3 color;
uniform float thickness;

varying vec2 vUv;

void main() {
    // Offset of one border-thickness step, in UV space.
    vec2 invSize = thickness / texSize;
    vec4 uvOffset = vec4(1.0, 0.0, 0.0, 1.0) * vec4(invSize, invSize);

    // Sample the mask to the left/right and above/below the current pixel.
    vec4 c1 = texture2D( maskTexture, vUv + uvOffset.xy);
    vec4 c2 = texture2D( maskTexture, vUv - uvOffset.xy);
    vec4 c3 = texture2D( maskTexture, vUv + uvOffset.yw);
    vec4 c4 = texture2D( maskTexture, vUv - uvOffset.yw);

    // Horizontal and vertical differences: non-zero only near the black/white edge.
    float diff1 = (c1.r - c2.r)*0.5;
    float diff2 = (c3.r - c4.r)*0.5;

    float d = length(vec2(diff1, diff2));
    // Edge pixels get the border color, everything else is fully transparent.
    gl_FragColor = d > 0.0 ? vec4(color, 1.0) : vec4(0.0, 0.0, 0.0, 0.0);
}
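
The convolution runs as a full-screen post-processing pass that writes into a second render target. A rough sketch of that pass follows; the names edgeBuffer, fullScreenQuad, quadScene and quadCamera are assumptions for illustration, not necessarily the article's exact structure:

// Render target that will hold the extracted border.
const edgeBuffer = new THREE.WebGLRenderTarget(width, height);

// A screen-aligned quad and an orthographic camera, shared by the post-processing passes.
const quadCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
const quadScene = new THREE.Scene();
const fullScreenQuad = new THREE.Mesh(new THREE.PlaneGeometry(2, 2), edgeMaterial);
quadScene.add(fullScreenQuad);

function renderEdgePass(renderer) {
    fullScreenQuad.material = edgeMaterial; // convolution shader from this step
    renderer.setRenderTarget(edgeBuffer);
    renderer.render(quadScene, quadCamera);
    renderer.setRenderTarget(null);
}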

The result of this pass: a border in the chosen color, with every other pixel fully transparent.

4. Create a shader material that superimposes the border on the original image. FXAA is more involved, so a simple overlay is used here instead.

Shader Material:

const copyMaterial = new THREE.ShaderMaterial({
    vertexShader: CopyVertexShader,
    fragmentShader: CopyFragmentShader,
    uniforms: {
        tDiffuse: {
            value: edgeBuffer.texture // border texture produced in step 3
        },
        opacity: {
            value: 1.0 // the fragment shader multiplies by this
        },
        resolution: {
            value: new THREE.Vector2(1 / width, 1 / height) // only needed if FXAA is used; unused by the simple copy shader
        }
    },
    transparent: true,
    depthTest: false
});

Note that transparent must be set to true here as well, otherwise the overlay will completely cover the original image.

CopyVertexShader:

varying vec2 vUv;

void main() {
    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}

CopyFragmentShader:

uniform float opacity;

uniform sampler2D tDiffuse;

varying vec2 vUv;

void main() {
    vec4 texel = texture2D( tDiffuse, vUv );
    gl_FragColor = opacity * texel; // transparent where there is no border, so the scene underneath shows through
}
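
The final pass draws the border texture directly onto the canvas, on top of the normally rendered scene, and the passes are run one after another every frame. A sketch using the helpers assumed earlier (renderMaskPass, renderEdgePass, fullScreenQuad, selectedObjects and so on are illustrative names, not the article's API):

function renderOverlayPass(renderer) {
    // Reuse the full-screen quad, but with the copy material and the border texture.
    fullScreenQuad.material = copyMaterial;
    copyMaterial.uniforms.tDiffuse.value = edgeBuffer.texture;

    // Render to the canvas without clearing, so the transparent pixels of the
    // border texture leave the already rendered scene visible underneath.
    renderer.autoClear = false;
    renderer.setRenderTarget(null);
    renderer.render(quadScene, quadCamera);
    renderer.autoClear = true;
}

// Per-frame order of the passes:
renderer.render(scene, camera);                            // 1. normal scene
renderMaskPass(renderer, scene, camera, selectedObjects);  // 2. black/white mask
renderEdgePass(renderer);                                  // 3. convolution -> border
renderOverlayPass(renderer);                               // 4. border over the scene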

Running these passes in order each frame produces the final image with the outlined objects.

References:

1. Complete code for the outline effect: https://gitee.com/tengge1/ShadowEditor/blob/master/ShadowEditor.Web/src/helper/SelectHelper.js

2. three.js post-processing outline example: https://threejs.org/examples/#webgl_postprocessing_outline

3. How convolution works: https://www.zhihu.com/question/39022858?sort=created

4. Object outline effect using the normal extension method: https://blog.csdn.net/srk19960903/article/details/73863853

 
