OpenGL.Shader: 7-Learning Light-Normal Vector

Lighting plays a very important role in OpenGL. Simulating light has become a major research topic in computer graphics, and its impact is visible not only in steadily improving game visuals but also in film, computer-generated imagery (CGI), and other fields.

Light sources can usually be divided into the following groups by type:

Ambient light
appears to come from all directions, illuminating everything in the scene to the same degree. This resembles the light we get from a large, uniform source such as the sky. Ambient light can also be used to fake the effect of light bouncing off many objects before it reaches our eyes; it is the reason shadows are never rendered completely black.

Directional light
appears to come from a single direction, with the source seemingly very far away. This resembles the light we get from the sun or the moon.

Point light
appears to be cast from a nearby position, with intensity falling off with distance. It suits nearby sources that throw light in all directions, such as a light bulb or a candle.

Spot light
is similar to a point light, but with a restriction: it projects light only in a specific direction. It is like the light we get from a flashlight or a stage spotlight.

Light can also be divided into two categories by the way it reflects off an object's surface:

Diffuse reflection
scatters light equally in all directions. It suits materials without a polished surface, such as carpet or an exterior concrete wall; such surfaces look the same from many different viewpoints.

Specular reflection
reflects light more strongly in one particular direction. It suits polished or shiny materials, such as smooth metal or a freshly waxed car.
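
To make the distinction concrete, here is a minimal C++ sketch of my own (not code from the demo project) contrasting the two reflection models; vec3, diffuseTerm, specularTerm, and shininess are hypothetical names, and every direction vector is assumed to be normalized:

// A sketch contrasting the two reflection models above; vec3 and the helper
// names are hand-rolled stand-ins for the equivalent GLSL built-ins.
#include <algorithm>
#include <cmath>

struct vec3 { float x, y, z; };

static float dot(vec3 a, vec3 b)    { return a.x*b.x + a.y*b.y + a.z*b.z; }
static vec3  scale(vec3 v, float s) { return { v.x*s, v.y*s, v.z*s }; }
static vec3  sub(vec3 a, vec3 b)    { return { a.x-b.x, a.y-b.y, a.z-b.z }; }
// Reflect the incident direction I about the normal N (same formula as GLSL reflect()).
static vec3  reflect(vec3 I, vec3 N) { return sub(I, scale(N, 2.0f * dot(N, I))); }

// Diffuse (Lambert): brightness depends only on the angle between the surface
// normal and the direction toward the light; the viewer's position is irrelevant.
float diffuseTerm(vec3 normal, vec3 toLight) {
    return std::max(dot(normal, toLight), 0.0f);
}

// Specular (Phong): brightness peaks when the reflected light direction lines
// up with the direction toward the eye; shininess controls the highlight size.
float specularTerm(vec3 normal, vec3 toLight, vec3 toEye, float shininess) {
    vec3 r = reflect(scale(toLight, -1.0f), normal);
    return std::pow(std::max(dot(r, toEye), 0.0f), shininess);
}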

 

 

OpenGL does not simulate these light sources directly. Instead, most games and applications simplify things and approximate how light behaves at a high level rather than simulating it physically. This brings us to another important concept: the normal vector.

Imagine a flat surface facing a light: it is naturally at its brightest. Some materials also reflect light whose intensity depends on the angle you view it from, but this article will not cover those. We use a normal vector to describe the orientation of a surface. In the concrete implementation, every vertex carries a normal vector, which is simply a three-dimensional vector perpendicular to the surface, as shown in the figure below.

The figure shows two ways of assigning normal vectors. On the left, every vertex of each polygon carries that polygon's face normal; on the right, each vertex carries a single normal, computed as the sum of the normals of all the faces sharing that vertex. Normal vectors should always be normalized to unit length. The examples in this article use the method on the left.
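
As a hedged sketch of how these two kinds of normals could be computed (my own illustration; faceNormal and sharedVertexNormal are hypothetical helpers, not part of the demo code):

#include <cmath>

struct vec3 { float x, y, z; };

static vec3 sub(vec3 a, vec3 b) { return { a.x-b.x, a.y-b.y, a.z-b.z }; }
static vec3 add(vec3 a, vec3 b) { return { a.x+b.x, a.y+b.y, a.z+b.z }; }
static vec3 cross(vec3 a, vec3 b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static vec3 normalize(vec3 v) {
    float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return { v.x/len, v.y/len, v.z/len };
}

// Face normal (the "left" method): the normalized cross product of two edges
// of the triangle, assuming counter-clockwise winding.
vec3 faceNormal(vec3 a, vec3 b, vec3 c) {
    return normalize(cross(sub(b, a), sub(c, a)));
}

// Shared-vertex normal (the "right" method): sum the normals of every face
// touching the vertex, then normalize the result.
vec3 sharedVertexNormal(const vec3* faceNormals, int faceCount) {
    vec3 sum = { 0, 0, 0 };
    for (int i = 0; i < faceCount; ++i)
        sum = add(sum, faceNormals[i]);
    return normalize(sum);
}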

Here we build on the cube used throughout this series, adding a normal vector to its original (position, texture) vertex data.


class CubeIlluminate {
public:
    struct V3N3T2 {
        float x, y, z;    // vertex position
        float nx, ny, nz; // normal vector
        float u, v;       // texture coordinates
    };
public:
    GLuint                  mCubeSurfaceTexId;
    V3N3T2                  _data[36];

    void        init(const CELL::float3 &halfSize, GLuint tex)
    {
        V3N3T2 verts[] =
        {
                {+halfSize.x, -halfSize.y, +halfSize.z, 0.0f,  -1.0f, 0.0f,  0.0f,0.0f},
                {-halfSize.x, -halfSize.y, +halfSize.z, 0.0f,  -1.0f, 0.0f,  1.0f,0.0f},
                {-halfSize.x, -halfSize.y, -halfSize.z, 0.0f,  -1.0f, 0.0f,  1.0f,1.0f},

                {-halfSize.x, -halfSize.y, +halfSize.z, 0.0f,  0.0f,  +1.0f, 0.0f,0.0f},
                {+halfSize.x, -halfSize.y, +halfSize.z, 0.0f,  0.0f,  +1.0f, 1.0f,1.0f},
                {+halfSize.x, +halfSize.y, +halfSize.z, 0.0f,  0.0f,  +1.0f, 0.0f,1.0f},

                {+halfSize.x, +halfSize.y, -halfSize.z, +1.0f, 0.0f,  0.0f,  1.0f,0.0f},
                {+halfSize.x, +halfSize.y, +halfSize.z, +1.0f, 0.0f,  0.0f,  1.0f,1.0f},
                {+halfSize.x, -halfSize.y, +halfSize.z, +1.0f, 0.0f,  0.0f,  0.0f,1.0f},

                {-halfSize.x, +halfSize.y, +halfSize.z, 0.0f,  +1.0f, 0.0f,  1.0f,0.0f},
                {+halfSize.x, +halfSize.y, +halfSize.z, 0.0f,  +1.0f, 0.0f,  0.0f,1.0f},
                {+halfSize.x, +halfSize.y, -halfSize.z, 0.0f,  +1.0f, 0.0f,  0.0f,0.0f},

                {-halfSize.x, +halfSize.y, +halfSize.z, -1.0f, 0.0f,  0.0f,  0.0f,1.0f},
                {-halfSize.x, -halfSize.y, -halfSize.z, -1.0f, 0.0f,  0.0f,  0.0f,0.0f},
                {-halfSize.x, -halfSize.y, +halfSize.z, -1.0f, 0.0f,  0.0f,  1.0f,0.0f},

                {-halfSize.x, +halfSize.y, +halfSize.z, 0.0f,  0.0f,  +1.0f, 0.0f,1.0f},
                {-halfSize.x, -halfSize.y, +halfSize.z, 0.0f,  0.0f,  +1.0f, 1.0f,0.0f},
                {+halfSize.x, +halfSize.y, +halfSize.z, 0.0f,  0.0f,  +1.0f, 1.0f,1.0f},

                {+halfSize.x, -halfSize.y, -halfSize.z, 0.0f,  -1.0f, 0.0f,  1.0f,1.0f},
                {+halfSize.x, -halfSize.y, +halfSize.z, 0.0f,  -1.0f, 0.0f,  0.0f,1.0f},
                {-halfSize.x, -halfSize.y, -halfSize.z, 0.0f,  -1.0f, 0.0f,  0.0f,0.0f},

                {+halfSize.x, -halfSize.y, -halfSize.z, 0.0f,  0.0f,  -1.0f, 1.0f,1.0f},
                {-halfSize.x, -halfSize.y, -halfSize.z, 0.0f,  0.0f,  -1.0f, 0.0f,0.0f},
                {+halfSize.x, +halfSize.y, -halfSize.z, 0.0f,  0.0f,  -1.0f, 1.0f,0.0f},

                {+halfSize.x, -halfSize.y, -halfSize.z, +1.0f, 0.0f,  0.0f,  1.0f,0.0f},
                {+halfSize.x, +halfSize.y, -halfSize.z, +1.0f, 0.0f,  0.0f,  1.0f,1.0f},
                {+halfSize.x, -halfSize.y, +halfSize.z, +1.0f, 0.0f,  0.0f,  0.0f,1.0f},

                {-halfSize.x, +halfSize.y, -halfSize.z, 0.0f,  0.0f,  -1.0f, 1.0f,0.0f},
                {+halfSize.x, +halfSize.y, -halfSize.z, 0.0f,  0.0f,  -1.0f, 0.0f,1.0f},
                {-halfSize.x, -halfSize.y, -halfSize.z, 0.0f,  0.0f,  -1.0f, 0.0f,0.0f},

                {-halfSize.x, +halfSize.y, -halfSize.z, -1.0f, 0.0f,  0.0f,  0.0f,0.0f},
                {-halfSize.x, -halfSize.y, -halfSize.z, -1.0f, 0.0f,  0.0f,  1.0f,0.0f},
                {-halfSize.x, +halfSize.y, +halfSize.z, -1.0f, 0.0f,  0.0f,  1.0f,1.0f},

                {-halfSize.x, +halfSize.y, -halfSize.z, 0.0f,  +1.0f, 0.0f,  0.0f,0.0f},
                {-halfSize.x, +halfSize.y, +halfSize.z, 0.0f,  +1.0f, 0.0f,  1.0f,1.0f},
                {+halfSize.x, +halfSize.y, -halfSize.z, 0.0f,  +1.0f, 0.0f,  0.0f,1.0f},
        };
        memcpy(_data, verts, sizeof(verts));

        mCubeSurfaceTexId = tex;
    }

    // ... ...
};

Given the volume of data, the custom struct V3N3T2 packs everything for one vertex, and every three consecutive V3N3T2 entries form one triangle; the rest needs no further comment. Note that a normal vector represents only a direction, not a position, so it should always be stored in normalized form. The data above, projected into Cartesian space, is shown in the figure below.
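
As a quick sanity check on that interleaved layout (a sketch under the assumption that V3N3T2 is tightly packed, which the glVertexAttribPointer strides used later rely on):

#include <cstddef>

// Mirror of the V3N3T2 vertex defined above.
struct V3N3T2 {
    float x, y, z;    // vertex position
    float nx, ny, nz; // normal vector
    float u, v;       // texture coordinates
};

// 8 floats => a 32-byte stride; the attribute offsets follow from field order:
// position at byte 0, normal at byte 12, uv at byte 24.
static_assert(sizeof(V3N3T2) == 8 * sizeof(float), "vertex must be tightly packed");
static_assert(offsetof(V3N3T2, nx) == 3 * sizeof(float), "normal follows position");
static_assert(offsetof(V3N3T2, u)  == 6 * sizeof(float), "uv follows normal");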

Take the point (1, -1, 1) as an example. As one of the four corners of the x = +1 side face, its red arrow in the figure is that face's normal (1, 0, 0), pointing horizontally to the right.
Switch to the bottom face: for the very same point (1, -1, 1), the yellow arrow is the bottom face's normal (0, -1, 0), pointing vertically downward.
Switch to the front face: still the same point (1, -1, 1), and the blue arrow is the front face's normal (0, 0, 1), pointing horizontally outward toward the viewer.

This walkthrough confirms the theory once more: 1) a normal vector represents a direction, not a position; 2) a normal vector is normalized, i.e. sqrt(x*x + y*y + z*z) = 1; 3) a normal vector is defined per face and is perpendicular to that face, so a point shared by several faces can have several normal vectors.
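
As a worked example of point 2, with numbers of my own choosing: the vector (1, 1, 0) has length sqrt(1 + 1 + 0) ≈ 1.414, so its normalized form is roughly (0.707, 0.707, 0), and sqrt(0.707² + 0.707² + 0²) = 1 as required.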

 

That covers the basics of normal vectors. Let's go straight to the shader program class.

class CubeIlluminateProgram : public ShaderProgram
{
public:
    GLint       _mvp;
    GLint       _lightDir;
    GLint       _lightColor;
    GLint       _lightDiffuse;
    GLint       _texture;
    GLint       _position;
    GLint       _normal;
    GLint       _uv;
public:
    virtual void    initialize()
    {
        const char* vs  =  "#version 320 es\n\
                            uniform mat4   _mvp;\n\
                            uniform vec3   _lightDir;\n\
                            uniform vec3   _lightColor;\n\
                            uniform vec3   _lightDiffuse;\n\
                            in      vec3   _position;\n\
                            in      vec3   _normal;\n\
                            in      vec2   _uv;\n\
                            out     vec2   _outUV;\n\
                            out     vec4   _outComposeColor;\n\
                            void main()\n\
                            {\n\
                                _outUV                =   _uv; \n\
                                float lightStrength   =   max(dot(_normal, -_lightDir), 0.0); \n\
                                _outComposeColor =   vec4(_lightColor * lightStrength + _lightDiffuse, 1);\n\
                                gl_Position      =   _mvp * vec4(_position,1.0);\n\
                            }";

        const char* fs =   "#version 320 es\n\
                            precision mediump float;\n\
                            in      vec4        _outComposeColor;\n\
                            in      vec2        _outUV;\n\
                            uniform sampler2D   _texture;\n\
                            out     vec4        _fragColor;\n\
                            void main()\n\
                            {\n\
                                vec4    color   =   texture(_texture,_outUV);\n\
                                _fragColor      =   color * _outComposeColor;\n\
                            }";

        programId   =   ShaderHelper::buildProgram(vs, fs);
        _mvp        =   glGetUniformLocation(programId,  "_mvp");
        _lightDir   =   glGetUniformLocation(programId,  "_lightDir");
        _lightColor =   glGetUniformLocation(programId,  "_lightColor");
        _lightDiffuse = glGetUniformLocation(programId,  "_lightDiffuse");

        _position   =   glGetAttribLocation(programId,   "_position");
        _normal     =   glGetAttribLocation(programId,   "_normal");
        _uv         =   glGetAttribLocation(programId,   "_uv");

        _texture    =   glGetUniformLocation(programId,  "_texture");
    }

    virtual void    begin()
    {
        glEnableVertexAttribArray(_position);
        glEnableVertexAttribArray(_normal);
        glEnableVertexAttribArray(_uv);
        glUseProgram(programId);
    }
    virtual void    end()
    {
        glDisableVertexAttribArray(_position);
        glDisableVertexAttribArray(_normal);
        glDisableVertexAttribArray(_uv);
        glUseProgram(0);
    }
};

Let's first analyze the vertex shader program:

#version 320 es
uniform mat4   _mvp;          // model-view-projection matrix
uniform vec3   _lightDir;     // light direction (only a direction, no position)
uniform vec3   _lightColor;   // directional light color
uniform vec3   _lightDiffuse; // diffuse fill term, simulates ambient light on the material
in      vec3   _position;     // vertex position attribute
in      vec3   _normal;       // vertex normal vector
in      vec2   _uv;           // texture coordinates
out     vec2   _outUV;             // texture coordinates passed to the fragment shader
out     vec4   _outComposeColor;   // composite light color passed to the fragment shader
void main()
{
     _outUV                =   _uv;
     float lightStrength   =   max(dot(_normal, -_lightDir), 0.0);
     _outComposeColor =   vec4(_lightColor * lightStrength + _lightDiffuse, 1);
     gl_Position      =   _mvp * vec4(_position,1.0);
}

The second line of the body computes light intensity = dot(normal vector, negated light direction). Mathematically, the intensity is the dot product of the surface normal and the reversed, normalized light-direction vector, clamped at zero. What is this "reversed" light direction? _lightDir points from the light toward the surface; negating it gives the direction from the surface back toward the light, which is exactly what the dot product needs, as the figure below shows.
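
A quick worked example with numbers of my own choosing: let the light shine straight down, so _lightDir = (0, -1, 0) and -_lightDir = (0, 1, 0). The cube's top face, normal (0, 1, 0), gives dot((0,1,0), (0,1,0)) = 1.0 and is fully lit. A side face, normal (1, 0, 0), gives 0.0 and receives no directional light. The bottom face, normal (0, -1, 0), gives -1.0, which max(..., 0.0) clamps to zero instead of producing negative light.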

Some online tutorials negate the incoming normal vector instead of the light direction. The dot product works out the same mathematically, but it is harder to justify conceptually, so I don't recommend that approach; it's enough to recognize the pattern when you see it in other people's code.

With the light intensity in hand, the third line, vec4(_lightColor * lightStrength + _lightDiffuse, 1), multiplies the intensity by the light color. That alone already yields a lighting effect; adding the diffuse term _lightDiffuse keeps surfaces that receive no directional light from going completely black, which would not match reality. This diffuse term can be extended in many ways, for example deriving its intensity from the material's texture, or mixing in a color of your own.
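
One caveat worth noting (my own observation, not from the original article): with _lightColor = (1, 1, 1), lightStrength = 1.0 and _lightDiffuse = (0.1, 0.1, 0.1), a channel of the composite color reaches 1.1. The final fragment output is clamped to [0, 1] when written to a fixed-point framebuffer, but clamping explicitly keeps the intermediate math predictable; a one-line sketch:

#include <algorithm>

// Per-channel composite with an explicit clamp (a sketch, not the demo's code).
float composeChannel(float lightColor, float lightStrength, float lightDiffuse) {
    return std::min(lightColor * lightStrength + lightDiffuse, 1.0f);
}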

#version 320 es
precision mediump float;
in      vec4        _outComposeColor;
in      vec2        _outUV;
uniform sampler2D   _texture;
out     vec4        _fragColor;
void main()
{
     vec4    color   =   texture(_texture,_outUV);
     _fragColor      =   color * _outComposeColor;
}

Now for the fragment shader. It is comparatively simple: sample the texture color at the interpolated texture coordinates, multiply it by the composite light color passed in from the vertex shader, and we're done.

 

Finally, let's combine the shader program and complete the render method of CubeIlluminate.

void        render(Camera3D& camera)
{
    sprogram.begin();
    CELL::matrix4   matModel(1);
    CELL::matrix4   vp = camera.getProject() * camera.getView();
    CELL::matrix4   mvp = (vp * matModel);
    glUniformMatrix4fv(sprogram._mvp, 1, GL_FALSE, mvp.data());
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D,  mCubeSurfaceTexId);
    glUniform1i(sprogram._texture, 0);
    glUniform3f(sprogram._lightDiffuse, 0.1f, 0.1f, 0.1f); // diffuse fill / ambient term
    glUniform3f(sprogram._lightColor, 1.0f, 1.0f, 1.0f);   // directional light color
    glUniform3f(sprogram._lightDir, // directional light direction: reuse the camera's view direction
                static_cast<GLfloat>(camera._dir.x),
                static_cast<GLfloat>(camera._dir.y),
                static_cast<GLfloat>(camera._dir.z));
    glVertexAttribPointer(static_cast<GLuint>(sprogram._position), 3, GL_FLOAT, GL_FALSE,
                          sizeof(CubeIlluminate::V3N3T2), &_data[0].x);
    glVertexAttribPointer(static_cast<GLuint>(sprogram._normal),   3, GL_FLOAT, GL_FALSE,
                          sizeof(CubeIlluminate::V3N3T2), &_data[0].nx);
    glVertexAttribPointer(static_cast<GLuint>(sprogram._uv),       2, GL_FLOAT, GL_FALSE,
                          sizeof(CubeIlluminate::V3N3T2), &_data[0].u);
    glDrawArrays(GL_TRIANGLES, 0, 36);
    sprogram.end();
}
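
For context, a hedged sketch of how this render method might be driven each frame (onDrawFrame and the setup details are my assumptions; the actual wiring lives in LightRenderer.cpp in the demo project):

// Hypothetical per-frame driver; see LightRenderer.cpp for the real one.
void onDrawFrame(Camera3D& camera, CubeIlluminate& cube)
{
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glEnable(GL_DEPTH_TEST); // so nearer cube faces occlude farther ones
    cube.render(camera);     // binds the program, sets uniforms, draws the 36 vertices
}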

 

Demo project link: https://github.com/MrZhaozhirong/NativeCppApp -> LightRenderer.cpp, CubeIlluminate.hpp, CubeIlluminateProgram.hpp

Finally, here is the simplest lighting-with-normals effect:

Origin: blog.csdn.net/a360940265a/article/details/90679025