Beauty-filter technology: GPUImage source code analysis

When it comes to GPU-based image processing and real-time filters, I am sure the famous GPUImage comes to mind. The project really does save a lot of work in later development: the basic image-processing tools are ready to use. But studying GPUImage's project structure can help us even more.

GPUImage project structure

The structure of GPUImage is very simple, and the Android version is simpler still:

  • A pile of filters (shader code plus the supporting parameter-setting code)
  • FilterGroup (uses FBOs to run several processing passes over the same image)
  • EGL management (mainly used for off-screen rendering)

Although GPUImage's main value lies in its pile of filters, we will mainly analyze the latter two, which form the framework of GPUImage. The filters are like plug-ins: plug in whichever one you want :D, and by following the existing ones as a template we can also write custom filters of our own.

Why off-screen rendering?

The main purpose of off-screen rendering is to process data in the background. Anyone who has written a Camera app knows that with a SurfaceView preview you are forced to display the camera data; to keep it hidden you have to shrink the SurfaceView down to a tiny size, which is troublesome and wastes resources. Since Android 3.0 there have been SurfaceTexture and GLSurfaceView, and later TextureView, which can consume camera data without showing it, but they still carry a display and rendering pipeline. In other words, GLSurfaceView and TextureView are not obedient enough: they cannot meet all of our requirements.

What if we just want to use the GPU to process a picture, without displaying it at all?

An example
Let's look at the interface of the Camera360 Lite version:

These preview pictures appear as soon as you open the filter picker, with no network connection needed. Are they bundled in the APK? And why are they all the same person? After digging around, however, these are the only ones I could find in the APK:

Where did the young lady in all the other colors go?
This shows that the different filter effects are actually generated on the user's phone the first time the APK runs (feel free to inspect Camera360's data folder). This approach has many advantages: it greatly reduces the size of the APK, and the same set of code can be reused to produce different effects. Of course, this is only one of the advantages of off-screen rendering.

In the past we used GLSurfaceView, which completed the EGL environment configuration for us. Now that we are not using GLSurfaceView, we have to manage it ourselves. Let's see how GPUImage does it:

GPUImage borrows from GLSurfaceView and carries out its own OpenGL environment configuration (pretend I said nothing... running away).

We will analyze GLSurfaceView's code later, and then see how off-screen rendering is done (after all, environment configuration is all routine boilerplate).

Filter groups and frame buffer objects (FBO)

The filter group may be the best example of filter reuse in GPUImage. With the help of frame buffer objects (FBOs), we can combine different filters on a single image to obtain the result we want.

Another example:
I wrote a grayscale filter that turns a picture black and white, as follows:

precision mediump float;
varying vec2 vTextureCoord;
uniform sampler2D sTexture;
void main() {
    vec3 centralColor = texture2D(sTexture, vTextureCoord).rgb;
    gl_FragColor = vec4(0.299*centralColor.r+0.587*centralColor.g+0.114*centralColor.b);
}
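The three weights are the BT.601 luma coefficients: they sum to 1 and weight green highest, because the eye is most sensitive to green. As a quick illustration (not part of GPUImage; the class and method names here are invented for this sketch), the same computation on the CPU in plain Java:

```java
public class Luma {
    // BT.601 luma: Y = 0.299*R + 0.587*G + 0.114*B, channels in [0, 1].
    static double luma(double r, double g, double b) {
        return 0.299 * r + 0.587 * g + 0.114 * b;
    }

    public static void main(String[] args) {
        System.out.println(luma(1.0, 0.0, 0.0)); // pure red maps to a dark gray (0.299)
        System.out.println(luma(0.0, 1.0, 0.0)); // pure green is much brighter (0.587)
    }
}
```

One GLSL detail worth noting: in the shader above, `vec4(x)` with a single scalar splats `x` into all four components, so the alpha channel is also set to the luma value.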

One day, with nothing better to do, I wrote a color-inversion filter:

precision mediump float;
varying vec2 vTextureCoord;
uniform sampler2D sTexture;
void main() {
    vec4 centralColor = texture2D(sTexture, vTextureCoord);
    gl_FragColor = vec4((1.0 - centralColor.rgb), centralColor.w);
}

Now the boss asks me to turn the video stream black and white and then invert the colors.
Such a small task could hardly stump me, and ten minutes later I had written the following code:

precision mediump float;
varying vec2 vTextureCoord;
uniform sampler2D sTexture;
void main() {
    vec4 centralColor = texture2D(sTexture, vTextureCoord);
    gl_FragColor = vec4(0.299*centralColor.r+0.587*centralColor.g+0.114*centralColor.b);
    gl_FragColor = vec4((1.0 - gl_FragColor.rgb), gl_FragColor.w);
}

These two filters are fairly simple (a single line each). But what if every filter were very complicated? What if many of them had to be combined?

We have crammed two filters into a single function, which means modifying the shader every time. That is not elegant, and it hardly reflects the OO principles our university teachers worked so hard to instill.

In GPUImage, frame buffer objects are used to solve this problem. Previously everything was processed in a single pass and drawn straight to the screen. Now we can instead save each intermediate result into a frame buffer, and use that result as the input of the next pass. So my code becomes:

filterGroup.addFilter(new GrayScaleShaderFilter(context));
filterGroup.addFilter(new InvertColorFilter(context));

What if there is a third processing step?
Just new another filter! Isn't that convenient?
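To see why chaining works, here is a small CPU sanity check in plain Java (the helper names are invented for this sketch): running the grayscale pass and then the inversion pass on a pixel yields exactly the same result as the hand-combined shader, because the second pass simply consumes the first pass's output.

```java
public class TwoPassCheck {
    // CPU stand-in for the grayscale fragment shader.
    static double[] grayscale(double[] rgb) {
        double y = 0.299 * rgb[0] + 0.587 * rgb[1] + 0.114 * rgb[2];
        return new double[]{y, y, y};
    }

    // CPU stand-in for the color-inversion fragment shader.
    static double[] invert(double[] rgb) {
        return new double[]{1.0 - rgb[0], 1.0 - rgb[1], 1.0 - rgb[2]};
    }

    // CPU stand-in for the hand-combined shader: grayscale then invert in one pass.
    static double[] combined(double[] rgb) {
        double y = 0.299 * rgb[0] + 0.587 * rgb[1] + 0.114 * rgb[2];
        return new double[]{1.0 - y, 1.0 - y, 1.0 - y};
    }

    public static void main(String[] args) {
        double[] pixel = {0.2, 0.5, 0.8};
        double[] chained = invert(grayscale(pixel)); // FilterGroup-style: two passes
        double[] onePass = combined(pixel);          // single merged shader
        System.out.println(chained[0] == onePass[0]); // prints true
    }
}
```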

FBO creation and rendering process

First we need two arrays: one to store the FBO IDs, and one to store the IDs of the textures that will hold the rendered results.

protected int[] frameBuffers = null;
protected int[] frameBufferTextures = null;

Yes, an FBO, just like a texture, is represented by an integer ID.

if (frameBuffers == null) {
    frameBuffers = new int[size-1];
    frameBufferTextures = new int[size-1];

    for (int i = 0; i < size-1; i++) {
        GLES20.glGenFramebuffers(1, frameBuffers, i);

        GLES20.glGenTextures(1, frameBufferTextures, i);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, frameBufferTextures[i]);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
                filters.get(i).surfaceWidth, filters.get(i).surfaceHeight, 0,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frameBuffers[i]);
        GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
                GLES20.GL_TEXTURE_2D, frameBufferTextures[i], 0);

        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
    }
}

This code is on the longer side, but it is very similar to the texture generation we have done before (readers without an OpenGL ES foundation can look that up first).

  • GLES20.glGenFramebuffers generates a frame buffer object.
  • The large block after it generates a texture, configures it with our current drawing width and height, and specifies its border wrapping and scaling strategies.
  • The key call is GLES20.glFramebufferTexture2D, which attaches a texture image to a frame buffer object: it tells OpenGL that this FBO is backed by a 2D texture, and that frameBufferTextures[i] is the texture attached to it.
  • Why size-1? Because the last filter draws directly to the screen!

Drawing

Once the FBOs have been generated, we can rewrite our drawing code:

if (i < size - 1) {
    GLES20.glViewport(0, 0, filter.surfaceWidth, filter.surfaceHeight);
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frameBuffers[i]);
    GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    filter.onDrawFrame(previousTexture);
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
    previousTexture = frameBufferTextures[i];
} else {
    GLES20.glViewport(0, 0, filter.surfaceWidth, filter.surfaceHeight);
    filter.onDrawFrame(previousTexture);
}
  • Before each intermediate pass we bind its FBO with glBindFramebuffer, so the drawing result is not displayed on the screen; it lands in the texture object attached to that FBO, and that texture is then fed to the next filter as input.
  • The input of the first filter is the texture produced by our camera or video player.
  • The last filter does not need to output to an FBO, so it simply draws to the screen.
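The bullet points above can be dry-run without any OpenGL at all. The sketch below (class and method names invented for illustration) simulates the routing of the onDrawFrame loop: every pass except the last targets an FBO, and the texture attached to that FBO becomes the next pass's input.

```java
import java.util.ArrayList;
import java.util.List;

public class PassRouting {
    // Simulates onDrawFrame's routing: pass i < size-1 renders into FBO i,
    // whose attached texture feeds the next pass; the last pass hits the screen.
    static List<String> plan(int size, int cameraTexture, int[] fboTextures) {
        List<String> log = new ArrayList<>();
        int previousTexture = cameraTexture; // first input: camera/player texture
        for (int i = 0; i < size; i++) {
            String target = (i < size - 1) ? ("FBO " + i) : "screen";
            log.add("pass " + i + ": reads texture " + previousTexture + " -> " + target);
            if (i < size - 1) previousTexture = fboTextures[i]; // FBO texture feeds next pass
        }
        return log;
    }

    public static void main(String[] args) {
        // Three filters only need two FBOs (size - 1): the last pass draws to the screen.
        for (String line : plan(3, 100, new int[]{201, 202})) System.out.println(line);
        // pass 0: reads texture 100 -> FBO 0
        // pass 1: reads texture 201 -> FBO 1
        // pass 2: reads texture 202 -> screen
    }
}
```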

Complete code of the filter group

package com.martin.ads.omoshiroilib.filter.base;

import android.opengl.GLES20;
import android.util.Log;

import java.util.ArrayList;
import java.util.List;

/**
 * Created by Ads on 2016/11/19.
 */

public class FilterGroup extends AbsFilter {
    private static final String TAG = "FilterGroup";
    protected int[] frameBuffers = null;
    protected int[] frameBufferTextures = null;
    protected List<AbsFilter> filters;
    protected boolean isRunning;

    public FilterGroup() {
        super("FilterGroup");
        filters=new ArrayList<AbsFilter>();
    }

    @Override
    public void init() {
        for (AbsFilter filter : filters) {
            filter.init();
        }
        isRunning=true;
    }

    @Override
    public void onPreDrawElements() {
    }

    @Override
    public void destroy() {
        destroyFrameBuffers();
        for (AbsFilter filter : filters) {
            filter.destroy();
        }
        isRunning=false;
    }

    @Override
    public void onDrawFrame(int textureId) {
        runPreDrawTasks();
        if (frameBuffers == null || frameBufferTextures == null) {
            return ;
        }
        int size = filters.size();
        int previousTexture = textureId;
        for (int i = 0; i < size; i++) {
            AbsFilter filter = filters.get(i);
            Log.d(TAG, "onDrawFrame: "+i+" / "+size +" "+filter.getClass().getSimpleName()+" "+
                    filter.surfaceWidth+" "+filter.surfaceHeight);
            if (i < size - 1) {
                GLES20.glViewport(0, 0, filter.surfaceWidth, filter.surfaceHeight);
                GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frameBuffers[i]);
                GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
                filter.onDrawFrame(previousTexture);
                GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
                previousTexture = frameBufferTextures[i];
            } else {
                GLES20.glViewport(0, 0, filter.surfaceWidth, filter.surfaceHeight);
                filter.onDrawFrame(previousTexture);
            }
        }
    }

    @Override
    public void onFilterChanged(int surfaceWidth, int surfaceHeight) {
        super.onFilterChanged(surfaceWidth, surfaceHeight);
        int size = filters.size();
        for (int i = 0; i < size; i++) {
            filters.get(i).onFilterChanged(surfaceWidth, surfaceHeight);
        }
        if(frameBuffers != null){
            destroyFrameBuffers();
        }
        if (frameBuffers == null) {
            frameBuffers = new int[size-1];
            frameBufferTextures = new int[size-1];

            for (int i = 0; i < size-1; i++) {
                GLES20.glGenFramebuffers(1, frameBuffers, i);

                GLES20.glGenTextures(1, frameBufferTextures, i);
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, frameBufferTextures[i]);
                GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
                        filters.get(i).surfaceWidth, filters.get(i).surfaceHeight, 0,
                        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
                GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                        GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
                GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                        GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
                GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                        GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
                GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D,
                        GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

                GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frameBuffers[i]);
                GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
                        GLES20.GL_TEXTURE_2D, frameBufferTextures[i], 0);

                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
                GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
            }
        }
    }

    private void destroyFrameBuffers() {
        if (frameBufferTextures != null) {
            GLES20.glDeleteTextures(frameBufferTextures.length, frameBufferTextures, 0);
            frameBufferTextures = null;
        }
        if (frameBuffers != null) {
            GLES20.glDeleteFramebuffers(frameBuffers.length, frameBuffers, 0);
            frameBuffers = null;
        }
    }

    public void addFilter(final AbsFilter filter){
        if (filter==null) return;
        //If one filter is added multiple times,
        //it will execute the same times
        //BTW: Pay attention to the order of execution
        if (!isRunning){
            filters.add(filter);
        }
        else
            addPreDrawTask(new Runnable() {
            @Override
            public void run() {
                filter.init();
                filters.add(filter);
                onFilterChanged(surfaceWidth,surfaceHeight);
            }
        });
    }

    public void addFilterList(final List<AbsFilter> filterList){
        if (filterList==null) return;
        //If one filter is added multiple times,
        //it will execute the same times
        //BTW: Pay attention to the order of execution
        if (!isRunning){
            for(AbsFilter filter:filterList){
                filters.add(filter);
            }
        }
        else
            addPreDrawTask(new Runnable() {
                @Override
                public void run() {
                    for(AbsFilter filter:filterList){
                        filter.init();
                        filters.add(filter);
                    }
                    onFilterChanged(surfaceWidth,surfaceHeight);
                }
            });
    }
}

Original author: Martin. Original link: https://blog.csdn.net/Martin20150405/article/details/55520358

Origin www.cnblogs.com/hejunlin/p/12601006.html