OpenGL.Shader: Zhige teaches you to write a live filter client (1): project analysis

0. A little off-topic

2020 is shaping up to be a surreal year. Before the Australian bushfires were even over, I had to welcome my least favorite Year of the Rat, and with the novel-coronavirus pneumonia sweeping the country, Chinese people spent the Spring Festival holiday shut in at home. I used the time to wrap up this series of articles and to polish and open-source the corresponding project, which can also serve as my own summary of the second decade of the 21st century. Then Kobe (the star of my youth!) suddenly passed away, which made me understand that you never know whether tomorrow or an accident will come first, so cherish the present.

1. Project analysis

During the boom of filter-based livestreaming a few years ago, I was fortunate to take part in some similar projects, read through the open-source project GPUImage and left reading notes, and from there slowly got started with OpenGL/OpenCV. I have always wanted to build a representative open-source project, both as a summary for myself and, hopefully, as a help to others.

I am glad to say that, at this stage, the project is basically complete. Its features: on the Android side, it combines a variety of filter effects (mostly from GPUImage and http://www.zealfilter.com/ ), captures video with the camera and audio with the microphone, and then encodes and outputs the byte stream. Most of it is written in C++ under the NDK, which helps protect the shader algorithms, and no additional third-party libraries are used, which makes the knowledge points easier for beginners to master and also makes the project easy to adapt and port to iOS.

The project structure is roughly as follows:

With a general picture of the whole, let's break down the module composition:

CFEScheduler.java: the scheduling module of the Camera-Filter-Encoder pipeline. Business logic is placed in this scheduler to keep the business code organized.

GpuFilterRender.java: the entry point into native code. It takes over the Surface lifecycle and exposes related interfaces such as filter management.

Camera / AudioRecord: the system APIs used to obtain video data (NV21 format) and audio data (PCM format).

Filter management: a series of .cpp files implementing the filter effects, the core of the filter rendering. The algorithms come from GPUImage and http://www.zealfilter.com

GL Render: a fully hand-written GL-thread render module, the main body of the filter rendering.

AMediaCodecEncoder: MediaCodec on the NDK side. Personally I find this the most contentious and most rewarding part: there are many pitfalls and details to watch out for, but once you have solved those problems and thoroughly understood the content, you will have picked up a lot of advanced knowledge, maybe even enough to get started with Android system development.
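
Putting the modules together, the data flow through the project is roughly:

Camera (NV21) ------+
                    +--> CFEScheduler --> GpuFilterRender (JNI) --> GL Render + Filters --> AMediaCodecEncoder --> encoded byte stream
AudioRecord (PCM) --+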

 

2. Show me the code

Let's start with the simple Java page; it really is super simple. The interface layout is roughly as follows:
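
A minimal sketch of what activity_camera_filter_encode might look like, reconstructed from the view IDs used in the Activity below; the exact attributes are my assumption, not the project's actual layout file:

<!-- Hypothetical reconstruction: a full-screen SurfaceView for preview,
     with the filter Spinner and strength SeekBar overlaid at the bottom. -->
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <SurfaceView
        android:id="@+id/camera_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <LinearLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_gravity="bottom"
        android:orientation="vertical">

        <Spinner
            android:id="@+id/filter_spinner"
            android:layout_width="match_parent"
            android:layout_height="wrap_content" />

        <SeekBar
            android:id="@+id/value_seek_bar"
            android:layout_width="match_parent"
            android:layout_height="wrap_content" />
    </LinearLayout>
</FrameLayout>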

The logic in CameraFilterEncodeActivity is just as simple (it is almost not worth posting):

public class CameraFilterEncodeActivity extends Activity {
    private static final String TAG = "CFEScheduler";
    private CFEScheduler cfeScheduler;

    @Override
    protected void onCreate(@Nullable Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_camera_filter_encode);
        SurfaceView surfaceView = findViewById(R.id.camera_view);
        // Initialize the CFEScheduler, passing in the context and surface used for rendering.
        if( cfeScheduler==null)
            cfeScheduler = new CFEScheduler(this, surfaceView);

        initFilterSpinner();
        initFilterAdjuster();
    }

    SeekBar filterAdjuster;
    private void initFilterAdjuster() {
        filterAdjuster = findViewById(R.id.value_seek_bar);
        filterAdjuster.setMax(100);
        filterAdjuster.setProgress(0);
        filterAdjuster.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
            @Override
            public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
                cfeScheduler.adjustFilterValue(progress, seekBar.getMax());
            }
            @Override
            public void onStartTrackingTouch(SeekBar seekBar) { }
            @Override
            public void onStopTrackingTouch(SeekBar seekBar) { }
        });
    }

    private void initFilterSpinner() {
        // Get the list of currently supported filter names from the CFEScheduler.
        String[] mItems = cfeScheduler.getSupportedFiltersName();
        //String[] mItems = this.getResources().getStringArray(R.array.filter_name);
        Spinner filterSpinner = findViewById(R.id.filter_spinner);
        ArrayAdapter<String> adapter=new ArrayAdapter<String>(this, android.R.layout.simple_spinner_dropdown_item, mItems);
        filterSpinner.setAdapter(adapter);
        filterSpinner.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
            @Override
            public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
                // Reset the Filter.Adjuster.
                filterAdjuster.setProgress(0);
                // Normally you would look up the typeId via getSupportedFilterTypeID(name);
                // here position is exactly the index into names[], so we can
                // query the typeId directly by index.
                int typeId = cfeScheduler.getSupportedFilterTypeID(position);
                // Set the current filter id, swapping in the new filter effect.
                cfeScheduler.setFilterType(typeId);
            }
            @Override
            public void onNothingSelected(AdapterView<?> parent) { }
        });
    }

    @Override
    protected void onResume() {
        super.onResume();
        if( cfeScheduler!=null) {
            cfeScheduler.onResume();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        if( cfeScheduler!=null) {
            cfeScheduler.onPause();
        }
    }

}

The meaning of the code is explained in its comments, which should make it easy to follow. Next, let's follow the thread into CFEScheduler and look at the implementation logic.

(PS: I will walk through the video path as the main line first; the audio content will be added later.)

/**
 * Created by zzr on 2019/11/27.
 * CFE : Camera Filter Encode
 * This logic could be written directly in CameraFilterEncodeActivity, if you don't mind the clutter.
 */
public class CFEScheduler implements Camera.PreviewCallback, SurfaceHolder.Callback {
    private static final String TAG = "CFEScheduler";
    private WeakReference<Activity> mActivityWeakRef;
    private GpuFilterRender mGpuFilterRender;
    /* Camera / SurfaceView related */
    CFEScheduler(Activity activity, SurfaceView view) {
        mActivityWeakRef = new WeakReference<>(activity);
        mGpuFilterRender = new GpuFilterRender(activity);

        view.getHolder().setFormat(PixelFormat.RGBA_8888);
        view.getHolder().addCallback(this);
    }
    public void onResume() {
        Log.d(TAG, "onResume ...");
        setUpCamera(mCurrentCameraId);
    }
    public void onPause() {
        Log.d(TAG, "onPause ...");
        releaseCamera();
    }

    // ... ... holder callbacks
    // ... ... (split out below, for reasons of space)

    private int mCurrentCameraId = Camera.CameraInfo.CAMERA_FACING_FRONT;
    private Camera mCameraInstance;

    private void setUpCamera(final int id) {
        mCameraInstance = getCameraInstance(id);
        Camera.Parameters parameters = mCameraInstance.getParameters();
        // Set the frame data format to NV21 (the default format).
        parameters.setPreviewFormat(ImageFormat.NV21);
        if (parameters.getSupportedFocusModes().contains(
                Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
            parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
        }
        // Note: I do not set the best preview size here.
        // adjust by getting supportedPreviewSizes and then choosing
        // the best one for screen size (best fill screen)
        mCameraInstance.setParameters(parameters);

        Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
        Camera.getCameraInfo(mCurrentCameraId, cameraInfo);
        Activity activity = mActivityWeakRef.get();
        int orientation = getCameraDisplayOrientation(activity, cameraInfo);
        Log.i(TAG, "getCameraDisplayOrientation : "+orientation);
        boolean flipHorizontal = cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT;
        // !!! Set horizontal / vertical flipping here as needed.
        mGpuFilterRender.setRotationCamera(orientation, flipHorizontal, false);
    }

    private int getCameraDisplayOrientation(final Activity activity,
                                            @NonNull Camera.CameraInfo info) {
        if(activity == null) return 0;
        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        int degrees = 0;
        switch (rotation) {
            case Surface.ROTATION_0:
                degrees = 0;
                break;
            case Surface.ROTATION_90:
                degrees = 90;
                break;
            case Surface.ROTATION_180:
                degrees = 180;
                break;
            case Surface.ROTATION_270:
                degrees = 270;
                break;
        }
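        // Worked example (values assumed for illustration): a front camera
        // commonly reports info.orientation = 270; in natural portrait
        // (degrees = 0) the preview must be rotated (270 + 0) % 360 = 270.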

        int result;
        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            result = (info.orientation + degrees) % 360;
        } else { // back-facing
            result = (info.orientation - degrees + 360) % 360;
        }
        return result;
    }
    /** A safe way to get an instance of the Camera object. */
    private Camera getCameraInstance(final int id) {
        Camera c = null;
        try {
            c = Camera.open(id);
        } catch (Exception e) {
            e.printStackTrace();
        }
        return c;
    }
    private void releaseCamera() {
        if (mCameraInstance == null) return; // guard: the camera may have failed to open
        mCameraInstance.setPreviewCallback(null);
        mCameraInstance.stopPreview();
        mCameraInstance.release();
        mCameraInstance = null;
    }
}

By default, we open the front camera as the video data source. I did not set the best preview size here, mainly because, following GPUImage's approach, this adjustment is handled inside the shader. Finally, GpuFilterRender.setRotationCamera sets the lens rotation angle and whether horizontal or vertical flipping is needed.
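
If you did want to choose the best preview size rather than compensating in the shader, the usual approach is to scan getSupportedPreviewSizes() for the candidate closest to the target aspect ratio. A minimal sketch, with a helper name of my own (not part of the project):

    // Hypothetical helper: pick the supported preview size whose aspect
    // ratio is closest to the target (e.g. the SurfaceView's dimensions).
    private Camera.Size chooseBestPreviewSize(Camera.Parameters parameters,
                                              int targetWidth, int targetHeight) {
        final double targetRatio = (double) targetWidth / targetHeight;
        Camera.Size best = null;
        double bestDiff = Double.MAX_VALUE;
        for (Camera.Size size : parameters.getSupportedPreviewSizes()) {
            double diff = Math.abs((double) size.width / size.height - targetRatio);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = size;
            }
        }
        return best;
    }

You would then call parameters.setPreviewSize(best.width, best.height) before setParameters(). Next is the holder callback part.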

    // ... holder callback
    private SurfaceTexture mCameraTexture = null;
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        Log.d(TAG, "surfaceCreated ... ");
        try {
            int[] textures = new int[1];
            GLES20.glGenTextures(1, textures, 0);
            mCameraTexture = new SurfaceTexture(textures[0]);
            mCameraInstance.setPreviewTexture(mCameraTexture);
            mCameraInstance.setPreviewCallback(this);
            mCameraInstance.startPreview();
        } catch (Exception e) {
            e.printStackTrace();
        }
        mGpuFilterRender.onSurfaceCreate(holder.getSurface());
    }
    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        Log.d(TAG, "surfaceChanged ... ");
        mGpuFilterRender.onSurfaceChange(holder.getSurface(), width, height);
    }
    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.d(TAG, "surfaceDestroyed ... ");
        mGpuFilterRender.onSurfaceDestroy(holder.getSurface());
        if( mCameraTexture!=null){
            mCameraTexture.release();
        }
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        if( mGpuFilterRender!=null){
            final Camera.Size previewSize = camera.getParameters().getPreviewSize();
            mGpuFilterRender.feedVideoData(data.clone(), previewSize.width, previewSize.height);
        }
    }

In the SurfaceView's three callback interfaces, we likewise bind the Surface to GpuFilterRender so that its lifecycle is easy to control, and we use a SurfaceTexture as the Camera's preview target. In the preview-data callback, a clone of the frame data is passed into GpuFilterRender to be cached.
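
A quick note on the data itself: NV21 stores a full-resolution Y plane followed by an interleaved VU plane at quarter resolution, so each frame occupies width * height * 3 / 2 bytes. A tiny sanity check you could add (the helper name is mine, not the project's):

    // NV21 layout: Y plane (width*height bytes) + interleaved VU plane (width*height/2 bytes).
    private static boolean isValidNv21Frame(byte[] data, int width, int height) {
        return data != null && data.length == width * height * 3 / 2;
    }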

Then look at GpuFilterRender.java, together with the corresponding JNI interface file.

/**
 * Created by zzr on 2019/11/27.
 */
class GpuFilterRender {
    static {
        System.loadLibrary("gpu-filter");
    }

    private Context ctx;
    GpuFilterRender(Context context) {
        ctx = context;
    }

    public native void onSurfaceCreate(Surface surface);

    public native void onSurfaceChange(Surface surface, int width, int height);

    public native void onSurfaceDestroy(Surface surface);

    // Feed video data (NV21).
    public native void feedVideoData(byte[] data,int width,int height);
    // Feed audio data (PCM).
    public native void feedAudioData(byte[] data);
    /**
     * Set the camera angle and orientation.
     * @param rotation       rotation angle in degrees
     * @param flipHorizontal whether to flip horizontally
     * @param flipVertical   whether to flip vertically
     */
    public native void setRotationCamera(final int rotation, final boolean flipHorizontal,
                                         final boolean flipVertical);
    // Set the filter type.
    public native void setFilterType(int typeId);
    // Adjust the filter effect's strength.
    public native void adjustFilterValue(int value,int max);
}
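
And the corresponding JNI implementation on the native side: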
#include <jni.h>
#include <android/native_window.h>
#include <android/native_window_jni.h>
#include "../egl/GLThread.h"
#include "render/GpuFilterRender.h"

GLThread* glThread = NULL;
GpuFilterRender* render = NULL;

extern "C" {

JNIEXPORT void JNICALL
Java_org_zzrblog_gpufilter_GpuFilterRender_onSurfaceCreate(JNIEnv *env, jobject instance, jobject surface) {
    ANativeWindow *nativeWindow = ANativeWindow_fromSurface(env, surface);
    if (render == NULL) {
        render = new GpuFilterRender();
    }
    if (glThread == NULL) {
        glThread = new GLThread();
    }
    glThread->setGLRender(render);
    glThread->onSurfaceCreate(nativeWindow);
}
JNIEXPORT void JNICALL
Java_org_zzrblog_gpufilter_GpuFilterRender_onSurfaceChange(JNIEnv *env, jobject instance, jobject surface,
                                                           jint width, jint height) {
    glThread->onSurfaceChange(width, height);
}
JNIEXPORT void JNICALL
Java_org_zzrblog_gpufilter_GpuFilterRender_onSurfaceDestroy(JNIEnv *env, jobject instance, jobject surface) {
    glThread->onSurfaceDestroy();
    glThread->release();
    delete glThread;
    glThread = NULL;

    if (render != NULL) { // note: must be != NULL, otherwise render would never be freed
        delete render;
        render = NULL;
    }
}
JNIEXPORT void JNICALL
Java_org_zzrblog_gpufilter_GpuFilterRender_setRotationCamera(JNIEnv *env, jobject instance,
                                                             jint rotation, jboolean flipHorizontal,
                                                             jboolean flipVertical) {
    // Note: flipVertical here is passed as render->setRotationCamera's flipHorizontal,
    // and flipHorizontal as its flipVertical, because Android's preview frames
    // come in sideways; this follows GPUImage's handling.
    if (render == NULL) {
        render = new GpuFilterRender();
    }
    render->setRotationCamera(rotation, flipVertical, flipHorizontal);
}
JNIEXPORT void JNICALL
Java_org_zzrblog_gpufilter_GpuFilterRender_setFilterType(JNIEnv *env, jobject instance, jint typeId) {
    if (render == NULL)
        render = new GpuFilterRender();
    render->setFilter(typeId);
}
JNIEXPORT void JNICALL
Java_org_zzrblog_gpufilter_GpuFilterRender_adjustFilterValue(JNIEnv *env, jobject instance, jint value, jint max) {
    if (render == NULL)
        render = new GpuFilterRender();
    render->adjustFilterValue(value, max);
}
JNIEXPORT void JNICALL
Java_org_zzrblog_gpufilter_GpuFilterRender_feedVideoData(JNIEnv *env, jobject instance,
                                                         jbyteArray array, jint width, jint height) {
    if (render == NULL) return;
    jbyte *nv21_buffer = env->GetByteArrayElements(array, NULL);
    jsize array_len = env->GetArrayLength(array);
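    // For an NV21 frame, array_len should equal width * height * 3 / 2.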
    render->feedVideoData(nv21_buffer, array_len, width, height);
    env->ReleaseByteArrayElements(array, nv21_buffer, 0);
}
} // extern "C"

(For the GLThread part, please refer to the previous article: https://blog.csdn.net/a360940265a/article/details/88600962 )

The next chapter goes into GpuFilterRender.cpp, with an in-depth look at the caveats of Java_org_zzrblog_gpufilter_GpuFilterRender_setRotationCamera and at the core part: the Filter content.

Origin: https://blog.csdn.net/a360940265a/article/details/104203321