Use NDK MediaCodec to hard-decode to a texture on Android

I previously wrote about hard-decoding to textures with MediaCodec on Android and then rendering those textures in Unity: "Use Android's MediaCodec hard decoding capability to load textures in Unity" (https://blog.csdn.net/grace_yi/article/details/116497338?spm=1001.2014.3001.5501). The stream is decoded onto a Surface, so how do we get the video frames into a Unity texture? One approach is to read the image buffer out after MediaCodec finishes decoding, convert it from YUV to RGB, and load it into a texture with Texture2D's LoadRawTextureData. This costs a lot of performance, because the data has to travel GPU -> CPU -> GPU.
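For reference, that CPU readback path looks roughly like this with the NDK C API. This is only a sketch of the route the post advises against; it assumes the codec was configured without an output surface, so the output buffers hold readable YUV, and the copy/convert/upload steps are left as comments.

#include <media/NdkMediaCodec.h>

// Sketch of the expensive GPU -> CPU -> GPU route.
void drainOneFrameToCpu(AMediaCodec *codec) {
    AMediaCodecBufferInfo info;
    ssize_t idx = AMediaCodec_dequeueOutputBuffer(codec, &info, 10000 /*us*/);
    if (idx >= 0) {
        size_t size = 0;
        uint8_t *yuv = AMediaCodec_getOutputBuffer(codec, (size_t)idx, &size);
        // ... copy info.size bytes starting at yuv + info.offset into a CPU buffer,
        // convert YUV -> RGB, then hand it to Texture2D.LoadRawTextureData on the
        // Unity side ...
        AMediaCodec_releaseOutputBuffer(codec, (size_t)idx, false /*do not render*/);
    }
}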

I used the Java interface before, but recently I have been writing a native SDK and need to use the C++ interface directly for decoding and texture handling. The key question: how do we write the decoded picture directly into a texture?

We know that the Android NDK provides a C-level MediaCodec decoding interface. It is very similar to the Java interface, so you can refer to the Java API when writing against it: https://developer.android.com/ndk/reference/group/media#amediacodec_configure
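As a rough sketch of how the Java calls map onto the NDK C API: "video/avc" and the width/height below are just example parameters, and the videoCodec/videoFormat handles are the same ones configured in step 3 later on.

#include <media/NdkMediaCodec.h>
#include <media/NdkMediaFormat.h>

// NDK counterpart of MediaCodec.createDecoderByType("video/avc") and
// MediaFormat.createVideoFormat("video/avc", width, height).
AMediaCodec *videoCodec = NULL;
AMediaFormat *videoFormat = NULL;

void createDecoder(int width, int height) {
    videoCodec = AMediaCodec_createDecoderByType("video/avc");
    videoFormat = AMediaFormat_new();
    AMediaFormat_setString(videoFormat, AMEDIAFORMAT_KEY_MIME, "video/avc");
    AMediaFormat_setInt32(videoFormat, AMEDIAFORMAT_KEY_WIDTH, width);
    AMediaFormat_setInt32(videoFormat, AMEDIAFORMAT_KEY_HEIGHT, height);
    // AMediaCodec_configure(videoCodec, videoFormat, ...) happens in step 3,
    // once the output Surface / ANativeWindow has been created.
}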

 

The key piece is the AMediaCodec_configure interface, which configures the codec to render onto a Surface. How is this Surface created?

The Surface itself is easy to create, but how do we pass in our own textureId? I searched a lot of material on this and could not find a direct C++ interface, so I took a shortcut and called the Java interfaces from C++.

1. First, in JNI_OnLoad, save the global JavaVM. Later we use it to obtain a JNIEnv. The code is as follows.

#include <jni.h>
#include <csignal>

JavaVM *global_jvm = NULL;        // saved in JNI_OnLoad, used by get_env()
jobject g_surfaceTex_obj = NULL;  // global ref to the SurfaceTexture (step 2)
jobject g_surface_obj = NULL;     // global ref to the Surface handed to MediaCodec (step 3)

JNIEXPORT jint JNICALL JNI_OnLoad(JavaVM* vm, void* reserved) {
    signal(SIGSEGV, SIG_DFL);     // restore default SIGSEGV handling
    global_jvm = vm;
    return JNI_VERSION_1_6;
}

JNIEnv *get_env() {
    if (global_jvm == NULL) return NULL;
    JNIEnv *jni_env = NULL;
    int status = global_jvm->GetEnv((void **)&jni_env, JNI_VERSION_1_6);
    if (status == JNI_EDETACHED) {
        // Native threads must be attached to the VM before using JNI.
        if (global_jvm->AttachCurrentThread(&jni_env, NULL) != JNI_OK) return NULL;
    }
    return jni_env;
}
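One caveat with the attach-on-demand get_env above: a natively created thread that gets attached should detach itself before it exits, otherwise the VM will complain. A rough sketch of a decoder worker thread doing that (the thread body is only a placeholder, and it relies on the globals defined in step 1):

// Entry point for a natively created decoder thread, e.g. passed to pthread_create.
void *decoder_thread_main(void *arg) {
    JNIEnv *env = get_env();        // attaches this thread if needed
    if (env != NULL) {
        // ... decode loop / updateTexImage calls go here ...
        global_jvm->DetachCurrentThread();   // detach before the thread exits
    }
    return NULL;
}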

2. With a JNIEnv we can call the Java APIs to create the SurfaceTexture and Surface, call updateTexImage, and so on.

// Creates the GL_TEXTURE_EXTERNAL_OES texture that the SurfaceTexture will
// write decoded frames into. GL() is the author's GL error-checking macro.
int getTextureID() {
    GLuint texture;
    GLint currentTexture = 0;
    GL(glGetIntegerv(GL_TEXTURE_BINDING_EXTERNAL_OES, &currentTexture));  // remember current binding
    GL(glGenTextures(1, &texture));
    GL(glBindTexture(GL_TEXTURE_EXTERNAL_OES, texture));
    GL(glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MAG_FILTER, GL_LINEAR));
    GL(glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_MIN_FILTER, GL_LINEAR));
    GL(glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE));
    GL(glTexParameteri(GL_TEXTURE_EXTERNAL_OES, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE));
    GL(glBindTexture(GL_TEXTURE_EXTERNAL_OES, currentTexture));           // restore previous binding
    return (int)texture;
}

void createSurface(int width, int height) {
    JNIEnv *env = get_env();

    // new SurfaceTexture(textureId); setDefaultBufferSize(width, height);
    jclass surfacetexture_class = env->FindClass("android/graphics/SurfaceTexture");
    jmethodID surfacetexture_init = env->GetMethodID(surfacetexture_class, "<init>", "(I)V");
    jmethodID surfacetexture_setwh = env->GetMethodID(surfacetexture_class, "setDefaultBufferSize", "(II)V");
    jobject surfacetexture_obj = env->NewObject(surfacetexture_class, surfacetexture_init, (jint)getTextureID());
    env->CallVoidMethod(surfacetexture_obj, surfacetexture_setwh, width, height);

    // new Surface(surfaceTexture);
    jclass surface_class = env->FindClass("android/view/Surface");
    jmethodID surface_init = env->GetMethodID(surface_class, "<init>", "(Landroid/graphics/SurfaceTexture;)V");
    jobject surface_obj = env->NewObject(surface_class, surface_init, surfacetexture_obj);

    // Keep global refs: the SurfaceTexture for updateTexImage (step 4),
    // the Surface for ANativeWindow_fromSurface (step 3).
    g_surfaceTex_obj = env->NewGlobalRef(surfacetexture_obj);
    g_surface_obj = env->NewGlobalRef(surface_obj);

    env->DeleteLocalRef(surface_obj);
    env->DeleteLocalRef(surfacetexture_obj);
}

3. After creating the Surface, obtain its ANativeWindow and pass that to AMediaCodec_configure when configuring the decoder:

// Wrap the Java Surface created in step 2 in an ANativeWindow and hand it to the decoder.
ANativeWindow* nativeWindow = ANativeWindow_fromSurface(get_env(), g_surface_obj);
AMediaCodec_configure(videoCodec, videoFormat, nativeWindow, NULL, 0);
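For completeness, the decoded frames reach the Surface through the usual feed/drain loop; AMediaCodec_releaseOutputBuffer with render = true is what actually sends a frame to the SurfaceTexture. A rough sketch, assuming the codec has been started with AMediaCodec_start and that the caller supplies one compressed access unit (for example from AMediaExtractor):

#include <cstring>
#include <media/NdkMediaCodec.h>

// One iteration of the feed/drain loop. au/auSize/ptsUs describe one
// compressed access unit supplied by the caller.
void decodeOneFrame(AMediaCodec *videoCodec, const uint8_t *au, size_t auSize, int64_t ptsUs) {
    ssize_t inIdx = AMediaCodec_dequeueInputBuffer(videoCodec, 10000 /*us*/);
    if (inIdx >= 0) {
        size_t cap = 0;
        uint8_t *in = AMediaCodec_getInputBuffer(videoCodec, (size_t)inIdx, &cap);
        size_t n = auSize < cap ? auSize : cap;
        memcpy(in, au, n);
        AMediaCodec_queueInputBuffer(videoCodec, (size_t)inIdx, 0, n, ptsUs, 0);
    }

    AMediaCodecBufferInfo info;
    ssize_t outIdx = AMediaCodec_dequeueOutputBuffer(videoCodec, &info, 10000 /*us*/);
    if (outIdx >= 0) {
        // render = true pushes the decoded frame onto the Surface/SurfaceTexture,
        // so the pixels never come back to the CPU.
        AMediaCodec_releaseOutputBuffer(videoCodec, (size_t)outIdx, true);
        // Step 4 (updateTexImage) then latches this frame into the GL texture.
    }
}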

4. For each decoded frame, call updateTexImage to latch the new frame into the texture. This must run on the GL thread whose context owns the texture:

    JNIEnv *envT = get_env();
    // The jclass/jmethodID could be cached at creation time instead of
    // being looked up on every frame.
    jclass surfacetexture_class = envT->GetObjectClass(g_surfaceTex_obj);
    jmethodID surfacetexture_update = envT->GetMethodID(surfacetexture_class, "updateTexImage", "()V");
    envT->CallVoidMethod(g_surfaceTex_obj, surfacetexture_update);
    envT->DeleteLocalRef(surfacetexture_class);

That's it. The decoded frames now land directly in the GPU texture, and we can render with that textureId. This covers the C++ route; for the Java version, see the link at the beginning of this post.
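Because the frames arrive in a GL_TEXTURE_EXTERNAL_OES texture, the shader that samples it must use samplerExternalOES (via the GL_OES_EGL_image_external extension). A minimal ES 2.0 shader pair might look like the sketch below; the attribute and uniform names are placeholders.

// Vertex shader: pass through position and texture coordinates.
static const char *kVertexShader = R"(
    attribute vec4 aPosition;
    attribute vec2 aTexCoord;
    varying vec2 vTexCoord;
    void main() {
        gl_Position = aPosition;
        vTexCoord = aTexCoord;
    }
)";

// Fragment shader: sample the external texture bound to the textureId
// returned by getTextureID().
static const char *kFragmentShader = R"(
    #extension GL_OES_EGL_image_external : require
    precision mediump float;
    varying vec2 vTexCoord;
    uniform samplerExternalOES uVideoTex;
    void main() {
        gl_FragColor = texture2D(uVideoTex, vTexCoord);
    }
)";

In practice the texture coordinates are usually also transformed by the matrix from SurfaceTexture.getTransformMatrix(), which can be fetched over JNI the same way updateTexImage() is called in step 4.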

References:

omaf/NVRSurfaceTexture.cpp at master · nokiatech/omaf · GitHub: https://github.com/nokiatech/omaf/blob/master/Player/Sources/Player/VideoDecoder/Android/NVRSurfaceTexture.cpp

Android NDK Media APIs: https://developer.android.com/ndk/reference/group/media
