Android video recording: converting NV21 to NV12 (with libyuv)


Foreword

Recently, while working on the video recording feature in our project, the testers raised a bug: recorded videos played back intermittently, pausing every so often and then continuing. The project records the video and audio tracks separately and then muxes them into a video with FFmpeg. After walking through the code logic, we found the cause was missing video frames: both the muxing and recording frame rates were defined as 24 fps, but the YUV data conversion took so long that frames were dropped, because all conversions in the project were done with Java methods. So we are now switching to Google's libyuv for format conversion to improve the speed.

Why is conversion needed?

Some clever readers will ask: wouldn't it be fine to just take the YUV data and record it directly? Why convert at all?
Unfortunately not; the conversion is necessary. NV21 is the format the Android camera returns, but we generate the H.264 file through MediaCodec. To use MediaCodec you create and configure a MediaFormat object, and our MediaFormat uses COLOR_FormatYUV420SemiPlanar, which is the NV12 layout, so NV21 has to be converted to NV12.
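To make the layout difference concrete, here is a minimal plain-Java reference conversion (hypothetical helper, not from the project). In both formats the Y plane comes first; NV21 then stores interleaved V,U pairs while NV12 stores interleaved U,V pairs, so converting is just copying the Y plane and swapping each chroma byte pair. This per-byte Java loop is also exactly the kind of code that proved too slow in practice and motivated libyuv:

```java
import java.util.Arrays;

public class Nv21ToNv12 {
    // NV21: Y plane, then interleaved V,U pairs.
    // NV12: Y plane, then interleaved U,V pairs.
    // Both buffers are width * height * 3 / 2 bytes.
    public static byte[] convert(byte[] nv21, int width, int height) {
        int ySize = width * height;
        byte[] nv12 = new byte[nv21.length];
        System.arraycopy(nv21, 0, nv12, 0, ySize);   // Y plane is unchanged
        for (int i = ySize; i + 1 < nv21.length; i += 2) {
            nv12[i] = nv21[i + 1];                   // U comes from NV21's U slot
            nv12[i + 1] = nv21[i];                   // V comes from NV21's V slot
        }
        return nv12;
    }

    public static void main(String[] args) {
        // A 2x2 frame: 4 Y bytes, then one V,U pair in NV21 order.
        byte[] nv21 = {10, 20, 30, 40, /*V*/ 50, /*U*/ 60};
        System.out.println(Arrays.toString(convert(nv21, 2, 2)));
        // prints [10, 20, 30, 40, 60, 50]
    }
}
```

This is fine as a mental model or a correctness check, but for real-time recording the SIMD-accelerated libyuv path below is the way to go.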

The MediaCodec setup code:

        MediaFormat mediaFormat;
        if (rotation == 90 || rotation == 270) {
            // Swap width and height for portrait rotations
            mediaFormat = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, videoHeight, videoWidth);
        } else {
            mediaFormat = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, videoWidth, videoHeight);
        }
        // Image data format: YUV420 semi-planar (NV12)
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
        // Bit rate
        Log.d("VideoOutput", "bit rate: " + videoWidth * videoHeight);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, videoWidth * videoHeight);
        // Frames per second
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, video_frameRate);
        // One key frame per second
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        videoMediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        videoMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        videoMediaCodec.start();

For the differences between YUV formats and background on Android MediaCodec, see these articles (not covered here):
Audio and video basics - pixel format YUV
Android audio and video - YUV format in-depth explanation of
Android MediaCodec hard-coded H264 files

Introduction to libyuv

libyuv is Google's open-source library for conversion, rotation, and scaling among various YUV and RGB formats. It is cross-platform: it compiles and runs on Windows, Linux, macOS, Android, and other operating systems, on x86, x64, and ARM architectures, and supports SIMD instruction acceleration such as SSE, AVX, and NEON.

Actual use

On Android we cannot use libyuv directly; we need to integrate its source into the project, or compile it into a .so shared library, before we can call it.

1. Integration

For convenience, we integrate a ready-made module from an existing project:

LibyuvDemo
step_01.png

The author of LibyuvDemo has already integrated libyuv and packaged it as a module. We only need to download the source code and add the module to our own project to use it directly! Note, however, that the compressYUV method of YuvUtil in that project is buggy and needs to be fixed (more on this below).

After downloading the source code, add the module through Android Studio in our project.
step_02.png
Point it at the libyuv path inside the source you just downloaded, click Finish, and leave the rest to Android Studio.
step_03.png

After adding it, the project gains one more module, and with that the integration is basically complete.
step_04.png

2. Modify the code

As mentioned in the integration step, the compressYUV method is buggy and needs to be modified. Replace the body of compressYUV with the following code and you're all set:

extern "C"
JNIEXPORT void JNICALL
Java_com_libyuv_util_YuvUtil_compressYUV(JNIEnv *env, jclass type,
                                         jbyteArray nv21Src, jint width,
                                         jint height, jbyteArray i420Dst,
                                         jint dst_width, jint dst_height,
                                         jint mode, jint degree,
                                         jboolean isMirror) {
    jbyte *src_nv21_data = env->GetByteArrayElements(nv21Src, NULL);
    jbyte *dst_i420_data = env->GetByteArrayElements(i420Dst, NULL);
    jbyte *tmp_dst_i420_data = NULL;

    // Convert NV21 to I420
    jbyte *i420_data = (jbyte *) malloc(sizeof(jbyte) * width * height * 3 / 2);
    nv21ToI420(src_nv21_data, width, height, i420_data);
    tmp_dst_i420_data = i420_data;

    // Mirror
    jbyte *i420_mirror_data = NULL;
    if (isMirror) {
        i420_mirror_data = (jbyte *) malloc(sizeof(jbyte) * width * height * 3 / 2);
        mirrorI420(tmp_dst_i420_data, width, height, i420_mirror_data);
        tmp_dst_i420_data = i420_mirror_data;
    }

    // Scale
    jbyte *i420_scale_data = NULL;
    if (width != dst_width || height != dst_height) {
        i420_scale_data = (jbyte *) malloc(sizeof(jbyte) * width * height * 3 / 2);
        scaleI420(tmp_dst_i420_data, width, height, i420_scale_data, dst_width, dst_height, mode);
        tmp_dst_i420_data = i420_scale_data;
        width = dst_width;
        height = dst_height;
    }

    // Rotate
    jbyte *i420_rotate_data = NULL;
    if (degree == libyuv::kRotate90 || degree == libyuv::kRotate180 || degree == libyuv::kRotate270) {
        i420_rotate_data = (jbyte *) malloc(sizeof(jbyte) * width * height * 3 / 2);
        rotateI420(tmp_dst_i420_data, width, height, i420_rotate_data, degree);
        tmp_dst_i420_data = i420_rotate_data;
    }

    // Copy the result into the Java output array. Use the array's actual
    // length rather than width * height * 3 / 2, because width/height may
    // have been overwritten by the scaling step above
    jint len = env->GetArrayLength(i420Dst);
    memcpy(dst_i420_data, tmp_dst_i420_data, len);
    tmp_dst_i420_data = NULL;
    env->ReleaseByteArrayElements(i420Dst, dst_i420_data, 0);
    // Release the (read-only) source array as well to avoid pinning it
    env->ReleaseByteArrayElements(nv21Src, src_nv21_data, JNI_ABORT);

    // Free temporary buffers
    if (i420_data != NULL) free(i420_data);
    if (i420_mirror_data != NULL) free(i420_mirror_data);
    if (i420_scale_data != NULL) free(i420_scale_data);
    if (i420_rotate_data != NULL) free(i420_rotate_data);
}

This method scales, mirrors, and rotates the NV21 data we pass in and returns it as I420, which plays an important role in the next step of converting to NV12. The parameter documentation on the method definition in YuvUtil is quite clear.
step_05.png

3. Add a method

Our integrated libyuv module has no method for converting I420 to NV12; we have to add one ourselves. But we haven't written C code before, and we aren't familiar with calling between Java and C. So what to do? Rather than guess, we do some reading first: Android's JNI development comprehensive introduction and best practice.

The article is quite long; a quick skim is enough for our purposes (interested readers can study it in detail). All we need to know is this: declare a conversion method in YuvUtil with the native keyword, which is how Java calls into C, and then add the corresponding function in the C code, i.e. the YuvJni.cpp file. Note that the C function name must encode the method's full location; just follow the naming of the compressYUV function from the previous section.
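As a quick illustration of that naming rule: the exported C symbol is `Java_`, then the fully qualified class name with dots replaced by underscores, then `_` and the method name (this simplified sketch ignores JNI's extra escaping for underscores, non-ASCII characters, and overloaded methods). The helper below is hypothetical, just to make the rule concrete:

```java
public class JniNameDemo {
    // Builds the C symbol JNI resolves for a native method:
    // "Java_" + fully qualified class name (dots -> underscores) + "_" + method name.
    // Simplified: real JNI additionally escapes '_' and handles overloads.
    public static String jniSymbol(String fqcn, String method) {
        return "Java_" + fqcn.replace('.', '_') + "_" + method;
    }

    public static void main(String[] args) {
        System.out.println(jniSymbol("com.libyuv.util.YuvUtil", "yuvI420ToNV12"));
        // prints Java_com_libyuv_util_YuvUtil_yuvI420ToNV12
    }
}
```

That output is exactly the function name we will use in YuvJni.cpp below.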

We then searched for C-side I420 conversion code on Baidu and found a conversion function in SharryChoo/LibyuvSample:

void LibyuvUtil::I420ToNV21(jbyte *src, jbyte *dst, int width, int height) {
    jint src_y_size = width * height;
    jint src_u_size = src_y_size >> 2;
    jbyte *src_y = src;
    jbyte *src_u = src + src_y_size;
    jbyte *src_v = src + src_y_size + src_u_size;

    jint dst_y_size = width * height;
    jbyte *dst_y = dst;
    jbyte *dst_vu = dst + dst_y_size;

    libyuv::I420ToNV21(
            (uint8_t *) src_y, width,
            (uint8_t *) src_u, width >> 1,
            (uint8_t *) src_v, width >> 1,
            (uint8_t *) dst_y, width,
            (uint8_t *) dst_vu, width,
            width, height
    );
}

But we still need to change its signature to the JNI form (like the compressYUV method) before we can use it. The modified code is below:


// I420 --> NV12
extern "C"
JNIEXPORT void JNICALL
Java_com_libyuv_util_YuvUtil_yuvI420ToNV12(JNIEnv *env, jclass type, jbyteArray i420src,
                                           jbyteArray nv12Dst,
                                           jint width, jint height) {
    jbyte *src_i420_data = env->GetByteArrayElements(i420src, NULL);
    jbyte *src_nv12_data = env->GetByteArrayElements(nv12Dst, NULL);

    jint src_y_size = width * height;
    jint src_u_size = src_y_size >> 2;

    jbyte *src_y = src_i420_data;
    jbyte *src_u = src_i420_data + src_y_size;
    jbyte *src_v = src_i420_data + src_y_size + src_u_size;

    jint dst_y_size = width * height;
    jbyte *dst_y = src_nv12_data;
    jbyte *dst_uv = src_nv12_data + dst_y_size;

    libyuv::I420ToNV12(
            (uint8_t *) src_y, width,
            (uint8_t *) src_u, width >> 1,
            (uint8_t *) src_v, width >> 1,
            (uint8_t *) dst_y, width,
            (uint8_t *) dst_uv, width,
            width, height
    );

    // Release the Java arrays; mode 0 copies the NV12 result back into nv12Dst
    env->ReleaseByteArrayElements(i420src, src_i420_data, JNI_ABORT);
    env->ReleaseByteArrayElements(nv12Dst, src_nv12_data, 0);
}

Copy the above code into the YuvJni.cpp file, then declare the method in YuvUtil:

    /**
     * Convert I420 to NV12.
     *
     * @param i420Src the source I420 data
     * @param nv12Dst the buffer that receives the NV12 data
     * @param width   output width
     * @param height  output height
     */
    public static native void yuvI420ToNV12(byte[] i420Src, byte[] nv12Dst, int width, int height);

At this point, our I420-to-NV12 method is in place. If other conversions are needed, we can look up the corresponding code and add it by analogy. So, back to the topic: how exactly do we get from NV21 to NV12?
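If you want to sanity-check the JNI result, the same interleaving can be written as a slow plain-Java reference (a hypothetical helper, assuming even width and height). I420 stores the Y plane, then a planar U plane, then a planar V plane; NV12 keeps the Y plane and interleaves U and V:

```java
import java.util.Arrays;

public class I420ToNv12Check {
    // Plain-Java reference for I420 -> NV12: copy the Y plane, then interleave
    // the planar U and V planes into a single UV plane. Useful only as a slow
    // cross-check against the JNI result.
    public static byte[] i420ToNv12(byte[] i420, int width, int height) {
        int ySize = width * height;
        int uSize = ySize / 4;
        byte[] nv12 = new byte[i420.length];
        System.arraycopy(i420, 0, nv12, 0, ySize);             // Y plane
        for (int i = 0; i < uSize; i++) {
            nv12[ySize + 2 * i] = i420[ySize + i];             // U
            nv12[ySize + 2 * i + 1] = i420[ySize + uSize + i]; // V
        }
        return nv12;
    }

    public static void main(String[] args) {
        // 4x2 frame: 8 Y bytes, U plane {9, 10}, V plane {11, 12}.
        byte[] i420 = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12};
        System.out.println(Arrays.toString(i420ToNv12(i420, 4, 2)));
        // prints [1, 2, 3, 4, 5, 6, 7, 8, 9, 11, 10, 12]
    }
}
```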

4. Using it in the project

Because what we added is a module, we must declare it in the app's build.gradle before we can call anything in libyuv:

dependencies {
    ···
    api project(":leo-libyuv")
    ···
}

Then, wherever the conversion is needed, add the two key lines of code:


    /**
     * Encode a video frame.
     * @param nv21 the NV21 byte array from the camera
     * @throws IOException on IO failure
     */
    private void encodeVideo(byte[] nv21) throws IOException {
        // I420 buffer (same size as NV21: width * height * 3 / 2)
        byte[] yuvI420 = new byte[nv21.length];
        // The NV12 buffer we actually need
        byte[] nv12Result = new byte[nv21.length];
        // Init (caused ANRs when called repeatedly, so no longer used)
        //YuvUtil.init(videoWidth, videoHeight, videoWidth, videoHeight);
        // Convert NV21 to I420 (with scale / rotate / mirror)
        YuvUtil.yuvCompress(nv21, videoWidth, videoHeight, yuvI420, videoWidth, videoHeight, 0, rotation, isFrontCamera);
        // Convert I420 to NV12
        YuvUtil.yuvI420ToNV12(yuvI420, nv12Result, videoWidth, videoHeight);

        // Get the encoder's input buffer and write the source data into it;
        // the output buffers will later yield the encoded data
        int inputIndex = videoMediaCodec.dequeueInputBuffer(TIMEOUT_USEC);
        if (inputIndex >= 0) {
            ByteBuffer inputBuffer = videoMediaCodec.getInputBuffer(inputIndex);
            inputBuffer.clear();
            // Add the frame to be encoded
            inputBuffer.put(nv12Result);
            // Queue it for MediaCodec to encode
            videoMediaCodec.queueInputBuffer(inputIndex, 0, nv12Result.length, System.nanoTime() / 1000, 0);
        }

        // ... drain the encoder, write out the H.264 stream, and so on.
    }

At this point, the whole NV21-to-NV12 conversion flow is complete.

Summary

There are three steps in total:

1. Integrate libyuv into our project as a module.
2. Fix the compressYUV method in libyuv and add the corresponding I420-to-NV12 method.
3. Depend on the module from our own project and call the corresponding methods.

Of course, this approach is a bit of a shortcut. If you have the time and energy, it is still worth digging into libyuv, JNI development, YUV formats, and related topics.

Relevant reference materials:
"Android audio and video - libyuv in practice"
"Audio and video basics - pixel format YUV"
"Android audio and video - YUV format in depth"
"Android MediaCodec hard-coded H264 files"
"Android JNI development: comprehensive introduction and best practices"
"LibyuvDemo"
"SharryChoo/LibyuvSample"

Origin blog.csdn.net/weixin_43683367/article/details/127431384